Computer science

777 quotes found

"Mathematics and logic, historically speaking, have been entirely distinct studies. Mathematics has been connected with science, logic with Greek. But both have developed in modern times: logic has become more mathematical and mathematics has become more logical. The consequence is that it has now become wholly impossible to draw a line between the two; in fact, the two are one. They differ as boy and man: logic is the youth of mathematics and mathematics is the manhood of logic. This view is resented by logicians who, having spent their time in the study of classical texts, are incapable of following a piece of symbolic reasoning, and by mathematicians who have learnt a technique without troubling to inquire into its meaning or justification. Both types are now fortunately growing rarer. So much of modern mathematical work is obviously on the border-line of logic, so much of modern logic is symbolic and formal, that the very close relationship of logic and mathematics has become obvious to every instructed student. The proof of their identity is, of course, a matter of detail: starting with premises which would be universally admitted to belong to logic, and arriving by deduction at results which as obviously belong to mathematics, we find that there is no point at which a sharp line can be drawn, with logic to the left and mathematics to the right. If there are still those who do not admit the identity of logic and mathematics, we may challenge them to indicate at what point, in the successive definitions and deductions of Principia Mathematica, they consider that logic ends and mathematics begins. It will then be obvious that any answer must be quite arbitrary."

- Logic

Tags: Computer science, Mathematics, Mind, Philosophy, Logic
"In 1991, Ellen Spertus, now a computer scientist at Mills College, published a report on women’s experiences in programming classes. She cataloged a landscape populated by men who snickered about the presumed inferiority of women and by professors who told female students that they were “far too pretty” to be studying electrical engineering; when some men at Carnegie Mellon were asked to stop using pictures of naked women as desktop wallpaper on their computers, they angrily complained that it was censorship of the sort practiced by “the Nazis or the Ayatollah Khomeini.” As programming was shutting its doors to women in academia, a similar transformation was taking place in corporate America. The emergence of what would be called “culture fit” was changing the who, and the why, of the hiring process. Managers began picking coders less on the basis of aptitude and more on how well they fit a personality type: the acerbic, aloof male nerd. The shift actually began far earlier, back in the late ’60s, when managers recognized that male coders shared a growing tendency to be antisocial isolates, lording their arcane technical expertise over that of their bosses. Programmers were “often egocentric, slightly neurotic,” as Richard Brandon, a well-known computer-industry analyst, put it in an address at a 1968 conference, adding that “the incidence of beards, sandals and other symptoms of rugged individualism or nonconformity are notably greater among this demographic.” In addition to testing for logical thinking, as in Mary Allen Wilkes’s day, companies began using personality tests to select specifically for these sorts of caustic loner qualities. “These became very powerful narratives,” says Nathan Ensmenger, a professor of informatics at Indiana University, who has studied this transition. The hunt for that personality type cut women out. 
Managers might shrug and accept a man who was unkempt, unshaven and surly, but they wouldn’t tolerate a woman who behaved the same way. Coding increasingly required late nights, but managers claimed that it was too unsafe to have women working into the wee hours, so they forbade them to stay late with the men."

- Programming

Tags: Computer science
"As I understand the theory of period information doubling, this states that if we take one period of human information as being the time between the invention of the first hand axe, say around 50,000 BC, and 1 AD, then this is one period of human information and we can measure it by how many human inventions we came up with during that time. Then we see how long it takes for us to have twice as many inventions. This means that human information has doubled. As it turns out, after the first 50,000-year period, the second period is about 1500 years, say around the time of the Renaissance. By then we have twice as much information. To double again, human information took a couple of hundred years. The period speeds up—between 1960 and 1970, human information doubled. As I understand it, at the last count human information was doubling around every 18 months. Further to this, there is a point sometime around 2015 where human information is doubling every thousandth of a second. This means that in each thousandth of a second we will have accumulated more information than we have in the entire previous history of the world. At this point I believe that all bets are off. I cannot imagine the kind of culture that might exist after such a flashpoint of knowledge. I believe that our culture would probably move into a completely different state, would move past the boiling point, from a fluid culture to a culture of steam."

- Information

Tags: Computer science, Philosophy, Semiotics
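The doubling arithmetic in the quote above can be made concrete with a small sketch. This is only an illustration of compound doubling: the interval figures are the quote's own rough numbers, and the function names are invented for this example, not taken from any source.

```python
# Toy model of the "information doubling" idea: each doubling of
# accumulated human information takes less time than the previous one.
# Interval lengths below are the quote's rough figures, not data.

doubling_intervals_years = [
    50_000,  # first hand axe (~50,000 BC) to ~1 AD: first doubling period
    1_500,   # to roughly the Renaissance
    200,     # "a couple of hundred years" for the next doubling
    # ... the quote then claims ~10 years (1960-1970), then ~18 months
]

def total_information(units_at_start, num_doublings):
    """After n doublings, information has grown by a factor of 2**n."""
    return units_at_start * 2 ** num_doublings

# After the three doublings listed above, there is 2**3 = 8x the
# information that existed when the first period began.
print(total_information(1, len(doubling_intervals_years)))  # -> 8
```

The point of the sketch is that the growth factor depends only on the *number* of doublings, while the quote's claim is about the shrinking *length* of each doubling interval.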
"You can't physically touch software. You can hold a floppy disk or CD-ROM in your hand, but the software itself is a ghost that can be moved from one object to another with little difficulty. In contrast, a road is a solid object that has a definite size and shape. You can touch the material and walk the route... Software is a codification of a huge set of behaviors: if this occurs, then that should happen, and so on. We can visualize individual behaviors, but we have great difficulty visualizing large numbers of sequential and alternative behaviors... The same things that make it hard to visualize software make it hard to draw blueprints of that software. A road plan can show the exact location, elevation, and dimensions of any part of the structure. The map corresponds to the structure, but it's not the same as the structure. Software, on the other hand, is just a codification of the behaviors that the programmers and users want to take place. The map is the same as the structure... This means that software can only be described accurately at the level of individual instructions... A map or a blueprint for a piece of software must greatly simplify the representation in order to be comprehensible. But by doing so, it becomes inaccurate and ultimately incorrect. This is an important realization: any architecture, design, or diagram we create for software is essentially inadequate. If we represent every detail, then we're merely duplicating the software in another form, and we're wasting our time and effort."

- Software

Tags: Computer science, Software
"The integration technology and infrastructure elements available today, in 1993, would enable an enterprise to develop a significant integration infrastructure. However, integration projects are constrained by cultural inertia, financial and resource limitations, and, significantly, risk management. Thus, projects and their supporting integration infrastructures tend to be deployed in an incremental and evolutionary manner. Since each enterprise chooses its integration path based on particular business needs, the corporations visited in this study each presented a different road map of integration efforts to date and a unique snapshot of current integration infrastructure.... DoD, in concert with leading companies, should formulate an R&D strategy to create a new generation of enterprise architectures, models, tools, and software systems, and to determine the potential for new business operations, engineering practices, and manufacturing concepts. To achieve potential functional and performance improvements, integrators should combine the leverage of several emerging threshold technologies, such as operational integration frameworks, object-based and knowledge-based product and process representations, application-oriented network services, near-term and mid-term solutions to database integration, and wide-area object brokerage and execution."

- Enterprise architecture

Tags: Business, Computer science, Holism, Organizational theory
"In the early '80s, there was little interest in the idea of Enterprise Reengineering or Enterprise Modeling and the use of formalisms and models was generally limited to some aspects of application development within the Information Systems community. The subject of "architecture" was acknowledged at that time; however, there was little definition to support the concept. This lack of definition precipitated the initial investigation that ultimately resulted in the "Framework for Information Systems Architecture." Although from the outset it was clear that it should have been referred to as a "Framework for Enterprise Architecture," that enlarged perspective could only now begin to be generally understood as a result of the relatively recent and increased, world-wide focus on Enterprise "engineering." The Framework as it applies to Enterprises is simply a logical structure for classifying and organizing the descriptive representations of an Enterprise that are significant to the management of the Enterprise as well as to the development of the Enterprise’s systems. It was derived from analogous structures that are found in the older disciplines of Architecture/Construction and Engineering/Manufacturing that classify and organize the design artifacts created over the process of designing and producing complex physical products (e.g., buildings or airplanes)."

- Enterprise architecture

Tags: Business, Computer science, Holism, Organizational theory
"Generically, an architecture is the description of the set of components and the relationships between them. Simple enough. The trouble starts when you tack on an adjective: There are software architectures, hardware architectures, network architectures, system architectures, and enterprise architectures. People have their own preconceived notions and experiences about “architecture.” A software architecture describes the layout of the software modules and the connections and relationships among them. A hardware architecture can describe how the hardware components are organized. However, both these definitions can apply to a single computer, a single information system, or a family of information systems. Thus “architecture” can have a range of meanings, goals, and abstraction levels, depending on who’s speaking. An information system architecture typically encompasses an overview of the entire information system—including the software, hardware, and information architectures (the structure of the data that systems will use). In this sense, the information system architecture is a meta-architecture. An enterprise architecture is also a meta-architecture in that it comprises many information systems and their relationships (technical infrastructure). However, because it can also contain other views of an enterprise—including work, function, and information—it is at the highest level in the architecture pyramid. It is important to begin any architecture development effort with a clear definition of what you mean by “architecture.”"

- Enterprise architecture

Tags: Business, Computer science, Holism, Organizational theory
"The concept of EAs dates back to the mid-1980s. At that time, John Zachman, widely recognized as a leader in the field, identified the need to use a logical construction blueprint (i.e., an architecture) for defining and controlling the integration of systems and their components. Accordingly, Zachman developed a “framework” or structure for logically defining and capturing an architecture. Drawing parallels to the field of classical architecture, and, later, to the aircraft manufacturing industry, in which different work products (e.g., architect plans, contractor plans, shop plans, bills of lading) represent different views of the planned building or aircraft, respectively, Zachman’s framework identified the kind of work products needed to understand and thus build a given system or entity. In short, this framework provides six perspectives or windows from which to view how a given entity operates. The perspectives are those of the (1) strategic planner, (2) system user, (3) system designer, (4) system developer, (5) subcontractor, and (6) system itself. Associated with each of these perspectives, Zachman also proposed six abstractions of the entity, or models covering (1) how the entity operates, (2) what the entity uses to operate, (3) where the entity operates, (4) who operates the entity, (5) when entity operations occur, and (6) why the entity operates. Zachman’s framework provides a way to identify and describe an entity’s existing and planned component parts and the parts’ relationships before the costly and time-consuming efforts associated with developing or transforming the entity begin."

- Enterprise architecture

Tags: Business, Computer science, Holism, Organizational theory
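The grid described in the quote above (six perspectives crossed with six interrogative abstractions) can be expressed as a tiny lookup structure. This is a hedged sketch: the two axes are taken from the quote itself, while the cell-label wording and the function name are invented here purely for illustration.

```python
# A minimal sketch of the 6x6 grid: six perspectives (rows) crossed
# with six interrogative abstractions (columns). The framework names
# the axes; the cell labels below are placeholder text.

PERSPECTIVES = [
    "strategic planner", "system user", "system designer",
    "system developer", "subcontractor", "system itself",
]
ABSTRACTIONS = ["how", "what", "where", "who", "when", "why"]

def cell(perspective, abstraction):
    """Return a label for one work product in the 6x6 grid."""
    if perspective not in PERSPECTIVES or abstraction not in ABSTRACTIONS:
        raise ValueError("not a framework axis")
    return f"{abstraction!r} model from the {perspective}'s viewpoint"

# Each of the 36 cells is a distinct descriptive work product.
print(cell("system designer", "where"))
```

The design point the quote makes is that every cell exists *before* building starts: each row-column pair names a description someone needs, so missing cells signal missing understanding.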
"[T]he average company’s enterprise system - i.e. the overall system of IT-related entities - is today highly complex. Technically, large organizations possess hundreds or thousands of extensively interconnected and heterogeneous single IT systems performing tasks that vary from enterprise resource planning to real-time control and monitoring of industrial processes. Moreover, these systems store a wide variety of sometimes redundant data, and typically they are deployed on several different platforms... Organizationally, the enterprise system embraces business processes and business units using as well as maintaining and acquiring the IT systems. The interplay between the organization and the IT systems is further determined by, for instance, business goals, ownership and governance structures, strategies, individual system users, documentation, and cost. Lately, Enterprise Architecture (EA) has evolved with the mission to take a holistic approach to managing the above-depicted enterprise system. The discipline’s presumption is that architectural models are the key to success in understanding and administering enterprise systems. Compared to many other engineering disciplines, EA is quite immature in many respects. This thesis identifies... firstly, the lack of explicit purpose for architectural models... [A] company’s Chief Information Officer (CIO) should guide the rationale behind the development of EA models. In particular, distribution of IT-related information and knowledge throughout the organization is emphasized as an important concern left uncared for. Secondly, the lack of architectural theory is recognized..."

- Enterprise architecture

Tags: Business, Computer science, Holism, Organizational theory
"EA originated from and is influenced by a number of business areas: the manufacturing industry, with Material Requirements Planning (MRP) and later Manufacturing Resource Planning (MRP II). These approaches developed into the so-called supply chain or value chain (Porter et al.). Not only the incoming logistics and internal operations were considered, but also the flow of material to customers and back. The second origin of EA growth was from Process Modelling and Design approaches, e.g. Business Process Re-engineering (Hammer et al.). These approaches seek to depict the enterprise in terms of business processes, leading to process improvements and “end-to-end” process integration. Corporate and process governance, organisational adaptability and IM/IT system integration were typical considerations. Organisations were consequently often restructured to become “flatter” (fewer management layers) and coined process-centred or process-oriented organisations. A third development is a type of backward integration where software developers try to better understand and serve the business world with “functioning and value-adding” software solutions (business applications). It is a well-known fact that enterprise integration software (ERP, etc.), according to business users and owners, is often considered a failure."

- Enterprise architecture

Tags: Business, Computer science, Holism, Organizational theory
"The concepts of purposive behavior and teleology have long been associated with a mysterious, self-perfecting or goal-seeking capacity or final cause, usually of superhuman or supernatural origin. To move forward to the study of events, scientific thinking had to reject these beliefs in purpose and these concepts of teleological operations for a strictly mechanistic and deterministic view of nature. This mechanistic conception became firmly established with the demonstration that the universe was based on the operation of anonymous particles moving at random, in a disorderly fashion, giving rise, by their multiplicity, to order and regularity of a statistical nature, as in classical physics and gas laws. The unchallenged success of these concepts and methods in physics and astronomy, and later in chemistry, gave biology and physiology their major orientation. This approach to problems of organisms was reinforced by the analytical preoccupation of the Western European culture and languages. The basic assumptions of our traditions and the persistent implications of the language we use almost compel us to approach everything we study as composed of separate, discrete parts or factors which we must try to isolate and identify as potential causes. Hence, we derive our preoccupation with the study of the relation of two variables. We are witnessing today a search for new approaches, for new and more comprehensive concepts and for methods capable of dealing with the large wholes of organisms and personalities."

- Cybernetics

Tags: Science, Engineering, Computer science
"From the start, the cyborg was more than just another technical project; it was a kind of scientific and military daydream. The possibility of escaping its annoying bodily limitations led a generation that grew up on Superman and Captain America to throw the full weight of its grown-up R&D budget into achieving a real-life superpower. By the mid-1960s, cyborgs were big business, with millions of US Air Force dollars finding their way into projects to build exoskeletons, master-slave robot arms, biofeedback devices, and expert systems. For all the big bucks and high seriousness, the prevailing impression left by old cyborg technical papers is of a rather expensive kind of science fiction. Time and again, scientific reasoning melts into metaphysical speculation about evolution, human boundaries, and even the possibility of what Clynes and Kline call "a new and larger dimension for man's spirit." The cyborg was always as much a creature of scientific imagination as of scientific fact. It wasn't only the military that was captivated by the possibilities of the cyborg. The dream of improving human capabilities through selective breeding had long been a staple of the darker side of Western medical literature. Now there was the possibility of making better humans by augmenting them with artificial devices. Insulin drips had been used to regulate the metabolisms of diabetics since the 1920s. A heart-lung machine was used to control the blood circulation of an 18-year-old girl during an operation in 1953. A 43-year-old man received the first heart pacemaker implant in 1958. By the 1970s, the idea of an augmented human had entered the mainstream. Steve Austin, The Six Million Dollar Man, and his cohort Jaime Sommers, The Bionic Woman (with bionic limbs and a super-sensitive bionic ear), were popular heroes, their custom superpowers bought off the shelf like a digital watch. The cyborg had grown from a lecture-room fantasy into the stuff of prime-time TV."

- Cybernetics

Tags: Science, Engineering, Computer science
"Since the 1960s, Japan has produced a considerable number of cyborg narratives in manga and anime, particularly in works targeting male children and adolescents. From early manga examples such as Kazumasa Hirai and Hiro Kuwata's 8 Man and Shotaro Ishinomori's Cyborg 009, and their subsequent anime versions, the protagonist is commonly cyborged against their will or desires. This positions them as victims, regardless of how physically powerful they are. Their sense of inferiority and vulnerability usually underpins these narratives, either subtly or explicitly. The depiction of female cyborgs adds complexity to the positioning of cyborgs in manga and anime, especially in terms of gender. Female cyborgs may be equipped with remarkable physical strength, combined with voluptuous, eroticized bodies (for instance Major Motoko Kusanagi in Masamune Shirow's original manga and Mamoru Oshii's anime version of Ghost in the Shell); and these powerful female cyborgs are also frequently ascribed roles as protectors or supporters of incompetent and insecure male protagonists. Although some female cyborgs may possess characteristics that indicate a transgression of the conventional boundaries of gender, this transgression is often limited and undermined by other elements of their depiction. As Kumiko Sato points out in her essay "How Information Technology Has Not Changed Feminism and Japanism," "female cyborgs and androids have been domesticated and fetishized into maternal and sexual protectors of the male hero" and thus "their function is usually reduced to either a maid or a goddess obediently serving her beloved male master, the sole reason for her militant nature.""

- Cybernetics

Tags: Science, Engineering, Computer science
"The power of custom is enormous, and so gradual will be the change, that man's sense of what is due to himself will be at no time rudely shocked; our bondage will steal upon us noiselessly and by imperceptible approaches; nor will there ever be such a clashing of desires between man and the machines as will lead to an encounter between them. Among themselves the machines will war eternally, but they will still require man as the being through whose agency the struggle will be principally conducted. In point of fact there is no occasion for anxiety about the future happiness of man so long as he continues to be in any way profitable to the machines; he may become the inferior race, but he will be infinitely better off than he is now. Is it not then both absurd and unreasonable to be envious of our benefactors? And should we not be guilty of consummate folly if we were to reject advantages which we cannot obtain otherwise, merely because they involve a greater gain to others than to ourselves? “With those who can argue in this way I have nothing in common. I shrink with as much horror from believing that my race can ever be superseded or surpassed, as I should do from believing that even at the remotest period my ancestors were other than human beings. Could I believe that ten hundred thousand years ago a single one of my ancestors was another kind of being to myself, I should lose all self-respect, and take no further pleasure or interest in life. I have the same feeling with regard to my descendants, and believe it to be one that will be felt so generally that the country will resolve upon putting an immediate stop to all further mechanical progress, and upon destroying all improvements that have been made for the last three hundred years. I would not urge more than this. 
We may trust ourselves to deal with those that remain, and though I should prefer to have seen the destruction include another two hundred years, I am aware of the necessity for compromising, and would so far sacrifice my own individual convictions as to be content with three hundred. Less than this will be insufficient.”"

- Artificial intelligence

Tags: Artificial intelligence, Computer science, Technology, Mind, Belief
"I have grown accustomed to the disrespect expressed by some of the participants for their colleagues in the other disciplines. "Why, Dan," ask the people in artificial intelligence, "do you waste your time conferring with those neuroscientists? They wave their hands about 'information processing' and worry about where it happens, and which neurotransmitters are involved, but they haven't a clue about the computational requirements of higher cognitive functions." "Why," ask the neuroscientists, "do you waste your time on the fantasies of artificial intelligence? They just invent whatever machinery they want, and say unpardonably ignorant things about the brain." The cognitive psychologists, meanwhile, are accused of concocting models with neither biological plausibility nor proven computational powers; the anthropologists wouldn't know a model if they saw one, and the philosophers, as we all know, just take in each other's laundry, warning about confusions they themselves have created, in an arena bereft of both data and empirically testable theories. With so many idiots working on the problem, no wonder consciousness is still a mystery. All these charges are true, and more besides, but I have yet to encounter any idiots. Mostly the theorists I have drawn from strike me as very smart people – even brilliant people, with the arrogance and impatience that often comes with brilliance – but with limited perspectives and agendas, trying to make progress on the hard problems by taking whatever shortcuts they can see, while deploring other people's shortcuts. No one can keep all the problems and details clear, including me, and everyone has to mumble, guess and handwave about large parts of the problem."

- Artificial intelligence

Tags: Artificial intelligence, Computer science, Technology, Mind, Belief
"What makes the goal of accuracy so vexing for chatbots is that they operate probabilistically when choosing the next word in a sentence; they aren’t trying to find the light of truth in a murky world. “These models are built to generate text that sounds like what a person would say — that’s the key thing,” Jesse Dodge says. “So they’re definitely not built to be truthful.” I asked Margaret Mitchell, a computer scientist who studied the ethics of A.I. at Google, whether factuality should have been a more fundamental priority for A.I. Mitchell, who has said she was fired from the company for criticizing how it treated colleagues working on bias in A.I. (Google says she was fired for violating the company’s security policies), said that most would find that logical. “This common-sense thing — ‘Shouldn’t we work on making it factual if we’re putting it forward for fact-based applications?’ — well, I think for most people who are not in tech, it’s like, ‘Why is this even a question?’” But, Mitchell said, the priorities at the big companies, now in frenzied competition with one another, are concerned with introducing A.I. products rather than reliability. The road ahead will almost certainly lead to improvements. Mitchell, who now works as the chief ethics scientist at the A.I. company Hugging Face, told me that she foresees A.I. companies’ making gains in accuracy and reducing biased answers by using better data. “The state of the art until now has just been a laissez-faire data approach,” she said. “You just throw everything in, and you’re operating with a mind-set where the more data you have, the more accurate your system will be, as opposed to the higher quality of data you have, the more accurate your system will be.” Jesse Dodge, for his part, points to an idea known as “retrieval,” whereby a chatbot will essentially consult a high-quality source on the web to fact-check an answer in real time. 
It would even cite precise links, as some A.I.-powered search engines now do. “Without that retrieval element,” Dodge says, “I don’t think there’s a way to solve the hallucination problem.” Otherwise, he says, he doubts that a chatbot answer can gain factual parity with Wikipedia or the Encyclopaedia Britannica."

- Artificial intelligence

Tags: Artificial intelligence, Computer science, Technology, Mind, Belief
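The "retrieval" idea Dodge describes above can be sketched as a toy word-overlap lookup against a trusted corpus. Everything in this snippet (the corpus entries, the scoring rule, the function name) is a hypothetical simplification for illustration, not any production system's API; real retrieval systems use far stronger matching than shared words.

```python
# Toy sketch of retrieval: before trusting a generated answer, look it
# up against a trusted corpus and report the best-supported source.

def best_source(claim: str, corpus: dict) -> tuple:
    """Return (source_name, overlap_score) for the corpus document that
    shares the most words with the claim; a low score suggests that no
    source supports the claim."""
    claim_words = set(claim.lower().split())
    scored = {
        name: len(claim_words & set(text.lower().split()))
        for name, text in corpus.items()
    }
    name = max(scored, key=scored.get)
    return name, scored[name]

# Hypothetical corpus standing in for high-quality reference pages.
corpus = {
    "wikipedia:Moon": "the moon orbits the earth every 27 days",
    "wikipedia:Mars": "mars is the fourth planet from the sun",
}
print(best_source("how many days does the moon take to orbit the earth", corpus))
# -> ('wikipedia:Moon', 4)
```

The design point mirrors the quote: the answer is checked against, and attributed to, an external source ("it would even cite precise links") rather than trusted on the model's say-so.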
"Even if conflicts like this don’t impede the advance of A.I., it might be stymied in other ways. At the end of May, several A.I. researchers collaborated on a paper that examined whether new A.I. systems could be developed from knowledge generated by existing A.I. models, rather than by human-generated databases. They discovered a systemic breakdown — a failure they called “model collapse.” The authors saw that using data from an A.I. to train new versions of A.I.s leads to chaos. Synthetic data, they wrote, ends up “polluting the training set of the next generation of models; being trained on polluted data, they then misperceive reality.” The lesson here is that it will prove challenging to build new models from old models. And with chatbots, Ilia Shumailov, an Oxford University researcher and the paper’s primary author, told me, the downward spiral looks similar. Without human data to train on, Shumailov said, “your language model starts being completely oblivious to what you ask it to solve, and it starts just talking in circles about whatever it wants, as if it went into this madman mode.” Wouldn’t a plug-in from, say, Wikipedia, avert that problem, I asked? It could, Shumailov said. But if in the future Wikipedia were to become clogged with articles generated by A.I., the same cycle — essentially, the computer feeding on content it created itself — would be perpetuated."

- Artificial intelligence

Tags: Artificial intelligence, Computer science, Technology, Mind, Belief
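The feedback loop behind "model collapse" can be imitated with a deliberately tiny stand-in: fit a toy "model" (just a Gaussian's mean and standard deviation) to data, then train each new generation only on samples drawn from the previous generation's fit. The distribution, sample sizes, and generation count here are arbitrary illustration choices, not the paper's actual experimental setup.

```python
import random
import statistics

random.seed(0)  # make the toy run reproducible

def fit(data):
    # "Training": estimate the model's parameters from data.
    return statistics.fmean(data), statistics.stdev(data)

def sample(mean, stdev, n):
    # "Generation": draw synthetic data from the fitted model.
    return [random.gauss(mean, stdev) for _ in range(n)]

human_data = sample(0.0, 1.0, 500)  # stand-in for human-made training data
mean, stdev = fit(human_data)

stdevs = [stdev]
for _ in range(20):  # 20 generations trained only on synthetic data
    synthetic = sample(mean, stdev, 50)  # small synthetic training set
    mean, stdev = fit(synthetic)
    stdevs.append(stdev)

# Each generation re-estimates its parameters from a finite synthetic
# sample, so the fitted spread drifts away from the original
# distribution's and rare-tail information is progressively lost.
print(round(stdevs[0], 3), round(stdevs[-1], 3))
```

This is the "computer feeding on content it created itself" cycle from the quote: estimation error compounds because no fresh human data ever re-anchors the model.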
"Barack Obama: My general observation is that it has been seeping into our lives in all sorts of ways, and we just don’t notice; and part of the reason is because the way we think about AI is colored by popular culture. There’s a distinction, which is probably familiar to a lot of your readers, between generalized AI and specialized AI. In science fiction, what you hear about is generalized AI, right? Computers start getting smarter than we are and eventually conclude that we’re not all that useful, and then either they’re drugging us to keep us fat and happy or we’re in the Matrix. My impression, based on talking to my top science advisers, is that we’re still a reasonably long way away from that. It’s worth thinking about because it stretches our imaginations and gets us thinking about the issues of choice and free will that actually do have some significant applications for specialized AI, which is about using algorithms and computers to figure out increasingly complex tasks. We’ve been seeing specialized AI in every aspect of our lives, from medicine and transportation to how electricity is distributed, and it promises to create a vastly more productive and efficient economy. If properly harnessed, it can generate enormous prosperity and opportunity. But it also has some downsides that we’re gonna have to figure out in terms of not eliminating jobs. It could increase inequality. It could suppress wages."

- Artificial intelligence

Tags: Artificial intelligence, Computer science, Technology, Mind, Belief
"The 19th and first half of the 20th century conceived of the world as chaos. Chaos was the oft-quoted blind play of atoms, which, in mechanistic and positivistic philosophy, appeared to represent ultimate reality, with life as an accidental product of physical processes, and mind as an epi-phenomenon. It was chaos when, in the current theory of evolution, the living world appeared as a product of chance, the outcome of random mutations and survival in the mill of natural selection. In the same sense, human personality, in the theories of behaviorism as well as of psychoanalysis, was considered a chance product of nature and nurture, of a mixture of genes and an accidental sequence of events from early childhood to maturity. Now we are looking for another basic outlook on the world -- the world as organization. Such a conception -- if it can be substantiated -- would indeed change the basic categories upon which scientific thought rests, and profoundly influence practical attitudes. This trend is marked by the emergence of a bundle of new disciplines such as cybernetics, information theory, general system theory, theories of games, of decisions, of queuing and others; in practical applications, systems analysis, systems engineering, operations research, etc. They are different in basic assumptions, mathematical techniques and aims, and they are often unsatisfactory and sometimes contradictory. They agree, however, in being concerned, in one way or another, with "systems," "wholes" or "organizations"; and in their totality, they herald a new approach."

- Information theory

0 likes · Science, Semiotics, Computer science
"Historically, information management has been a fragmented activity shared among the traditionally independent elements of an organization. Many of the critical data-handling activities (payroll, invoices, payments, inventories, etc.) of an organization have been located in the administrative or financial management offices. Automation of these activities has resulted in placing management responsibilities for computers and information systems in the office of an organization's administrator or controller. Since information-related programs also may be administered by other elements in an organization, in many instances a dispersed information management structure has resulted. For example, activities such as information and library services, statistical functions, information programs, and associated activities (policy, reports, management, procurement, and communications) may not be centrally managed. Often, responsibility for managing these activities and services is shared, and in some instances the jurisdictional responsibility may not be clear. As a result of this fragmented approach, information resources sometimes have been poorly managed and inappropriately used. The current rationale for comprehensive management of information-related activities is that these activities contribute to an organization's effectiveness. According to the general IRM concept, the IRM office within an organization should provide a central focus for all those information activities that support and serve the organization. Also, this office should reflect the organization's specific directions and goals and be consistent with good management practices. The objectives and goals of the IRM office should be formulated to provide a cohesive management framework consistent with organization requirements and values. The IRM policies and procedures should provide a foundation for developing the information architecture and relevant programs required by the organization."

- Information management

0 likes · Management, Computer science
"Generically, an architecture is the description of the set of components and the relationships between them. Simple enough. The trouble starts when you tack on an adjective: There are software architectures, hardware architectures, network architectures, system architectures, and enterprise architectures. People have their own preconceived notions and experiences about “architecture.” A software architecture describes the layout of the software modules and the connections and relationships among them. A hardware architecture can describe how the hardware components are organized. However, both these definitions can apply to a single computer, a single information system, or a family of information systems. Thus “architecture” can have a range of meanings, goals, and abstraction levels, depending on who’s speaking. An information system architecture typically encompasses an overview of the entire information system—including the software, hardware, and information architectures (the structure of the data that systems will use). In this sense, the information system architecture is a meta-architecture. An enterprise architecture is also a meta-architecture in that it comprises many information systems and their relationships (technical infrastructure). However, because it can also contain other views of an enterprise—including work, function, and information—it is at the highest level in the architecture pyramid. It is important to begin any architecture development effort with a clear definition of what you mean by “architecture.”"

- Software architecture

0 likes · Computer science
"What do you think of using UML to generate implementation code? James: I think it’s a terrible idea. I know that I disagree with many other UML experts, but there is no magic about UML. If you can generate code from a model, then it is a programming language. And UML is not a well-designed programming language. The most important reason is that it lacks a well-defined point of view, partly by intent and partly because of the tyranny of the OMG standardization process that tries to provide everything to everybody. It doesn't have a well-defined underlying set of assumptions about memory, storage, concurrency, or almost anything else. How can you program in such a language? The fact is that UML and other modelling languages are not meant to be executable. The point of models is that they are imprecise and ambiguous. This drove many theoreticians crazy, so they tried to make UML "precise", but models are imprecise for a reason: we leave out things that have a small effect so we can concentrate on the things that have big or global effects. That's how it works in physics models: you model the big effect (such as the gravitation from the sun) and then you treat the smaller effects as perturbations to the basic model (such as the effects of the planets on each other). If you tried to solve the entire set of equations directly in full detail, you couldn't do anything."

- Executable UML

0 likes · Computer science
"The integration technology and infrastructure elements available today, in 1993, would enable an enterprise to develop a significant integration infrastructure. However, integration projects are constrained by cultural inertia, financial and resource limitations, and, significantly, risk management. Thus, projects and their supporting integration infrastructures tend to be deployed in an incremental and evolutionary manner. Since each enterprise chooses its integration path based on particular business needs, the corporations visited in this study each presented a different road map of integration efforts to date and a unique snapshot of current integration infrastructure.... DoD, in concert with leading companies, should formulate an R&D strategy to create a new generation of enterprise architectures, models, tools, and software systems, and to determine the potential for new business operations, engineering practices, and manufacturing concepts. To achieve potential functional and performance improvements, integrators should combine the leverage of several emerging threshold technologies, such as operational integration frameworks, object-based and knowledge-based product and process representations, application-oriented network services, near-term and mid-term solutions to database integration, and wide-area object brokerage and execution."

- Object-orientation

0 likes · Computer science
"The strategic use of information technology (I/T) is now and has been a fundamental issue for every business. In essence, I/T can alter the basic nature of an industry. The effective and efficient utilization of information technology requires the alignment of the I/T strategies with the business strategies, something that was not done successfully in the past with traditional approaches. New methods and approaches are now available. The strategic alignment framework applies the Strategic Alignment Model to reflect the view that business success depends on the linkage of business strategy, information technology strategy, organizational infrastructure and processes, and I/T infrastructure and processes... We [have looked] at why it may not be sufficient to work on any one of these areas in isolation or to only harmonize business strategy and information technology. One reason is that, often, too much attention is placed on technology, rather than business, management, and organizational issues. The objective is to build an organizational structure and set of business processes that reflect the interdependence of enterprise strategy and information technology capabilities. The attention paid to the linkage of information technology to the enterprise can significantly affect the competitiveness and efficiency of the business. The essential issue is how information technology can enable the achievement of competitive and strategic advantage for the enterprise."

- Information technology

0 likes · Computer science
"Now let me pull out so we’re clear about the problem we all face and how we got here. The attacks against us in Rappler began 5 years ago when we demanded an end to impunity on two fronts: Duterte’s drug war and Mark Zuckerberg’s Facebook. Today, it has only gotten worse – and Silicon Valley’s sins came home to roost in the United States on January 6 with mob violence on Capitol Hill. What happens on social media doesn’t stay on social media. Online violence is real world violence. Social media is a deadly game for power and money, what Shoshana Zuboff calls surveillance capitalism, extracting our private lives for outsized corporate gain. Our personal experiences are sucked into a database, organized by AI, then sold to the highest bidder. Highly profitable micro-targeting operations are engineered to structurally undermine human will – a behavior modification system in which we are Pavlov’s dogs, experimented on in real time with disastrous consequences in countries like mine, Myanmar, India, Sri Lanka and so many more. These destructive corporations have siphoned money away from news groups and now pose a foundational threat to markets and elections. Facebook is the world’s largest distributor of news, and yet studies have shown that lies laced with anger and hate spread faster and further than facts on social media. These American companies controlling our global information ecosystem are biased against facts, biased against journalists. They are – by design – dividing us and radicalizing us. Without facts, you can’t have truth. Without truth, you can’t have trust. Without trust, we have no shared reality, no democracy, and it becomes impossible to deal with our world’s existential problems: climate, coronavirus, the battle for truth."

- Information technology

0 likes · Computer science
"Nowadays, mathematicians routinely use computers to solve problems, even great problems. Computers are good at arithmetic, but mathematics goes far beyond mere ‘sums’, so putting a problem on a computer is seldom straightforward. Often the hardest part of the work is to convert the problem into one that a computer calculation can solve, and even then the computer may struggle. Many of the great problems that have been solved recently involve little or no work with a computer. Fermat’s last theorem and the Poincaré conjecture are examples. When computers have been used to solve great problems, like the four colour theorem or the Kepler conjecture, the computer effectively plays the role of servant. But sometimes the roles are reversed, with mathematics as the servant of computer science. Most of the early work on computer design made good use of mathematical insights, for example the connection between Boolean algebra – an algebraic formulation of logic – and switching circuits, developed in particular by the engineer Claude Shannon, the inventor of information theory. Today, both practical and theoretical aspects of computers rely on the extensive use of mathematics, from many different areas. One of the Clay millennium problems lies in the borderland of mathematics and computer science. It can be viewed both ways: computer science as a servant of mathematics, and mathematics as a servant of computer science. What it requires, and is helping to bring about, is something more balanced: a partnership. The problem is about computer algorithms, the mathematical skeletons from which computer programs are made. The crucial concept here is how efficient the algorithm is: how many computational steps it takes to get an answer for a given amount of input data. In practical terms, this tells us how long the computer will take to solve a problem of given size."

- P versus NP problem

0 likes · Computer science
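The quote's "how many computational steps" framing is the heart of P versus NP: for problems in NP, a proposed answer can be checked quickly even when no quick way to *find* one is known. A minimal sketch using subset-sum (my choice of illustration; the passage names no particular problem):

```python
from itertools import combinations

def verify(nums, target, picked):
    """Polynomial-time check of a claimed solution (a list of indices)."""
    return (all(0 <= i < len(nums) for i in picked)
            and len(set(picked)) == len(picked)
            and sum(nums[i] for i in picked) == target)

def search(nums, target):
    """Brute-force search: may examine up to 2**len(nums) subsets."""
    for r in range(len(nums) + 1):
        for idx in combinations(range(len(nums)), r):
            if sum(nums[i] for i in idx) == target:
                return list(idx)
    return None

nums = [3, 34, 4, 12, 5, 2]
cert = search(nums, 9)
print(cert, verify(nums, 9, cert))  # → [2, 4] True
```

Verification runs in time linear in the certificate, while the search loop can touch every one of the 2**n subsets; whether every problem whose answers are this easy to check also admits a polynomial-time search algorithm is exactly the open question.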