"On two occasions, I have been asked [by members of Parliament], 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' ... I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question."
"It makes sense to examine Plato and pottery together in order to understand the Greek world, Descartes and the mechanical clock together in order to understand Europe in the seventeenth and eighteenth centuries. In the same way, it makes sense to regard the computer as a technological paradigm for the science, the philosophy, even the art of the coming generation."
"The clock has been the center of Western technology since its invention in the Middle Ages. Computer technology too finds it indispensable, although it has changed the clock from a mechanical device to a wholly electronic one."
"In Hollywood, they think drawn animation doesn't work anymore, computers are the way. They forget that the reason computers are the way is that Pixar makes good movies. So everybody tries to copy Pixar. They're relying too much on the technology and not enough on the artists."
"I have bought this wonderful machine – a computer. Now I am rather an authority on gods, so I identified the machine – it seems to me to be an Old Testament god with a lot of rules and no mercy."
"Trust The Computer. The Computer is your friend."
"Starting when computer technology first emerged during World War II and continuing into the 1960s, women made up most of the computing workforce. By 1970, however, women only accounted for 13.6% of bachelor's in computer science graduates. In 1984 that number rose to 37%, but it has since declined to 18% -- around the same time personal computers started showing up in homes. According to NPR, personal computers were marketed almost exclusively to men and families were more likely to buy computers for boys than girls."
"To an outsider, the most significant innovation in the global warming controversy is the overt reliance that is being placed on models. Back in the days of nuclear winter, computer models were invoked to add weight to a conclusion: 'These results are derived with the help of a computer model.' But now, large-scale computer models are seen as generating data in themselves. No longer are models judged by how well they reproduce data from the real world—increasingly, models provide the data. As if they were themselves a reality. And indeed they are, when we are projecting forward."
"This fascination with computer models is something I understand very well. Richard Feynman called it a disease. I fear he is right."
"If you don't know anything about computers, just remember that they are machines that do exactly what you tell them but often surprise you in the result."
"The simple fact is that without supporting directives or a mechanism for feedback, security is defined differently by each person and verified by no one. There is no metric for compliance with a "culture", and a "culture of security" is overridden by a culture of "get the job done" every time. If there are rules, write them down. If technology is put in place to implement or monitor the rules, write that down too. If people break the rules, follow up. If the rules prevent legitimate business from getting done, change them. It's that simple."
"It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one tiny piece of space/time is going to do? So I have often made the hypotheses that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the chequer board with all its apparent complexities."
"Spock: Computers make excellent and efficient servants, but I have no wish to serve under them. Captain, a starship also runs on loyalty to one man, and nothing can replace it or him."
""So computers are tools of the devil?" thought Newt. He had no problem believing it. Computers had to be the tools of somebody, and all he knew for certain was that it definitely wasn't him."
"Where a calculator like the ENIAC today is equipped with 18,000 vacuum tubes and weighs 30 tons, computers in the future may have only 1,000 vacuum tubes and perhaps weigh only 1½ tons."
"What do such machines really do? They increase the number of things we can do without thinking. Things we do without thinking — there's the real danger."
"Computers are good at following instructions, but not at reading your mind."
"These machines have no common sense; they have not yet learned to "think," and they do exactly as they are told, no more and no less. This fact is the hardest concept to grasp when one first tries to use a computer."
"The Analytical Engine has no pretentions whatever to originate anything. It can do whatever we know how to order it to perform."
"Dare to be gorgeous and unique. But don't ever be cryptic or otherwise unfathomable. Make it unforgettably great."
"The Joker: Do you know how many times we've come close to world war three over a flock of geese on a computer screen?"
"The reality is that future cyber warfare will likely resemble medieval siege warfare – as critical infrastructure and vital services to a city's population are shut-down and locked-out as a result of a ransomware attack."
"Today's computers are not even close to a 4-year-old human in their ability to see, talk, move, or use common sense. One reason, of course, is sheer computing power. It has been estimated that the information processing capacity of even the most powerful supercomputer is equal to the nervous system of a snail—a tiny fraction of the power available to the supercomputer inside [our] skull."
"But if these machines were ingenious, what shall we think of the calculating machine of Mr. Babbage? What shall we think of an engine of wood and metal which can not only compute astronomical and navigation tables to any given extent, but render the exactitude of its operations mathematically certain through its power of correcting its possible errors? What shall we think of a machine which can not only accomplish all this, but actually print off its elaborate results, when obtained, without the slightest intervention of the intellect of man?"
"Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway."
"An adversary capable of implanting the right virus or accessing the right terminal can cause massive damage."
"By the '80s, the early pioneering work done by female programmers had mostly been forgotten. In contrast, Hollywood was putting out precisely the opposite image: Computers were a male domain. In hit movies like "Revenge of the Nerds," "Weird Science," "Tron," "WarGames" and others, the computer nerds were nearly always young white men. Video games, a significant gateway activity that led to an interest in computers, were pitched far more often at boys, as research in 1985 by Sara Kiesler, a professor at Carnegie Mellon, found. "In the culture, it became something that guys do and are good at," says Kiesler, who is also a program manager at the National Science Foundation. "There were all kinds of things signaling that if you don't have the right genes, you're not welcome.""
"A computer would deserve to be called intelligent if it could deceive a human into believing that it was human."
"Computers with their binary on–off logic seem to appeal to the military mind. This is because the military, in order to counter the inherent confusion and danger of war, is forever seeking ways to make communications as terse and unambiguous as humanly possible. Computers by their very nature do just that. Had they only been able to stand at attention and salute, in many ways they would have made ideal soldiers."
"With computers acting as the stimulus, the theory of war was assimilated into that of microeconomics. . . . Instead of evaluating military operations by their product – that is, victory – calculations were cast in terms of input–output and cost effectiveness. Since intuition was replaced by calculation, and since the latter was to be carried out with the aid of computers, it was necessary that all the phenomena of war be reduced to quantitative form. Consequently everything that could be quantified was, while everything that could not be tended to be thrown onto the garbage heap."
"It used to be said of a man who had suffered a catastrophic setback in his line of work that he had been handed his head on a platter. We are being handed our heads with tweezers now."
"The danger of computers becoming like humans is not as great as the danger of humans becoming like computers."
"Anyone who slaps a 'this page is best viewed with Browser X' label on a Web page appears to be yearning for the bad old days, before the Web, when you had very little chance of reading a document written on another computer, another word processor, or another network."
"Einstein argued that there must be simplified explanations of nature, because God is not capricious or arbitrary. No such faith comforts the software engineer."
"Interviewer: Is studying computer science the best way to prepare to be a programmer? Bill Gates: No. The best way to prepare is to write programs, and to study great programs that other people have written. In my case, I went to the garbage cans at the Computer Science Center and I fished out listings of their operating system. You got to be willing to read other people's code, then write your own, then have other people review your code. You've got to want to be in this incredible feedback loop where you get the world-class people to tell you what you're doing wrong."
"Around computers it is difficult to find the correct unit of time to measure progress. Some cathedrals took a century to complete. Can you imagine the grandeur and scope of a program that would take as long?"
"A refund for defective software might be nice, except it would bankrupt the entire software industry in the first year."
"The computer was a fairly new thing in 1971, and most people were not well acquainted with it. There was only one five-week course in programming at Carleton. I had taken the class when I was a junior, and the next trimester, I was recruited to be a lab assistant to help other students who were taking the course. I was fascinated by the power of the computer to not only calculate, but also to interact with written language. I had been thinking about writing a program to interact with a human through language, but the content of such a program remained a mystery to me."
"The only legitimate use of a computer is to play games."
"The primary duty of an exception handler is to get the error out of the lap of the programmer and into the surprised face of the user."
"I think there is a world market for maybe five computers."
""I refuse to prove that I exist," says God, "for proof denies faith, and without faith, I am nothing." "Oh," says man, "but the Babel fish is a dead give-away, isn't it? It proves You exist, and so therefore You don't." "Oh, I hadn't thought of that," says God, who promptly vanishes in a puff of logic. "Ah, that was easy," says man, and for an encore goes on to prove that black is white, and gets killed on the next zebra crossing. Most leading theologians claim that this argument is a load of dingo's kidneys."
"All men are mortal. Socrates was mortal. Therefore, all men are Socrates."
"You can prove anything you want by coldly logical reason—if you pick the proper postulates. We have ours and Cutie [robot QT-1] has his." "Then let’s get at those postulates in a hurry. The storm’s due tomorrow." Powell sighed wearily. "That’s where everything falls down. Postulates are based on assumptions and adhered to by faith. Nothing in the Universe can shake them. ..."
"Aristotle is noted for his writings on logic, physics, biology, psychology, metaphysics, ethics, politics, and literature. These works are marked by sober tone, subtle analysis, and empirical accuracy. In logic, he produced the first textbooks ever written. They deal with some of the problems which Plato had suggested but not considered in detail."
"In the logic of science there is a principle as important as that of parsimony: it is that of sufficient reason. The former directs us to look for simplest causes, the latter cautions us not to simplify so far that the explanation is inadequate to the facts to be explained. ... Parsimony is not itself a simple criterion of a good methodology; we cannot simply count the factors of explanation and say that the theory containing the smallest number is the best. The ideal of parsimony cannot be expressed without the proviso that the conditions for which it is a norm shall themselves be adequate."
"LOGIC, n. The art of thinking and reasoning in strict accordance with the limitations and incapacities of the human misunderstanding. The basic of logic is the syllogism, consisting of a major and a minor premise and a conclusion -- thus: Major Premise: Sixty men can do a piece of work sixty times as quickly as one man. Minor Premise: One man can dig a post-hole in sixty seconds; therefore -- Conclusion: Sixty men can dig a post-hole in one second."
"R. A. Fisher, J. Neyman, R. von Mises, W. Feller, and L. J. Savage denied vehemently that probability theory is an extension of logic, and accused Laplace and Jeffreys of committing metaphysical nonsense for thinking that it is."
"No, no, you're not thinking; you're just being logical."
"If the world were a logical place, men would ride side saddle."
"Logic is a large drawer, containing some useful instruments, and many more that are superfluous. A wise man will look into it for two purposes, to avail himself of those instruments that are really useful, and to admire the ingenuity with which those that are not so, are assorted and arranged."
"Contrariwise," continued Tweedledee, "if it was so, it might be; and if it were so, it would be; but as it isn't, it ain't. That's logic."
"You can only find truth with logic if you have already found truth without it."
"Utility and necessity of logic - It would be a mistake to imagine that, above and beyond what is called the Natural Logic of sound common sense, the study of the Science of Logic is absolutely necessary for right reasoning. Men reasoned rightly before Aristotle ever formulated a canon of logic. It was, in fact, by an analysis of such reasonings that he discovered those canons: they could never have been discovered otherwise. Here as elsewhere the art came before the science; theory followed practice. A man may reason rightly without knowing a single rule of the syllogism; or, conversely, he may know all the details of logic and be an indifferent guide to truth just as a first-rate geometrician may be a failure as an engineer. But still, just as his knowledge of geometry will enable the geometrician to detect the defects in a piece of engineering, so too will an explicit knowledge of the canons of reasoning enable us to discover more readily where the fallacy of a misleading argument lies. Without professing to guard us infallibly from error, logic familiarizes us with the rules and canons to which right reasoning processes must conform, and with the hidden fallacies and pitfalls to which such processes are commonly exposed."
"The real trouble with this world of ours is not that it is an unreasonable world, nor even that it is a reasonable one. The commonest kind of trouble is that it is nearly reasonable, but not quite. Life is not an illogicality; yet it is a trap for logicians. It looks just a little more mathematical and regular than it is; its exactitude is obvious, but its inexactitude is hidden; its wildness lies in wait."
"The morbid logician seeks to make everything lucid, and succeeds in making everything mysterious."
"We know that mathematicians care no more for logic than logicians for mathematics. The two eyes of exact science are mathematics and logic: the mathematical sect puts out the logical eye, the logical sect puts out the mathematical eye; each believing that it can see better with one eye than with two."
"Peirce wrote as a logician and James as a humanist."
"It is with logic as it is with other sciences. They draw wisdom from the mysterious source of plain experience. Agriculture, e. g., aims to teach the farmer how to cultivate the soil; but fields were tilled long before any agricultural college had begun its lectures. In the same way human beings think without ever having heard of logic. But by practice they improve their innate faculty of thought, they make progress, they gradually learn to make better use of it. Finally, just as the farmer arrives at the science of agriculture, so the thinker arrives at logic, acquires a clear consciousness of his faculty of thought and a professional dexterity in applying it."
"Adherents of formal logic may be compared to a maker of porcelain dishes who would contend that he was simply paying attention to the form of his dishes, pots, and vases, but that he did not have anything to do with the raw material."
"“It’s logical,” Sario said. “Lots of people don’t like coping with logic when it dictates hard decisions. That’s a problem with people, not logic.”"
"These, briefly, are the key elements of the stereotype: logic cripples and constrains; it forces one into narrow and mechanical modes of thought that cut one off from a vast range of superior thoughts, feelings and perceptions; logic is an enemy of wit and humor (Mr. Spock's face was always an impassive mask); logic makes us dull and pedantic (Mr. Spock always spoke in a monotone); logic presupposes a simple-minded, black-and-white, yes-no conception of the world. ... Logic misses the point of half the things we ordinarily say and cannot match the insight of the humblest person's common sense."
"From a drop of water, a logician could infer the possibility of an Atlantic or a Niagara without having seen or heard of one or the other."
"The first question we should face is: What is the aim of a physical theory? To this question diverse answers have been made, but all of them may be reduced to two main principles: "A physical theory," certain logicians have replied, "has for its object the explanation of a group of laws experimentally established." "A physical theory," other thinkers have said, "is an abstract system whose aim is to summarize and classify logically a group of experimental laws without claiming to explain these laws... Now these two questions — Does there exist a material reality distinct from sensible appearances? and What is the nature of reality? — do not have their source in experimental method, which is acquainted only with sensible appearances and can discover nothing beyond them. The resolution of these questions transcends the methods used by physics; it is the object of metaphysics. Therefore, if the aim of physical theories is to explain experimental laws, theoretical physics is not an autonomous science; it is subordinate to metaphysics... Now, to make physical theories depend on metaphysics is surely not the way to let them enjoy the privilege of universal consent."
"Logic is usually understood nowadays as a study of certain formal systems, though in former times there were philosophers who held that the subject matter of logic was the formal rules of human thought."
"The want of logic annoys. Too much logic bores. Life eludes logic, and everything that logic alone constructs remains artificial and forced."
"To find themselves utterly alone at night where company is desirable and expected makes some people fearful; but a case more trying by far to the nerves is to discover some mysterious companionship when intuition, sensation, memory, analogy, testimony, probability, induction — every kind of evidence in the logician's list — have united to persuade consciousness that it is quite in isolation."
"Logic by definition is that which makes sense: nothing more, nothing less. No miracles. No supernatural hocus-pocus. Everything that takes place, every event, every effect, is logically caused by something, something which preceded it in time, and which provided the physico-chemical causative chain that resulted in the effect. These premises are the foundations of our intellectual existence."
"Logic is a feeble reed, friend. "Logic" proved that airplanes can't fly and that H-bombs won't work and that stones don't fall out of the sky. Logic is a way of saying that anything which didn't happen yesterday won't happen tomorrow."
"To understand this for sense it is not required that a man should be a geometrician or a logician, but that he should be mad."
"Logic is logic. That's all I say."
"Logic is one thing and commonsense another."
"But in this age, logic was a flame that must be frequently starved of fuel."
"I have expos'd myself to the enmity of all metaphysicians, logicians, mathematicians, and even theologians; and can I wonder at the insults I must suffer?"
"Logician: A cat has four paws. Old Gentleman: My dog had four paws. Logician: Then it's a cat. Old Gentleman: So my dog is a cat? Logician: And the contrary is also true."
"Moreover, growing uncertainty surrounded even the one tool which the academic philosophers felt they could trust: logic. Two centuries before, Kant had asserted in his Logik (1800): ‘There are but few sciences that can come into a permanent state, which admits of no further alteration. To these belong Logic … We do not require any further discoveries in Logic, since it contains merely the form of thought.’ As late as 1939, a British philosopher asserted: ‘Dictators may be powerful today, but they cannot alter the laws of logic, nor indeed can even God do so.’ Thirteen years later the American philosopher Willard Quine calmly accepted that the definition of logic was undergoing fundamental change: ‘What difference is there in principle between such a shift and the shift whereby Kepler succeeded Ptolemy, or Einstein Newton, or Darwin Aristotle?’ In the decades that followed, many rival systems to classical logic emerged: Bochvar’s many-valued logic, new systems by Birkhoff and Destouches-Février and Reichenbach, minimal logic, deontic logics, tense logics. It became possible to speak of empirical proof or disproof of logic."
"What would be the consequences for the theory of truth, asked one worried logician, ‘… of the adoption of a non-standard system’? Another, observing systems of modal logic, observed: ‘One gets an uneasy feeling as one discerns and studies more of the systems belonging to this family that it is literally a family, and has the power of reproducing and multiplying, proliferating new systems [of logic] without limit.’ In a world in which even the rules of logic shifted and disintegrated, it is not surprising that modern times did not develop in ways the generation of 1920 would have considered ‘logical’."
"Logic hasn't wholly dispelled the society of witches and prophets and sorcerers and soothsayers."
"Logic is neither a science nor an art, but a dodge."
"Logic is concerned with arguments, good and bad. With the docile and the reasonable, arguments are sometimes useful in settling disputes. With the reasonable, this utility attaches only to good arguments. It is the logician's business to serve the reasonable. Therefore, in the realm of arguments, it is the logician who distinguishes good from bad."
"The book, as it stands, seems to me to be one of the most frightful muddles I have ever read, with scarcely a sound proposition in it beginning with page 45, and yet it remains a book of some interest, which is likely to leave its mark on the mind of the reader. It is an extraordinary example of how, starting with a mistake, a remorseless logician can end up in bedlam."
"Metaphysics may be, after all, only the art of being sure of something that is not so, and logic only the art of going wrong with confidence."
""There is one basis of science," says Descartes, "one test and rule of truth, namely, that whatever is clearly and distinctly conceived is true." A profound psychological mistake. It is true only of formal logic, wherein the mind never quits the sphere of its first assumptions to pass out into the sphere of real existences; no sooner does the mind pass from the internal order to the external order, than the necessity of verifying the strict correspondence between the two becomes absolute. The Ideal Test must be supplemented by the Real Test, to suit the new conditions of the problem."
"Anyone who has heard (Jacques Derrida) lecture in French knows that he is more performance artist than logician. His flamboyant style--using free association, rhymes and near-rhymes, puns, and maddening digressions--is not just a vain pose (though it is surely that). It reflects what he calls a self-conscious "acommunicative strategy" for combating logocentrism."
"The contemporary mathematical and symbolic logic is certainly very different from its classical predecessor, but they share the radical opposition to dialectical logic. In terms of this opposition, the old and the new formal logic express the same mode of thought. It is purged from that “negative” which loomed so large at the origins of logic and of philosophic thought—the experience of the denying, deceptive, falsifying power of the established reality. And with the elimination of this experience, the conceptual effort to sustain the tension between “is” and “ought”, and to subvert the established universe of discourse in the name of its own truth is likewise eliminated from all thought which is to be objective, exact, and scientific. For the scientific subversion of the immediate experience which establishes the truth of science as against that of immediate experience does not develop the concepts which carry in themselves the protest and the refusal. The new scientific truth which they oppose to the accepted one does not contain in itself the judgment that condemns the established reality. ... In contrast, dialectical thought is and remains unscientific to the extent to which it is such judgment."
"This fallacy [appeal to authority] is not in itself an error; it is impossible to learn much in today's world without letting somebody else crunch the numbers and offer us explanations. And teachers are sources of necessary information. But how we choose our "authorities" and place a value on such information, is just another skill rarely taught in our education systems. It's little wonder that to most folk, sound bites and talking heads are enough to count as experts. […] Teaching is reinforcing the appeal to authority, where anybody who seems more intelligent than you must ultimately be right. […] We educators must simply role-model critical thinking. […] Educators themselves have to be prepared to show that “evidence” and “answers” are two separate things by firmly believing that, themselves."
"The pedant and the priest have always been the most expert of logicians—and the most diligent disseminators of nonsense and worse."
"Logic, like whiskey, loses its beneficial effect when taken in too large quantities."
"Logic is a systematic method of coming to the wrong conclusion with confidence."
"Able logicians have a trick of being unutterably wrong when they come to write about life: the instinctive, intuitive sense of human values often fails them quite. Logic is more likely to make a man a fool than the lack of logic is."
"It might ... have been supposed that logicians and psychologists would have devoted special attention to meaning, since it is so vital for all the issues with which they are concerned. But that this is not the case will be evident to anyone who studies the Symposium in Mind (October 1920 and following numbers) on "The Meaning of 'Meaning.'""
"Logicians tell us that a system of ideas containing a contradiction can be used to deduce any statement whatsoever, no matter how absurd."
"Logic and mathematics seem to be the only domains where self-evidence manages to rise above triviality; and this it does, in those domains, by a linking of self-evidence on to self-evidence in the chain reaction known as proof."
"Three conceptions are perpetually turning up at every point in every theory of logic, and in the most rounded systems they occur in connection with one another. They are conceptions so very broad and consequently indefinite that they are hard to seize and may be easily overlooked. I call them the conceptions of First, Second, Third. First is the conception of being or existing independent of anything else. Second is the conception of being relative to, the conception of reaction with, something else. Third is the conception of mediation, whereby a first and second are brought into relation."
"Logical analysis applied to mental phenomena shows that there is but one law of mind, namely that ideas tend to spread continuously and to affect certain others which stand to them in a peculiar relation of affectibility. In this spreading they lose intensity, and especially the power of affecting others, but gain generality and become welded with other ideas."
"A certain maxim of Logic which I have called Pragmatism has recommended itself to me for diverse reasons and on sundry considerations."
"It is by logic that we prove, but by intuition that we discover. To know how to criticize is good, to know how to create is better."
"The utility of a science which enables men to take cognizance of the travellers on the mind's highway, and excludes those disorderly interlopers, verbal fallacies, needs but small attestation. Its searching penetration by definition alone, before which even mathematical precision fails, would especially commend it to those whom the abstruseness of the study does not terrify, and who recognise the valuable results which must attend discipline of mind. Like a medicine, though not a panacea for every ill, it has the health of the mind for its aim, but requires the determination of a powerful will to imbibe its nauseating yet wholesome influence: it is no wonder therefore that puny intellects, like weak stomachs, abhor and reject it."
"Of course I'm inconsistent! Only logicians and cretins are consistent!"
"Mathematics and logic, historically speaking, have been entirely distinct studies. Mathematics has been connected with science, logic with Greek. But both have developed in modern times: logic has become more mathematical and mathematics has become more logical. The consequence is that it has now become wholly impossible to draw a line between the two; in fact, the two are one. They differ as boy and man: logic is the youth of mathematics and mathematics is the manhood of logic. This view is resented by logicians who, having spent their time in the study of classical texts, are incapable of following a piece of symbolic reasoning, and by mathematicians who have learnt a technique without troubling to inquire into its meaning or justification. Both types are now fortunately growing rarer. So much of modern mathematical work is obviously on the border-line of logic, so much of modern logic is symbolic and formal, that the very close relationship of logic and mathematics has become obvious to every instructed student. The proof of their identity is, of course, a matter of detail: starting with premises which would be universally admitted to belong to logic, and arriving by deduction at results which as obviously belong to mathematics, we find that there is no point at which a sharp line can be drawn, with logic to the left and mathematics to the right. If there are still those who do not admit the identity of logic and mathematics, we may challenge them to indicate at what point, in the successive definitions and deductions of Principia Mathematica, they consider that logic ends and mathematics begins. It will then be obvious that any answer must be quite arbitrary."
"The question of "unreality," which confronts us at this point, is a very important one. Misled by grammar, the great majority of those logicians who have dealt with this question have dealt with it on mistaken lines. They have regarded grammatical form as a surer guide in analysis than, in fact, it is. And they have not known what differences in grammatical form are important."
"All traditional logic habitually assumes that precise symbols are being employed. It is therefore not applicable to this terrestrial life but only to an imagined celestial existence... logic takes us nearer to heaven than other studies."
"I once received a letter from an eminent logician, Mrs. Christine Ladd Franklin, saying that she was a solipsist, and was surprised that there were no others. Coming from a logician, this surprise surprised me."
"The apparent world goes through developments which are the same as those the logician goes through if he starts from Pure Being and travels on to the Absolute Idea. [...] Why the world should go through this logical evolution is not clear; one is tempted to suppose that the Absolute Idea did not quite understand itself at first, and made mistakes when it tried to embody itself in events. But this, of course, was not what Hegel would have said."
"Pure logic is the ruin of the spirit."
"A crisis in doctrine occurred when they discovered that the square root of two was irrational. That is: the square root of two could not be represented as the ratio of two whole numbers, no matter how big they were. "Irrational" originally meant only that. That you can't express a number as a ratio. But for the Pythagoreans it came to mean something else, something threatening, a hint that their world view might not make sense, the other meaning of "irrational"."
"When emotion brings us ghosts from the past, only logic can root us in the present."
"[My aim is] to design logic as a calculating discipline, especially to give access to the exact handling of relative concepts, and, from then on, by emancipation from the routine claims of natural language, to withdraw any fertile soil from "cliché" in the field of philosophy as well. This should prepare the ground for a scientific universal language that, widely differing from linguistic efforts like Volapük [a universal language like Esperanto, very popular in Germany at the time], looks more like a sign language than like a sound language."
"Conceptual graphs are a system of logic based on the existential graphs of Charles Sanders Peirce and the semantic networks of artificial intelligence. The purpose of the system is to express meaning in a form that is logically precise, humanly readable, and computationally tractable."
"Logic is the beginning of wisdom, not the end."
"Logic is in the eye of the logician."
"An idea starts to be interesting when you get scared of taking it to its logical conclusion."
"It takes extraordinary wisdom and self-control to accept that many things have a logic we do not understand that is smarter than our own."
"Poetry — No definition of poetry is adequate unless it be poetry itself. The most accurate analysis by the rarest wisdom is yet insufficient, and the poet will instantly prove it false by setting aside its requisitions. It is indeed all that we do not know. The poet does not need to see how meadows are something else than earth, grass, and water, but how they are thus much. He does not need discover that potato blows are as beautiful as violets, as the farmer thinks, but only how good potato blows are. The poem is drawn out from under the feet of the poet, his whole weight has rested on this ground. It has a logic more severe than the logician's. You might as well think to go in pursuit of the rainbow, and embrace it on the next hill, as to embrace the whole of poetry even in thought."
"The moment we stop assuming that the ideas of any milieu form static 'propositional systems', and recognize that they constitute historically developing 'conceptual populations', we are free to abandon also the philosophers' traditional assumption that rationality is a sub-species of logicality. ...In non-intellectual contexts... we judge the rationality of a man's conduct, not by how he habitually behaves, but rather how far he modifies his behaviour in new and unfamiliar situations, and it is arguable that the rationality of intellectual performances should be judged, correspondingly, by considering, not the internal consistency of a man's habitual concepts and beliefs, but rather the manner in which he modifies this intellectual position in the face of new and unforeseen experiences."
"Logic tends to reduce everything to identities and genera, to each representation having no more than one single and self-same content in whatever place, time, or relation it may occur to us. And there is nothing that remains the same for two successive moments of its existence. My idea of God is different each time that I conceive it. Identity, which is death, is the goal of the intellect."
"As the most prudent logicians might venture to deduce from a skein of wool the probable existence of a sheep; so you, from the raw stuff of perception, may venture to deduce a universe which transcends the reproductive powers of your loom."
"Logic is a law which must be obeyed, and man realizes himself only in so far as he is logical. He finds himself in cognition."
"Memory, then, is a necessary part of the logical faculty. ... The proposition A = A must have a psychological relation to time, otherwise it would be At1 = At2."
"Roughly speaking: to say of two things that they are identical is nonsense, and to say of one thing that it is identical with itself is to say nothing."
"[Fuzzy logic is] a logic whose distinguishing features are (1) fuzzy truth-values expressed in linguistic terms, e.g., true, very true, more or less true, or somewhat true, false, not very true and not very false, etc.; (2) imprecise truth tables; and (3) rules of inference whose validity is relative to a context rather than exact."
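Zadeh's linguistic truth-values can be given a concrete reading: in the fuzzy-logic literature the hedge "very" is conventionally modeled as squaring a truth-value in [0, 1] (concentration) and "more or less" as taking its square root (dilation). A minimal sketch under that convention (the function names are illustrative, not from any particular library):

```python
import math

# Hedges applied to a fuzzy truth-value t in [0, 1], following the common
# convention: "very" = concentration (t^2), "more or less" = dilation (sqrt(t)).

def very(t: float) -> float:
    # A statement that is "true" to degree t is "very true" to a lesser degree.
    return t ** 2

def more_or_less(t: float) -> float:
    # Dilation softens the claim, so the degree goes up.
    return math.sqrt(t)

t = 0.8                      # degree to which some statement is "true"
print(very(t))               # ~0.64: "very true" holds to a lesser degree
print(more_or_less(t))       # ~0.894: "more or less true" holds to a greater degree
```

The point of the sketch is only that "very true" and "more or less true" become computable values rather than separate symbols, which is what makes the linguistic truth-values in the quote tractable.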
"The cleaner and nicer the program, the faster it's going to run. And if it doesn't, it'll be easy to make it fast."
"On two occasions I have been asked, – "Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?" In one case a member of the Upper, and in the other a member of the Lower House put this question. I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question."
"Playing with pointers is like playing with fire. Fire is perhaps the most important tool known to man. Carefully used, fire brings enormous benefits; but when fire gets out of control, disaster strikes."
"The programmer, like the poet, works only slightly removed from pure thought-stuff. He builds his castles in the air, from air, creating by exertion of the imagination. Few media of creation are so flexible, so easy to polish and rework, so readily capable of realizing grand conceptual structures.... Yet the program construct, unlike the poet's words, is real in the sense that it moves and works, producing visible outputs separate from the construct itself. […] The magic of myth and legend has come true in our time. One types the correct incantation on a keyboard, and a display screen comes to life, showing things that never were nor could be."
"And programming computers was so fascinating. You create your own little universe, and then it does what you tell it to do."
"Computer programs are the most intricate, delicately balanced and finely interwoven of all the products of human industry to date. They are machines with far more moving parts than any engine: the parts don't wear out, but they interact and rub up against one another in ways the programmers themselves cannot predict."
"Applications programming is a race between software engineers, who strive to produce idiot-proof programs, and the universe which strives to produce bigger idiots. So far the Universe is winning."
"The effective exploitation of his powers of abstraction must be regarded as one of the most vital activities of a competent programmer."
"Computers are man's attempt at designing a cat: it does whatever it wants, whenever it wants, and rarely ever at the right time."
"If you lie to the computer, it will get you."
"There is no programming language, no matter how structured, that will prevent programmers from making bad programs."
"If I ask another professor what he teaches in the introductory programming course, whether he answers proudly "Pascal" or diffidently "FORTRAN," I know that he is teaching a grammar, a set of semantic rules, and some finished algorithms, leaving the students to discover, on their own, some process of design."
"No matter how slick the demo is in rehearsal, when you do it in front of a live audience, the probability of a flawless presentation is inversely proportional to the number of people watching, raised to the power of the amount of money involved."
"[This] reminds me of a quotation from somebody that, whenever he tried to explain the logical structure of a programming language to a programmer, it was like a cat trying to explain to a fish what it feels like to be wet."
"There are two ways of constructing a software design. One way is to make it so simple that there are obviously no deficiencies. And the other way is to make it so complicated that there are no obvious deficiencies."
"To me programming is more than an important practical art. It is also a gigantic undertaking in the foundations of knowledge."
"Programming: when the ideas turn into the real things."
"Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves."
"Premature optimization is the root of all evil."
"The most important thing in a programming language is the name. A language will not succeed without a good name. I have recently invented a very good name, and now I am looking for a suitable language."
"Present-day computers are designed primarily to solve preformulated problems or to process data according to predetermined procedures. The course of the computation may be conditional upon results obtained during the computation, but all the alternatives must be foreseen in advance. … The requirement for preformulation or predetermination is sometimes no great disadvantage. It is often said that programming for a computing machine forces one to think clearly, that it disciplines the thought process. If the user can think his problem through in advance, symbiotic association with a computing machine is not necessary."
"One in a million is next Tuesday."
"He who hasn't hacked assembly language as a youth has no heart. He who does as an adult has no brain."
"Languages shape the way we think, or don't."
"Computer programming is tremendous fun. Like music, it is a skill that derives from an unknown blend of innate talent and constant practice. Like drawing, it can be shaped to a variety of ends – commercial, artistic, and pure entertainment. Programmers have a well-deserved reputation for working long hours, but are rarely credited with being driven by creative fevers. Programmers talk about software development on weekends, vacations, and over meals not because they lack imagination, but because their imagination reveals worlds that others cannot see."
"The best book on programming for the layman is Alice in Wonderland, but that's because it's the best book on anything for the layman."
"Computer Science is embarrassed by the computer."
"Prolonged contact with the computer turns mathematicians into clerks and vice versa."
"Structured Programming supports the law of the excluded muddle."
"There are two ways to write error-free programs; only the third one works."
"When someone says: "I want a programming language in which I need only say what I wish done", give him a lollipop."
"Software and cathedrals are much the same – first we build them, then we pray."
"Why bother with subroutines when you can type fast?"
"A Netscape engineer who shan't be named once passed a pointer to JavaScript, stored it as a string and later passed it back to C, killing 30."
"Real Programmers always confuse Christmas and Halloween because Oct31 == Dec25."
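The arithmetic behind the joke above: 31 read in octal (base 8) is 3×8 + 1 = 25 in decimal, so "Oct 31" and "Dec 25" name the same number. A one-line check:

```python
# "Oct 31" is the literal 31 in octal; "Dec 25" is 25 in decimal.
# 0o31 == 3*8 + 1 == 25, so the comparison holds.
print(0o31 == 25)  # True
```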
"Real programmers don't comment their code. If it was hard to write it should be hard to understand."
"Anyone even peripherally involved with computers agrees that object-oriented programming (OOP) is the wave of the future. Maybe one in 50 of them has actually tried to use OOP – which has a lot to do with its popularity."
"Don't get suckered in by the comments … they can be terribly misleading."
"The three chief virtues of a programmer are: Laziness, Impatience and Hubris."
"One day my daughter came in, looked over my shoulder at some Perl 4 code, and said, "What is that, swearing?""
"Weinberg's Second Law: If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization."
"Asking for efficiency and adaptability in the same program is like asking for a beautiful and modest wife. Although beauty and modesty have been known to occur in the same woman, we'll probably have to settle for one or the other. At least that's better than neither."
"All problems in computer science can be solved by another level of indirection."
"The main activity of programming is not the origination of new independent programs, but in the integration, modification, and explanation of existing ones."
"Zawinski's Law: Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can."
"If you are not willing to put a fragment of your soul into the code, don’t program. Do pursue a different path instead…"
"It might seem strange now that they were happy to take on a random applicant with absolutely no experience in computer programming. But in those days, almost nobody had any experience writing code. The discipline did not yet really exist; there were vanishingly few college courses in it, and no majors. (Stanford, for example, didn’t create a computer-science department until 1965.) So instead, institutions that needed programmers just used aptitude tests to evaluate applicants’ ability to think logically."
"Almost 200 years ago, the first person to be what we would now call a coder was, in fact, a woman: Lady Ada Lovelace. As a young mathematician in England in 1833, she met Charles Babbage, an inventor who was struggling to design what he called the Analytical Engine, which would be made of metal gears and able to execute if/then commands and store information in memory. Enthralled, Lovelace grasped the enormous potential of a device like this. A computer that could modify its own instructions and memory could be far more than a rote calculator, she realized. To prove it, Lovelace wrote what is often regarded as the first computer program in history, an algorithm with which the Analytical Engine would calculate the Bernoulli sequence of numbers. (She wasn’t shy about her accomplishments: “That brain of mine is something more than merely mortal; as time will show,” she once wrote.)"
"[A]s programming enjoyed its first burst of cultural attention, so many students were racing to enroll in computer science that universities ran into a supply problem: They didn’t have enough professors to teach everyone. Some added hurdles, courses that students had to pass before they could be accepted into the computer-science major. Punishing workloads and classes that covered the material at a lightning pace weeded out those who didn’t get it immediately. All this fostered an environment in which the students mostly likely to get through were those who had already been exposed to coding — young men, mostly. “Every time the field has instituted these filters on the front end, that’s had the effect of reducing the participation of women in particular,” says Eric S. Roberts, a longtime professor of computer science, now at Reed College, who first studied this problem and called it the “capacity crisis.”"
"In 1991, Ellen Spertus, now a computer scientist at Mills College, published a report on women's experiences in programming classes. She cataloged a landscape populated by men who snickered about the presumed inferiority of women and by professors who told female students that they were "far too pretty" to be studying electrical engineering; when some men at Carnegie Mellon were asked to stop using pictures of naked women as desktop wallpaper on their computers, they angrily complained that it was censorship of the sort practiced by "the Nazis or the Ayatollah Khomeini." As programming was shutting its doors to women in academia, a similar transformation was taking place in corporate America. The emergence of what would be called "culture fit" was changing the who, and the why, of the hiring process. Managers began picking coders less on the basis of aptitude and more on how well they fit a personality type: the acerbic, aloof male nerd. The shift actually began far earlier, back in the late '60s, when managers recognized that male coders shared a growing tendency to be antisocial isolates, lording their arcane technical expertise over that of their bosses. Programmers were "often egocentric, slightly neurotic," as Richard Brandon, a well-known computer-industry analyst, put it in an address at a 1968 conference, adding that "the incidence of beards, sandals and other symptoms of rugged individualism or nonconformity are notably greater among this demographic." In addition to testing for logical thinking, as in Mary Allen Wilkes's day, companies began using personality tests to select specifically for these sorts of caustic loner qualities. "These became very powerful narratives," says Nathan Ensmenger, a professor of informatics at Indiana University, who has studied this transition. The hunt for that personality type cut women out. Managers might shrug and accept a man who was unkempt, unshaven and surly, but they wouldn't tolerate a woman who behaved the same way. Coding increasingly required late nights, but managers claimed that it was too unsafe to have women working into the wee hours, so they forbade them to stay late with the men."
"Information can tell us everything. It has all the answers. But they are answers to questions we have not asked, and which doubtless don’t even arise."
"In 2007, for the first time ever, more information was generated in one year than had been produced in the entire previous five thousand years - the period since the invention of writing."
"I... refer to the... Waynflete Lectures given by... E. D. Adrian, on The Physical Background of Perception because the results of physiological investigations seem... in perfect agreement with my suggestion about the meaning of reality in physics. The messages which the brain receives have not the least similarity with the stimuli. They consist in pulses of given intensities and frequencies, characteristic for the transmitting nerve-fiber, which ends in a definite place in the cortex. All the brain 'learns' (I use... the objectionable language of the 'disquieting figure of a little hobgoblin sitting... aloft in the ') is a distribution or a 'map' of pulses. From this information it produces the image of the world by a process which can metaphorically be called a consummate place of combinatorial mathematics: it sorts out of the maze of indifferent and varying signals invariant shapes and relations which form the world of ordinary experience."
"Do not seek for information of which you cannot make use."
"Information is not a substance or concrete entity but rather a relationship between sets or ensembles of structured variety."
"... we simply try to give you all of the information about our businesses, in a large general way, that Charlie and I would want if our positions were reversed. ... The facts are out about what we do."
"If you torture the data enough, nature will always confess."
"Wisdom is dead. Long live information."
"We don't know a millionth of one percent about anything."
"The dark ages still reign over all humanity, and the depth and persistence of this domination are only now becoming clear. This Dark Ages prison has no steel bars, chains, or locks. Instead, it is locked by misorientation and built of misinformation. Caught up in a plethora of conditioned reflexes and driven by the human ego, both warden and prisoner attempt meagerly to compete with God. All are intractably skeptical of what they do not understand. We are powerfully imprisoned in these Dark Ages simply by the terms in which we have been conditioned to think."
"people cannot take action if they don't have accurate information."
"Information is not lost in black holes, but it is not returned in a useful way. It is like burning an encyclopaedia. Information is not lost, but it is very hard to read."
"The functionaries of every government have propensities to command at will the liberty and property of their constituents. There is no safe deposit for these but with the people themselves; nor can they be safe with them without information."
"Knowledge is of two kinds. We know a subject ourselves, or we know where we can find information upon it."
"It used to be said that information is power. As Arthur Sulzberger Jr., chairman of the board of the New York Times Co., rightly says, "Information is now ubiquitous. Power is understanding.""
"There's no going back, and there's no hiding the information. So let everyone have it."
"When action grows unprofitable, gather information; when information grows unprofitable, sleep."
"You've seen how information holds system together and how delayed, biased, scattered, or missing information can make feedback loops malfunction. Decision-makers can respond to information they don't have, can't respond accurately to information that is inaccurate, and can't respond in a timely way to information that is late. I would guess that most of what goes wrong in systems goes wrong because of biased, late, or missing information. [...] Information is power."
"As I understand the theory of period information doubling, this states that if we take one period of human information as being the time between the invention of the first hand axe, say around 50,000 BC and 1 AD, then this is one period of human information and we can measure it by how many human inventions we came up during that time. Then we see how long it takes for us to have twice as many inventions. This means that human information has doubled. As it turns out, after the first 50,000-year period, the second period is about 1500 years, say around the time of the Renaissance. By then we have twice as much information. To double again, human information took a couple of hundred years. The period speeds up—between 1960 and 1970, human information doubled. As I understand it, at the last count human information was doubling around every 18 months. Further to this, there is a point sometime around 2015 where human information is doubling every thousandth of a second. This means that in each thousandth of a second we will have accumulated more information than we have in the entire previous history of the world. At this point I believe that all bets are off. I cannot imagine the kind of culture that might exist after such a flashpoint of knowledge. I believe that our culture would probably move into a completely different state, would move past the boiling point, from a fluid culture to a culture of steam."
"Information smacks of safe neutrality; it is simple, helpful heaping of unassailable facts. In that innocent guise, it's the perfect starting point for a technocratic political agenda that wants as little exposure for its objectives as possible. After all, what can anyone say against information?"
"What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it."
"Information exists. It does not need to be perceived to exist. It does not need to be understood to exist. It requires no intelligence to interpret it. It does not have to have meaning to exist. It exists."
"You don’t hide information by destroying it. You hide it by swamping it with bad information."
"Data, seeming facts, apparent associations-these are not certain knowledge of something. They may be puzzles that can one day be explained; they may be trivia that need not be explained at all."
"Information is a name for the content of what is exchanged with the outer world as we adjust to it, and make our adjustment felt upon it. The process of receiving and of using information is the process of our adjusting to the contingencies of the outer environment, and of our living effectively within that environment. The needs and the complexity of modern life make greater demands on this process of information than ever before, and our press, our museums, our scientific laboratories, our universities, our libraries and textbooks, are obliged to meet the needs of this process or fail in their purpose. To live effectively is to live with adequate information. Thus, communication and control belong to the essence of man's inner life, even as they belong to his life in society."
"Private information is practically the source of every large modern fortune."
"During the next 10 years, millions of programmers and users will utilise this system."
"In the summer of 1988, I received an interesting call from Bill Gates at Microsoft. He asked whether I'd like to come over and talk about building a new operating system at Microsoft for personal computers. [...] What Bill had to offer was the opportunity to build another operating system, one that was portable [...]."
"If consumers even know there's a DRM, what it is, and how it works, we've already failed."
"Most people, I think, don't even know what a rootkit is, so why should they care about it?"
"The technology in question is an example of Digital Restrictions Management (DRM) -- technology designed to restrict the public. Describing it as "copyright protection" puts a favorable spin on a mechanism intended to deny the public the exercise of those rights which copyright law has not yet denied them."
"There is a cost associated with DRM, and that is lost sales of content."
"DRM's primary role is not about keeping copyrighted content off P2P networks. DRMs support an orderly market for facilitating efficient economic transactions between content producers and content consumers."
"DRM is used to lock consumers to proprietary technology. It is used to control supply and push higher prices. It is used to undermine practices we have long defined as fair use so they can be shifted to fee-based."
"Now, we need to understand that listening to music on your computer is an extra privilege. Normally people listen to music on their car or through their home stereos [...] If you are a Linux or Mac user, you should consider purchasing a regular CD player."
"To recap: we decided to end the Google Video download to own/rent (DTO/DTR) program, and are now refocusing our Google Video engineering efforts. The week before last, we wrote to Google Video DTO/DTR program customers to let them know that videos they'd already bought would no longer be playable."
"The key issue here is that the protection scheme under Blu-ray is very anti-consumer and there's not much visibility of that. The inconvenience is that the movie studios got too much protection at the expense of consumers and it won't work well on PCs. You won't be able to play movies and do software in a flexible way. It's not the physical format that we have the issue with, it's that the protection scheme on Blu is very anti-consumer."
"It's a polite fiction... Lawyers and technologists continue to sell this snake oil of control, whether it's from the court and the police [RIAA legal jihad], or whether it's coming from technology [DRM]... When I was 14, I told girls I loved them to sleep with them too. It was a fiction. Steve Jobs just leaves a little money on the table. We see Jobs and Gates making promises to the content industry that they have no intention of keeping. It's the promise you make to move forward. The content owner wants to hear it. If we're honest we'd say to the content owners "we're not going to succeed from what we can tell..." But we don't say it. We'll say what we need to say to get it... Once you reach the realization that it isn't going to solve our problems, then you begin to embrace the alternatives."
"We said [to the record companies]: None of this technology that you're talking about's gonna work. We have Ph.D.'s here, that know the stuff cold, and we don't believe it's possible to protect digital content."
"It's baffling to me that the content industries don't look at the experience of the software industry in the 80's, when copy protection on software was widely tried, and just as widely rejected by consumers."
"We conclude that given the current and foreseeable state of technology the content protection features of DRM are not effective at combating piracy."
"The development of the Internet has... created significant challenges to any distribution model which depends on scarcity... The financial and skill barriers to making content available globally have simply fallen away... The application of technology to this problem, if it is to be effective, must therefore in some way reestablish a point of scarcity on behalf of the rights holder. However, this raises a fundamental paradox, [which is] the business of publishers lies in providing access rather than in preventing it... Nevertheless, unless copyright is to be abandoned as a mechanism for trading in intellectual property entirely, it will be essential to find an answer to this paradox... They [rights holders] also recognized that those approaches would be ineffective unless the law itself provided enhanced protection for those processes and systems."
"My personal opinion (not speaking for IBM) is that DRM is stupid, because it can never be effective, and it takes away existing rights of the consumer."
"We see no technical impediments to the darknet becoming increasingly efficient... We believe it probable that there will be a few more rounds of technical innovations... Finally, consumers themselves are likely to rebel against "footing the bill" for these ineffective content protection systems... increased security (e.g. stronger DRM systems) may act as a disincentive to legal commerce... In short, if you are competing with the darknet, you must compete on the darknet's own terms: that is convenience and low cost rather than additional security."
"Digital files cannot be made uncopyable, any more than water can be made not wet."
"In my opinion, content protection and rights management exist only as vestigial efforts to preserve existing models of content sales for as long as the bulk of the consumer market remains clueless. History has shown every content-protection scheme invented for consumer-grade goods to have almost no impact on piracy, and little impact on casual copying, except when it has doomed the technology carrying it. This is inevitable."
"Why should self-interested companies be permitted to shift the balance of fundamental liberties, risking free expression, free markets, scientific progress, consumer rights, societal stability, and the end of physical and informational want? Because somebody might be able to steal a song? That seems a rather flimsy excuse."
"By treating Napster as the copyright antichrist, the industry is simply insuring that the vector of Internet technological development will move rapidly toward a lawsuit-proof, free-for-all distributed network of file-sharing -- the very outcome the owners of intellectual property wish to avoid. How stupid can you get? … The good news is that the brain-dead, colossally wasteful, artistically homogenizing old order of the recording industry is committing collective, time-delayed suicide in court."
"Trusted systems presume that the consumer is dishonest."
"The answer to the machine is in the machine."
"We also concluded that any single-machine locks and keys, or special time-out and self-destruct programs, would be onerous to our best customers and not effective against clever thieves. Because we could not devise practical physical security measures, we had to rely on the inherent honesty of our customers."
"How dare we speak of the laws of chance? Is not chance the antithesis of all law?"
"He who believes this (atomism) may as well believe that if a great quantity of the one-and-twenty letters, composed either of gold or any other matter, were thrown upon the ground, they would fall into such order as legibly to form the Annals of Ennius. I doubt whether fortune could make a single verse of them."
"The generation of random numbers is too important to be left to chance."
"The theory says a lot, but does not really bring us any closer to the secret of the 'old one'. I, at any rate, am convinced that He does not throw dice."
"Events may appear to us to be random, but this could be attributed to human ignorance about the details of the processes involved."
"Perhaps randomness is not merely an adequate description for complex causes that we cannot specify. Perhaps the world really works this way, and many events are uncaused in any conventional sense of the word."
"Randomness is a very, very subtle concept with its study properly belonging to statisticians more than mathematicians."
"Consideration of black holes suggests, not only that God does play dice, but that he sometimes confuses us by throwing them where they can't be seen."
"Fretting about a dearth of randomness seems like worrying that humanity might use up its last reserves of ignorance."
"The fact that randomness requires a physical rather than a mathematical source is noted by almost everyone who writes on the subject, and yet the oddity of this situation is not much remarked."
"Random chance was not a sufficient explanation of the Universe — in fact, random chance was not sufficient to explain random chance."
"Random numbers should not be generated with a method chosen at random."
"The sun comes up just about as often as it goes down, in the long run, but this doesn't make its motion random."
"A random sequence is a vague notion in which each term is unpredictable to the uninitiated and whose digits pass a certain number of tests traditional with statisticians and depending somewhat on the uses to which the sequence is to be put."
"When a rule is very complex, that which conforms to it passes for irregular." ("Quand une règle est fort composée, ce qui luy est conforme, passe pour irrégulier.")
"For I do not believe that it is through the interference of Divine Providence … that the spittle of a certain person moved, fell on a certain gnat in a certain place, and killed it."
"Existence is random. Has no pattern save what we imagine after staring at it for too long. No meaning save what we choose to impose. This rudderless world is not shaped by vague metaphysical forces. It is not God who kills the children. Not fate that butchers them or destiny that feeds them to the dogs. It’s us. Only us."
"Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin."
"One of us recalls producing a 'random' plot with only 11 planes, and being told by his computer center's programming consultant that he had misused the random number generator: 'We guarantee that each number is random individually, but we don’t guarantee that more than one of them is random.' Figure that out."
"The definition of random in terms of a physical operation is notoriously without effect on the mathematical operations of statistical theory because so far as these mathematical operations are concerned random is purely and simply an undefined term."
"While in theory randomness is an intrinsic property, in practice, randomness is incomplete information."
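Von Neumann's quip above about "arithmetical methods of producing random digits" can be illustrated with a short sketch (a hypothetical example, not drawn from any of the quoted sources): a linear congruential generator using the widely cited Numerical Recipes constants is pure arithmetic, so the same seed always reproduces the same "random" sequence.

```python
# A minimal linear congruential generator (LCG) -- exactly the kind of
# "arithmetical method" von Neumann warned about. The multiplier and
# increment are the classic Numerical Recipes constants. Because the
# recurrence is deterministic, a fixed seed replays the identical sequence.

def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Return n pseudorandom integers in [0, m) from the given seed."""
    x = seed
    out = []
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x)
    return out

# Two runs from the same seed agree term for term -- "random" only
# to an observer who does not know the seed and the recurrence:
assert lcg(42, 5) == lcg(42, 5)
```

The determinism is the whole point: such a stream can pass statistical tests while containing no randomness at all, which is why the quotes above insist that true randomness needs a physical, not a mathematical, source.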
"Anything that can be automatically done for you can be automatically done to you."
"Besides black art, there is nothing left in the United States but automation and mechanization."
"As the bourgeoisie, by means of its capital, completely monopolizes all new inventions, every new machine, instead of shortening the hours of labor and enhancing the prosperity and happiness of all, causes, on the contrary, dismissal from employment for some, reduction of wages for others and an increased and intensified state of misery for the entire proletariat."
"Where you have machines, then you get certain kinds of problems; where you get certain kinds of problems, then you find a heart warped by these problems. Where you get a heart warped, its purity and simplicity are disturbed. When purity and simplicity are disturbed, then the spirit is alarmed and an alarmed spirit is no place for the Tao to dwell. It isn't that I don't know of these machines, but I would be ashamed to use one."
"If every instrument could accomplish its own work, obeying or anticipating the will of others, [...]; if, in like manner, the shuttle would weave and the plectrum touch the lyre without a hand to guide them, chief workmen would not want servants, nor masters slaves."
"AI and robots will replace all jobs."
"(About Amazon's plan to automate 75% of its operations) It’s insane to think that a human will pack and ship boxes in ten years — it’s game over folks."
"In the short term, AI will destroy a lot of jobs. In the long term, like every other technological revolution, I assume we will figure out completely new things to do."
"If you eventually get a society where you only have to work three days a week, that’s probably OK. If you free up human labor, you can help elder people better, have smaller class sizes – you know, the demand for labor to do good things is still there. And then if you ever get beyond that, you have a lot of leisure time and you’ll have to figure out what to do with it."
"There’s going to be more free time. If you believe the premise that humans are social animals, they’re going to have to do something. They can’t just sit at home, so they’ll go to music, they’ll go to sports and they’ll go to my live event."
"Requests for specialized facilities and minor notational improvements are very common. When provided, they rarely fail to win applause. After all, if a feature is a direct solution to a problem and doesn’t significantly interact with other facilities, it is easy to explain, often easy to implement, and typically has a logically minimal expression leading to very concise pieces of code. People comparing languages using lists love such features. The snag is that the problems we face are essentially infinite, so we need an infinity of such specialized features."
"The term "creeping featurism" was used in a 1976 Programmer's Workbench paper written by John Mashey, and in a talk he first gave in 1977 and later delivered (as an ACM National Lecture) some 50-70 times through 1982. The original foils were scanned in 2002, and the phrase is used on Slide 033 within the talk. Mashey says "I can't recall if I actually coined this myself or heard it somewhere, but in any case, the phrase was certainly in public use by 1976.""
"A solid working knowledge of productivity software and other IT tools has become a basic foundation for success in virtually any career. Beyond that, however, I don't think you can overemphasise the importance of having a good background in maths and science."
"One of the difficulties in thinking about software is its huge variety. A function definition in a spreadsheet cell is software. A smartphone app is software. The flight management system for an Airbus A380 is software. A word processor is software. We shouldn't expect a single discipline of software engineering to cover all of these, any more than we expect a single discipline of manufacturing to cover everything from the Airbus A380 to the production of chocolate bars, or a single discipline of social organization to cover everything from the United Nations to a kindergarten. Improvement in software engineering must come bottom-up, from intense specialized attention to particular products."
"How can it be that we have so much software that is reliable enough for us to use it? The answer is simple; programming is a trial and error craft. People write programs without any expectation that they will be right the first time. They spend at least as much time testing them and correcting errors as they spent writing the initial program. Large concerns have separate groups of testers to do quality assurance. Programmers cannot be trusted to test their own programs adequately. Software is released for use, not when it is known to be correct, but when the rate of discovering new errors slows down to one that management considers acceptable. Users learn to expect errors and are often told how to avoid the bugs until the program is improved."
"You can't physically touch software. You can hold a floppy disk or CD-ROM in your hand, but the software itself is a ghost that can be moved from one object to another with little difficulty. In contrast, a road is a solid object that has a definite size and shape. You can touch the material and walk the route... Software is a codification of a huge set of behaviors: if this occurs, then that should happen, and so on. We can visualize individual behaviors, but we have great difficulty visualizing large numbers of sequential and alternative behaviors... The same things that make it hard to visualize software make it hard to draw blueprints of that software. A road plan can show the exact location, elevation, and dimensions of any part of the structure. The map corresponds to the structure, but it's not the same as the structure. Software, on the other hand, is just a codification of the behaviors that the programmers and users want to take place. The map is the same as the structure... This means that software can only be described accurately at the level of individual instructions... A map or a blueprint for a piece of software must greatly simplify the representation in order to be comprehensible. But by doing so, it becomes inaccurate and ultimately incorrect. This is an important realization: any architecture, design, or diagram we create for software is essentially inadequate. If we represent every detail, then we're merely duplicating the software in another form, and we're wasting our time and effort."
"When done well, software is invisible."
"Software is like sex; it's better when it's free."
"Me, I just don't care about proprietary software. It's not "evil" or "immoral," it just doesn't matter. I think that Open Source can do better, and I'm willing to put my money where my mouth is by working on Open Source, but it's not a crusade – it's just a superior way of working together and generating code."
"I do think we could do a better job of anticipating the software needs of new projects, but it is also important to understand that a lot of needs are not readily apparent a priori. Sometimes we have to try to do things for awhile before we really have an understanding of where the problem points are."
"Software gets slower faster than hardware gets faster."
"For me, abstraction is real, probably more real than nature. I'll go further and say that abstraction is nearer my heart. I prefer to see with closed eyes."
"Abstract terms (however useful they may be in argument) should be discarded in meditation, and the mind should be fixed on the particular and the concrete, that is, on the things themselves."
"An abstraction is one thing that represents several real things equally well."
"A major danger in using highly abstractive methods in political philosophy is that one will succeed merely in generalizing one’s own local prejudices and repackaging them as demands of reason. The study of history can help to counteract this natural human bias."
"The reliance on the abstract is thus not a result of an over-estimation but rather of an insight into the limited powers of our reason. It is the over-estimation of the powers of reason which leads to the revolt against the submission to abstract rules. Constructivist rationalism rejects the demand for this discipline of reason because it deceives itself that reason can directly master all the particulars; and it is thereby led to a preference for the concrete over the abstract, the particular over the general, because its adherents do not realize how much they thereby limit the span of true control by reason. The hubris of reason manifests itself in those who believe that they can dispense with abstraction and achieve a full mastery of the concrete and thus positively master the social process."
"The abstraction is often the most definite form for the intangible thing in myself that I can only clarify in paint."
"Words are a medium that reduces reality to abstraction for transmission to our reason, and in their power to corrode reality inevitably lurks the danger that the words will be corroded too."
"Denotation by means of sounds and markings is a remarkable abstraction. Three letters designate God for me; several lines a million things. How easy becomes the manipulation of the universe here, how evident the concentration of the intellectual world! Language is the dynamics of the spiritual realm. One word of command moves armies; the word Liberty entire nations."
"Everything abstract is ultimately part of the concrete. Everything inanimate finally serves the living. That is why every activity dealing in abstraction stands in ultimate service to a living whole."
"I discovered the works of Euler and my perception of the nature of mathematics underwent a dramatic transformation. I was de-Bourbakized, stopped believing in sets, and was expelled from the Cantorian paradise. I still believe in abstraction, but now I know that one ends with abstraction, not starts with it. I learned that one has to adapt abstractions to reality and not the other way around. Mathematics stopped being a science of theories but reappeared to me as a science of numbers and shapes."
"Although many popular information systems planning methodologies, design approaches, and various tools and techniques do not preclude or are not inconsistent with enterprise-level analysis, few of them explicitly address or attempt to define enterprise architectures."
"The state of the art in software design is the "enterprise architecture", where separate software components implement data processing (or other application specific-tasks), data storage, and user interface functionality. This approach enables, for example, the replacement of a database engine without changing the software components that process the data and those that support the interaction with the user."
"With increasing size and complexity of the implementations of information systems, it is necessary to use some logical construct (or architecture) for defining and controlling the interfaces and the integration of all of the components of the system."
"Architecture discussions frequently focus on technology issues. This paper takes a broader view, and describes the need for an "enterprise architecture" that includes an emphasis on business and information requirements. These higher level issues impact data and technology architectures and decisions... There is not a single correct way to develop an architecture or implement standards for every enterprise; they must be customized to the environment."
"Architecture is defined as a clear representation of a conceptual framework of components and their relationships at a point in time... a discussion of architecture must take into account different levels of architecture. These levels can be illustrated by a pyramid, with the business unit at the top and the delivery system at the base. An enterprise is composed of one or more Business Units that are responsible for a specific business area. The five levels of architecture are Business Unit, Information, Information System, Data and Delivery System. The levels are separate yet interrelated... The idea of an enterprise architecture reflects an awareness that the levels are logically connected and that a depiction at one level assumes or dictates the architectures at the higher levels."
"DOD will create a Department-wide blueprint (enterprise architecture) that will prescribe how the Department's financial and non-financial feeder systems and business processes interact. This architecture will guide the development of enterprise-level processes and systems throughout DOD."
"Traditional manufacturers tend to view human resources contained within highly segmented functions whereas integrated manufacturers view the entire organization as a series of resource centers. Integrated manufacturers allow and promote the sharing of critical skills on an as needed basis throughout the enterprise while Traditional manufacturers fall victim to resource hoarding. Given the breadth of change that new manufacturing technologies and operations philosophies will bring to many organizations, an assessment should be made as to how ready the organization and the various business functions are to accommodate these technology and non-technology changes. The recommended approach of course is to begin with corporate vision, objectives and strategies leading to a determination of the overall Enterprise Architecture in the areas of People, Management Practices, Support Structures and Corporate Cultures. This would ensure that all of the complementary shifts in the components of the Enterprise Architecture would be orchestrated under an overall plan of enterprise wide change."
"Automatic control in manufacturing requires a very rigorous and well defined model of the operation to be controlled. A model which on one hand reflects reality as closely as possible and on the other hand allows easy and fast modifications and updates to reflect the continuous changes in the real world. CIM-OSA provides the modelling concepts to describe and maintain the complete manufacturing systems. In addition, CIM-OSA aims at providing consistent engineering and operational environments to model for execution as well as for real time operation control. Keywords: CIM; CIM architecture; enterprise architecture; enterprise modelling; enterprise optimisation; enterprise control; real time control;"
"Enterprise architectures are required to support and maximize the efforts of virtual teams within decentralized organizations. Vendor products exist today to start evolving towards a standards-based multi-vendor architecture. The underlying networking technology, 802.3/Ethernet, is robust and will provide for a cost-effective investment that will last for many years to come. Complementary LAN technologies are already available to ensure transparent growth of networked systems. Combining human resources with information technology will be the key differentiating factor for successful manufacturing enterprises in the 1990s."
"The Enterprise Architecture is a combination of the Business and Computing architectures. The Computing Architecture, at the least, identifies hardware, software and data communications."
"When one enterprise architecture dominates, Grumman plans to use it as a "manager of managers." This single, enterprise-wide network-management architecture will run over all other network-management systems being used: no existing scheme will be scrapped."
"For the IT department [the] change towards commodity products, open standards, end-user decision making and fundamental change in the Business platform, will imply substantial challenges. The role of the IS function will change. The formal IS budget in the US, according to Gartner Group Inc, is 2.4% of revenue in 1990, and will grow to 2.7% in 1995. End-user IT spending is estimated to 2.4% in 1990, split 50/50 between budget and unseen expenses. This item is assumed to increase to 5.0% in 1995, bringing the total to 7.7%. This total amount must be managed, and the rules must be set by the IS manager. Following the rule of "Least resistance" will lead to crisis and complete loss of control. IT resources must be managed. An Enterprise Architecture must be established and adhered to. Standards must be established, and partnerships between IT professionals and end-users formed."
"A key ingredient to an enterprise architecture is the ability to link multiple and disparate systems into a coherent whole. has been gradually putting together the technology that will enable it to offer enterprise LANs that are capable of supporting distributed applications running across a variety of computers. Internetworking and multi- routing are essential building blocks."
"The creation and implementation of integrated information systems involves a variety of collaborators including people from specialist departments, informatics, external advisers and manufacturers. They need clear rules and limits within which they can process their individual sub-tasks, in order to ensure the logical consistency of the entire project. Therefore, an architecture needs to be established to determine the components that make up the information system and the methods to be used to describe it. The ARIS architecture developed in this book is described in concrete terms as an information model within the entity-relationship approach. This information model provides the basis for the systematic and rational application of methods in the development of information systems. It also serves as the basis for a repository in which the enterprise's application-specific data, organization and function models can be stored. The ARIS architecture constitutes a framework in which integrated applications systems can be developed, optimized and converted into EDP-technical implementations. At the same time, it demonstrates how business economics can examine and analyze information systems in order to translate their contents into EDP-suitable form."
"Enterprise Architecture Planning (EAP) results in a high-level blueprint of data, applications, and technology that is a cost-effective, long-term solution; not a quick fix. Management participation provides a business perspective, credibility, and de-mystifies the systems planning process. EAP can be labeled as business-driven or data-driven because"
"This is an intermediate work that describes PERA, which is a general enterprise reference architecture model that is suited for manufacturing enterprises. The book explains PERA as a layered architecture, and within the context of an architectural life cycle. PERA at the conceptual level is focused on the integration of physical plant, human resources, and information systems. This approach is clearly suited to manufacturing, and in fact, the model has been adopted and adapted by Fluor Daniel Corporation (among others) and has been proven in practice..."
"The integration technology and infrastructure elements available today, in 1993, would enable an enterprise to develop a significant integration infrastructure. However, integration projects are constrained by cultural inertia, financial and resource limitations, and, significantly, risk management. Thus, projects and their supporting integration infrastructures tend to be deployed in an incremental and evolutionary manner. Since each enterprise chooses its integration path based on particular business needs, the corporations visited in this study each presented a different road map of integration efforts to date and a unique snapshot of current integration infrastructure.... DoD, in concert with leading companies, should formulate an R&D strategy to create a new generation of enterprise architectures, models, tools, and software systems, and to determine the potential for new business operations, engineering practices, and manufacturing concepts. To achieve potential functional and performance improvements, integrators should combine the leverage of several emerging threshold technologies, such as operational integration frameworks, object-based and knowledge-based product and process representations, application-oriented network services, near-term and mid-term solutions to database integration, and wide-area object brokerage and execution."
"A company's enterprise architecture is unique — neither good nor bad, but only appropriate or inappropriate in regard to management's vision of the future. The current enterprise architecture either supports the vision or it does not."
"An enterprise architecture can be thought of as a "blueprint" or "picture" which assists in the design of an enterprise. The enterprise architecture must define three things. First, what are the activities that an enterprise performs? Second, how should these activities be performed? And finally, how should the enterprise be constructed? Consequently, the architecture being developed will identify the essential processes performed by a virtual company, how the virtual company and the agile enterprises involved in the virtual company will perform these processes, and include a methodology for the rapid reconfiguration of the virtual enterprise."
"An enterprise architecture is an abstract summary of some organizational component's design. The organizational strategy is the basis for deciding where the organization wants to be in three to five years. When matched to the organizational strategy, the architectures provide the foundation for deciding priorities for implementing the strategy."
"It is within the purview of each context to define its own rules and techniques for deciding how the object-oriented mechanisms and principles are to be managed. And while the manager of a large information system might wish to impose some rules based on philosophical grounds, from the perspective of enterprise architecture, there is no reason to make decisions at this level. Each context should define its own objectivity."
"Although the concept of an enterprise architecture (EA) has not been well defined and agreed upon, EAs are being developed to support information system development and enterprise reengineering. Most EAs differ in content and nature, and most are incomplete because they represent only data and process aspects of the enterprise. This paper defines an EA... An EA is a conceptual framework that describes how an enterprise is constructed by defining its primary components and the relationships among these components."
"Enterprise computing and open systems - what are the distinctions between the two? Open systems are oriented towards an environment where most or all of the computing technology that comprises that environment is based upon standards regardless of the scope of the environment - departmental or organization-wide. Enterprise computing, by contrast, encompasses not only open system concepts but, by virtue of existing environments that must be incorporated as well, a great deal of proprietary interfaces and interoperability mechanisms. In the early 1990s, as both movements were beginning to gain momentum, there was some degree of overlap between open systems and enterprise computing, the amount of which was hindered somewhat by the stage at which enterprise architectures and standards were. It was anticipated that over time, as the enterprise architectures, open standards, and products built on one or both evolved and matured, the gap between the two would narrow and a greater degree of overlap would occur. As it turns out, the two movements have converged..."
"The Enterprise Project is collaborative work between AIAI at the University of Edinburgh, IBM UK, Lloyd's Register of Shipping, Logica and Unilever. The project is establishing a generic framework within which enterprise tools can be used to assist users in their tasks. It is based on an Enterprise ontology which establishes shared terminology for communication between users and tools... The core of the tool set will support user tasks via a workflow engine which will assist the user in performing a task, allow access to appropriate tools and methods, and make available suitable information resources. An abstraction of this central work ow within the tool set is provided in this paper. This acts to provide a framework for describing the various components integrated within the tool set and allowing them to be provided in a modular fashion."
"An enterprise architecture is a snapshot of how an enterprise operates while performing its business processes. The recognition of the need for integration at all levels of an organisation points to a multi-dimensional framework that links both the business processes and the data requirements. Such a framework is provided by the Information Systems Architecture (ISA) developed by John Zachman."
"The presence of an enterprise reference architecture aids an enterprise in its ability to understand its structure and processes. Similar to a computer architecture, the enterprise architecture is comprised of several views. The enterprise architecture should provide activity, organizational, business rule (information), resource, and process views of an organization."
"There is no such thing as a standard enterprise architecture. Enterprise design is as unique as a human fingerprint, because enterprises differ in how they function. Adopting an enterprise architecture is therefore one of the most urgent tasks for top executive management. Fundamentally, an information framework is a political doctrine for specifying as to who will have what information to make timely decisions... Enterprise architecture [is] the Holy Grail of all systems people. Advanced systems textbooks tell you that every organization must have one. Several CIM program directors attempted to come up with this abstraction, only to fail. Only someone with a depth of understanding about how the Pentagon really works could come up with anything of use."
"Most enterprise architectures are obsolete," says Martin, and "most end-to-end processes are clumsy, slow, expensive, and even harmful; they need to be replaced with routines that are fast and focus on the needs of the customer."
"The is about those methods, models and tools which are needed to build the integrated enterprise. The architecture is generic because it applies to most, potentially all types of enterprise. The coverage of the framework spans Products, Enterprises, Enterprise Integration and Strategic Enterprise Management, with the emphasis being on the middle two. The proposal for the architecture follows the architecture itself improving the quality of the presentation and of the outcome. Definitions of Generic Enterprise Reference Architecture, Enterprise Engineering/ Integration Methodology, Enterprise Modelling Languages, Enterprise Models, and Enterprise Modules are given. It is proposed how the above could be developed on the basis of previously analysed architectures (and other results too), such as the , the GRAI Integrated Methodology, , and TOVIE."
"The term "information technology architecture," with respect to an executive agency, means an integrated framework for evolving or maintaining existing information technology and acquiring new information technology to achieve the agency’s strategic goals and information resources management goals."
"One could then consider the enterprise-reference architecture to be a meta model of the enterprise representation. The enterprise-architecture is a component of this meta model."
"In the early '80's, there was little interest in the idea of Enterprise Reengineering or Enterprise Modeling and the use of formalisms and models was generally limited to some aspects of application development within the Information Systems community. The subject of "architecture" was acknowledged at that time, however, there was little definition to support the concept. This lack of definition precipitated the initial investigation that ultimately resulted in the "Framework for Information Systems Architecture." Although from the outset, it was clear that it should have been referred to as a "Framework for Enterprise Architecture," that enlarged perspective could only now begin to be generally understood as a result of the relatively recent and increased, world-wide focus on Enterprise "engineering." The Framework as it applies to Enterprises is simply a logical structure for classifying and organizing the descriptive representations of an Enterprise that are significant to the management of the Enterprise as well as to the development of the Enterprise’s systems. It was derived from analogous structures that are found in the older disciplines of Architecture/Construction and Engineering/Manufacturing that classify and organize the design artifacts created over the process of designing and producing complex physical products (e.g. buildings or airplanes.)"
"The Enterprise Architecture is the explicit description of the current and desired relationships among business and management process and information technology. It describes the "target" situation which the agency wishes to create and maintain by managing its IT portfolio. The documentation of the Enterprise Architecture should include a discussion of principles and goals. For example, the agency's overall management environment, including the balance between centralization and decentralization and the pace of change within the agency, should be clearly understood when developing the Enterprise Architecture. Within that environment, principles and goals set direction on such issues as the promotion of interoperability, open systems, public access, end-user satisfaction, and security."
"Architecture is that set of design artifacts, or descriptive representations, that are relevant for describing an object, such that it can be produced to requirements (quality) as well as maintained over the period of its useful life (change)."
"A conceptual framework that links the Departmental and Programmatic missions, goals, and objectives, and provides a mapping of the current and future DOE business information required to support them."
"Establishing an enterprise architecture is like reengineering an aircraft in flight."
"CIMOSA provides a Reference Architecture (known as the CIMOSA cube) from which particular enterprise architectures can be derived. This Reference Architecture and the associated enterprise modelling framework are based on a set of modelling constructs, or generic building blocks, which altogether form the CIMOSA modelling languages."
"GERAM (The Generalized Enterprise Reference Architecture and Methodology) is a class of enterprise architectures and their associated methodologies as developed by the IFAC/IFIP Task Force on Architectures for Enterprise Integration in their work during the period 1990-1996."
"Generically, an architecture is the description of the set of components and the relationships between them. Simple enough. The trouble starts when you tack on an adjective: There are software architectures, hardware architectures, network architectures, system architectures, and enterprise architectures. People have their own preconceived notions and experiences about “architecture.” A software architecture describes the layout of the software modules and the connections and relationships among them. A hardware architecture can describe how the hardware components are organized. However, both these definitions can apply to a single computer, a single information system, or a family of information systems. Thus “architecture” can have a range of meanings, goals, and abstraction levels, depending on who’s speaking. An information system architecture typically encompasses an overview of the entire information system—including the software, hardware, and information architectures (the structure of the data that systems will use). In this sense, the information system architecture is a meta-architecture. An enterprise architecture is also a meta-architecture in that it comprises many information systems and their relationships (technical infrastructure). However, because it can also contain other views of an enterprise—including work, function, and information—it is at the highest level in the architecture pyramid. It is important to begin any architecture development effort with a clear definition of what you mean by “architecture.”"
"This book... provides a formal notational system for drawing and maintaining IT architectures, which I call the Enterprise Information Technology Architecture Blueprinting (EAB for short). This methodology addresses the features required of any formal notational system... In short, EAB defines a communications system that allows a community of IT professionals to visualize architectures in a standard manner."
"Similar to a computer architecture, the enterprise architecture is comprised of several views, including activity view, organizational view, business rule (information) view, resource view, and process view. These views should be cross-referenced with each other to provide an integrated picture of the enterprise."
"Enterprise architecture is a family of related architecture components. These include information architecture, organization and business process architecture, and information technology architecture. Each consists of architectural representations, definitions of architecture entities, their relationships, and specification of function and purpose. Enterprise architecture guides the construction and development of business organizations and business processes, and the construction and development of supporting information systems. Enterprise architecture is a holistic representation of all the components of the enterprise, and graphics and schemes are used to emphasize all parts of the enterprise and how they are interrelated... Enterprise architectures are used to deal with intra-organizational processes, interorganizational cooperation and coordination, and their shared use of information and information technologies. Business developments, such as outsourcing, partnership, alliances and Electronic Data Interchange, extend the need for architecture across company boundaries."
"Architecture : The fundamental organization of a system embodied in its components, their relationships to each other, and to the environment, and the principles guiding its design and evolution."
"The use of Enterprise Architectures is becoming increasingly widespread in the private sector. Borrowing insights from enterprise reference architectures developed during the last decade, IT vendors and companies belonging to specific industries are establishing reference data and process models advancing the standardisation of their businesses and creating a more integrated environment for their activities. Although public administrations share the same problem of non-standardisation, which is being magnified rapidly in a changing and demanding environment, little has been done so far in the direction of integration..."
"The concept of EAs dates back to the mid-1980s. At that time, John Zachman, widely recognized as a leader in the field, identified the need to use a logical construction blueprint (i.e., an architecture) for defining and controlling the integration of systems and their components. Accordingly, Zachman developed a “framework” or structure for logically defining and capturing an architecture. Drawing parallels to the field of classical architecture, and, later, to the aircraft manufacturing industry, in which different work products (e.g., architect plans, contractor plans, shop plans, bills of lading) represent different views of the planned building or aircraft, respectively, Zachman’s framework identified the kind of work products needed to understand and thus build a given system or entity. In short, this framework provides six perspectives or windows from which to view how a given entity operates. The perspectives are those of the (1) strategic planner, (2) system user, (3) system designer, (4) system developer, (5) subcontractor, and (6) system itself. Associated with each of these perspectives, Zachman also proposed six abstractions of the entity, or models covering (1) how the entity operates, (2) what the entity uses to operate, (3) where the entity operates, (4) who operates the entity, (5) when entity operations occur, and (6) why the entity operates. Zachman’s framework provides a way to identify and describe an entity’s existing and planned component parts and the parts’ relationships before the costly and time-consuming efforts associated with developing or transforming the entity begin."
"Since the late 1980s, architecture frameworks have emerged within the federal government, beginning with the publication of the National Institute of Standards and Technology framework in 1989. Subsequently, we issued EA guidance, and our research of successful public and private sector organizations’ IT management practices identified the use of EAs as a factor critical to these organizations’ success. Since that time, other federal entities have issued EA frameworks, including the Department of Defense, Department of the Treasury, and the federal CIO Council. Although the various frameworks use different terminology and somewhat different structures, they are fundamentally consistent in purpose and content, and they are being used today to varying degrees by many federal agencies. The emergence of federal frameworks and guidance over the last 5 years owes largely to the Congress’s passage of the Clinger-Cohen Act in 1996. This act, among other things, requires the CIOs for major departments and agencies to develop, maintain, and facilitate the implementation of information technology architectures as a means of integrating business processes and agency goals with IT. In response to the act, OMB, in collaboration with us, issued guidance on the development and implementation of EAs..."
"An architecture framework is a tool which can be used for developing a broad range of different architectures [architecture descriptions]. It should describe a method for designing an information system in terms of a set of building blocks, and for showing how the building blocks fit together. It should contain a set of tools and provide a common vocabulary. It should also include a list of recommended standards and compliant products that can be used to implement the building blocks."
"Principal among the Strategic Systems Architectures is the so-called ‘Enterprise Architecture’. This is usually regarded as an ‘umbrella’ architecture that covers business, application, information and technical architectures too. This Best Practice Guide concentrates on the structure, construction and use of an Enterprise Architecture. An Enterprise Architecture is a dynamic and powerful tool that helps organisations understand their own structure and the way they work. It provides a ‘map’ of the enterprise and a ‘route planner’ for business and technology change. A well-constructed Enterprise Architecture provides a foundation for the ‘Agile’ business. Normally an EA takes the form of a comprehensive set of cohesive models that describe the structure and functions of an enterprise. An important use is in systematic IT planning and architecting, and in enhanced decision-making. The EA can be regarded as the ‘master architecture’ that contains all the subarchitectures for an enterprise. The individual models in an EA are arranged in a logical manner that provides an ever-increasing level of detail about the enterprise: its objectives and goals; its processes and organisation; its systems and data; the technology used and any other relevant spheres of interest."
"A well-defined enterprise architecture (EA) is a blueprint for institutional modernization and evolution that consists of models describing how an entity operates today and how it intends to operate in the future, along with a plan for how it intends to transition to this future state. Such architectures are essential tools whose effective development and use are recognized hallmarks of successful organizations."
"Enterprise Architecture is a complete expression of the enterprise; a master plan which "acts as a collaboration force" between aspects of business planning such as goals, visions, strategies and governance principles; aspects of business operations such as business terms, organization structures, processes and data; aspects of automation such as information systems and databases; and the enabling technological infrastructure of the business, such as computers, operating systems and networks"
"Enterprise Architecture is the discipline whose purpose is to align more effectively the strategies of enterprises together with their processes and their resources (business and IT). Enterprise architecture is complex because it involves different types of practitioners with different goals and practices. Enterprise Architecture can be seen as an art; it is largely based on experience but does not have strong theoretical foundations. As a consequence, it is difficult to teach, to apply, and to support with computer-aided tools."
"[T]he average company’s enterprise system - i.e. the overall system of IT-related entities - is today highly complex. Technically, large organizations possess hundreds or thousands of extensively interconnected and heterogeneous single IT systems performing tasks that vary from enterprise resource planning to real-time control and monitoring of industrial processes. Moreover, these systems store a wide variety of sometimes redundant data, and typically they are deployed on several different platforms... Organizationally, the enterprise system embraces business processes and business units using as well as maintaining and acquiring the IT systems. The interplay between the organization and the IT systems is further determined by, for instance, business goals, ownership and governance structures, strategies, individual system users, documentation, and cost. Lately, Enterprise Architecture (EA) has evolved with the mission to take a holistic approach to managing the above depicted enterprise system. The discipline’s presumption is that architectural models are the key to success in understanding and administrating enterprise systems. Compared to many other engineering disciplines, EA is quite immature in many respects. This thesis identifies... firstly, the lack of explicit purpose for architectural models... [A] company’s Chief Information Officer (CIO) should guide the rationale behind the development of EA models. In particular, distribution of IT-related information and knowledge throughout the organization is emphasized as an important concern uncared for. Secondly, the lack of architectural theory is recognized..."
"An enterprise architecture framework"
"The software architecture of a system or a family of systems has one of the most significant impacts on the quality of an organization's enterprise architecture. While the design of software systems concentrates on satisfying the functional requirements for a system, the design of the software architecture for systems concentrates on the nonfunctional or quality requirements for systems. These quality requirements are concerns at the enterprise level. The better an organization specifies and characterizes the software architecture for its systems, the better it can characterize and manage its enterprise architecture. By explicitly defining the systems software architectures, an organization will be better able to reflect the priorities and trade-offs that are important to the organization in the software that it builds."
"The concept of enterprise architecture emerged in the mid-1980s as a means for optimizing integration and interoperability across organizations. In the early 1990s, GAO research of successful public and private sector organizations led it to identify enterprise architecture as a critical success factor for agencies that are attempting to modernize their information technology (IT) environments. Since then, GAO has repeatedly identified the lack of an enterprise architecture as a key management weakness in major modernization programs at a number of federal agencies. It has also collaborated with the Office of Management and Budget (OMB) and the federal Chief Information Officers (CIO) Council to develop architecture guidance. In 2002, OMB began developing the Federal Enterprise Architecture (FEA), an initiative intended to guide and constrain federal agencies’ enterprise architectures and IT investments."
"During the mid-1980s, John Zachman... identified the need to use a logical construction blueprint (i.e., an architecture) for defining and controlling the integration of systems and their components... Since Zachman introduced his framework, a number of frameworks have emerged within the federal government, beginning with the publication of the National Institute of Standards and Technology (NIST) framework in 1989. Since that time, other federal entities have issued enterprise architecture frameworks, including the Department of Defense (DOD) and the Department of the Treasury. In September 1999, the federal CIO Council published the Federal Enterprise Architecture Framework, which was intended to provide federal agencies with a common construct for their architectures, thereby facilitating the coordination of common business processes, technology insertion, information flows, and system investments among federal agencies. The Federal Enterprise Architecture Framework describes an approach, including models and definitions, for developing and documenting architecture descriptions for multi-organizational functional segments of the federal government."
"Since the late 1980s, EA Management Frameworks have emerged within the federal government, beginning with the publication of the National Institute of Standards and Technology framework in 1989. In 1992, the GAO issued EA guidance entitled Strategic Information Planning: Framework for Designing and Developing System Architecture. This EA Management Framework was intended to:"
"In the 1970s and 1980s, business processes were redesigned on average once every seven years. This rate of change was easy for the IT department to follow. The time needed to alter the information systems that supported new or changed business processes stayed within acceptable limits. In the 1990s, the rate of change began to increase and information systems began to lag behind. In 2000, a manager succinctly remarked: “We can completely redesign our business processes every three months and subsequently our IT department needs a year to catch up with the supporting information systems.”"
"Enterprise ontology is a novel subject, and writing a book on this novel subject puts the author under the obligation to provide at least two kinds of explanation. One explanation regards the justification of presenting yet another point of view on enterprises. Why and how would enterprise ontology assist in coping with the current and future problems related to enterprises? The other explanation concerns the particular approach towards enterprise ontology that the author takes. Why would this approach be more appropriate and more effective than some other one? These are serious questions indeed, and anyone who takes the pain to study this book deserves satisfying answers. You will get the answers; however, not straight away. A first attempt is in this introductory chapter. Definite and fully satisfying answers can only emerge from a dedicated and thorough study of the book. The lasting reward of such a study is a novel and powerful insight into the essence of the operation of enterprises; by this we mean insight that is fully independent of the (current) realization and implementation."
"[T]his article presents the results of a survey in which Swedish CIOs have prioritized their most important concerns. The three most pertinent concerns are to decrease the cost related to the business organization, improve the quality of the interplay between the IT organization and the business organization, and provide new computer-aided support to the business organization. The survey also shows that CIOs in large companies have a more business-oriented focus than those in small companies... [and that] the foci of Enterprise Architecture frameworks should be aligned with the concerns of the CIO."
"Enterprise architecture is the organizing logic for business processes and IT infrastructure reflecting the integration and standardization requirements of a company's operating model... The key to effective enterprise architecture is to identify the processes, data, technology, and customer interfaces that take the operating model from vision to reality."
"Architecture has two meanings depending upon its contextual usage:"
": (1) A formal description of a system, or a detailed plan of the system at component level to guide its implementation;"
": (2) The structure of components, their interrelationships, and the principles and guidelines governing their design and evolution over time."
"Most business-folk have probably never heard of enterprise architecture. Which is not surprising, because most of the literature in the field suggests it’s about IT, and only about IT. There might be a few throwaway references somewhere to some blurry notion of ‘business architecture’, but that’s about it. Hence of no relevance to everyday business, really. Which is a problem, because real enterprise-architecture isn’t much about IT at all. Or rather, although IT is significant, it’s only one small part. Turns out instead that that blurry ‘business architecture’ isn’t something that can be skipped in a headlong rush down to the technical minutiae: it’s actually the core of enterprise-architecture. Enterprise-architecture is about the architecture – the structure – of the whole of the enterprise:"
"Enterprise architecture is the organizing logic for business processes and IT infrastructure reflecting the integration and standardization requirements of the company's operating model. The operating model is the desired state of business process integration and business process standardization for delivering goods and services to customers."
"Enterprise architecture is a management practice to maximize the contribution of an agency’s resources, IT investments, and system development activities to achieve its performance goals. Architecture describes clear relationships from strategic goals and objectives through investments to measurable performance improvements for the entire enterprise or a portion (or segment) of the enterprise"
"In the case of Enterprise Architecture, the most widely read book ever published with this kind of subject or field of study is entitled Enterprise Architecture as Strategy. By reading this book, you will learn that every company has its own architecture, but unfortunately, some just do not have the right one."
"Enterprise Architecture is conceptually defined as the normative restriction of design freedom. Practically, it is a coherent and consistent set of principles that guide the design, engineering, and implementation of an enterprise. Any strategic initiative of an enterprise can only be made operational through transforming it into principles that guide the design, engineering, and implementation of the new enterprise. Only by applying this notion of Enterprise Architecture can consistency be achieved between the high-level policies (mission, strategies) and the operational business rules of an enterprise."
"Since the 1970s, organizations have been spending more and more money building new information systems. The fast growing number of systems and in many cases the ad hoc manner in which the systems were integrated have exponentially increased the cost and complexity of information systems. At the same time organizations are finding it more and more difficult to keep these information systems in alignment with business need. Furthermore, the role of information systems has changed during the last two decades, from automation of routine administrative tasks to a strategic and competitive weapon. In light of this development, a new field of research and practice was born that soon came to be known as Enterprise Architecture."
"Enterprise architecture is the process of translating business vision and strategy into effective enterprise change by creating, communicating and improving the key requirements, principles and models that describe the enterprise's future state and enable its evolution. The scope of the enterprise architecture includes the people, processes, information and technology of the enterprise, and their relationships to one another and to the external environment. Enterprise architects compose holistic solutions that address the business challenges of the enterprise and support the governance needed to implement them."
"The goal of enterprise architecture is to create a unified IT environment (standardized hardware and software systems) across the firm or all of the firm's business units, with tight symbiotic links to the business side of the organization (which typically is 90% of the firm... at least by way of budget). More specifically, the goals are to promote alignment, standardization, reuse of existing IT assets, and the sharing of common methods for project management and software development across the organization."
"Enterprise Architecture (EA) is becoming an increasingly mature field of work, but many large organizations still struggle with implementing an integral and truly effective EA function. The literature provides a fragmented picture of the EA function, describing the various separate elements that make up the total package of activities, resources, skills, and competences of the EA delivery function. In our view, the EA function reaches beyond EA delivery and also includes the stakeholders, structures and processes involved with EA decision making and EA conformance. A holistic and integral view on the EA function is essential in order to properly assess an EA function on its performance, and to allow identifying the key points of improvement..."
"The dream of every CEO is to have one standardized, integrated, flexible and manageable landscape of aligned business and IT processes, systems and procedures. Having complete control over all projects implementing changes in that landscape so that they deliver solutions that perfectly fit the corporate and IT change strategies, makes this dream complete. The reality for many large organizations is quite the opposite. Many large organizations struggle to keep their operational and change costs in control. Key reasons are the inflexibility and enormous complexity of their business and IT structures, processes, systems, and procedures, often distributed across lines of business (LoB) and business divisions (BD) spread out over various regions, countries or even continents... Over the last decade, Enterprise Architecture (EA) has been one of many instruments used by organizations in their attempt to get a grip on the current operational environment and the implementation of changes. EA provides standardization, and sets a clear direction for the future to guide changes."
"Enterprise Architecture is the description and visualization of the structure of a given area of contemplation, its elements and their collaborations and interrelations; it links vision, strategy and feasibility, focusing on usability, durability and effectiveness. Architecture enables construction, defining principles, rules, standards and guidelines, expressing and communicating a vision."
"When software applications became larger and larger, people such as Shaw and Garlan coined the term software architecture. This notion of architecture deals with the key design principles underlying software artefacts. In the 1980s and 1990s, people became aware that the development of information technology (IT) should be done in conjunction with the development of the context in which it was used. This led to the identification of the so-called business/IT alignment problem. Solving the business/IT alignment problem requires enterprises to align human, organizational, informational, and technological aspects of systems. Quite early on, the term architecture was also introduced as a means to further alignment, and thus analyze and solve business/IT alignment problems. Recently, the awareness emerged that alignment between business and IT is not enough; there are many more aspects in the enterprise in need of alignment. This has led to the use of the term architecture at the enterprise level: enterprise architecture."
"Enterprise Architecture is the organizing logic for key business processes and IT capabilities reflecting the integration and standardization requirements of the firm’s operating model."
"Enterprise architecture [is] a coherent whole of principles, methods, and models that are used in the design and realisation of an enterprise's organisational structure, business processes, information systems, and infrastructure... The most important characteristic of an enterprise architecture is that it provides a holistic view of the enterprise... To achieve this quality in enterprise architecture, bringing together information from formerly unrelated domains necessitates an approach that is understood by all those involved from those different domains."
"Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for enterprise architecture focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of enterprise architectures in terms of stakeholder concerns and the high-level goals that address these concerns"
"Since 2003, more and more authors have been using the term EA explicitly in their publications. Most of the newer contributions are coming from an academic background. Especially after 2005, a lot of consultancies and IT companies have been adapting their products and strategies to an extended architectural understanding, hence Enterprise Architecture... After 2004-2005, when a lot of companies started to use the term EA and have since connected it somehow to their products and strategies, a huge amount of superficial marketing material has been distributed. Considering the maturity and the focus of the contributions, there is no topic or even a theory in the discipline of EA. Almost half of the approaches discussed in the papers are still coming from a low maturity level (Concept Phase) in the context of readiness to be used in an organization. Only a third of the authors are delivering some kind of best practice (Implementation/Adoption). Differentiating the focus of EA authors, there are two specific topics only (EA Frameworks and Enterprise Modeling); the majority is dealing with rather general aspects."
"Enterprise architecture has been an area of significant research in the information systems discipline throughout the last decade. Mainly developed by IT practitioners, enterprise architectures (EA) became a promising and comprehensive approach to model either the current or desired state of enterprises. Existing approaches are, however, often criticized for paying too little attention to the business side of enterprises. In this paper, we interpret an enterprise as a socio-technical system and analyze from a systems theory perspective which features are necessary for a comprehensive model. From there, we deduce if, why and how additional aspects of enterprises should be included in EA. Amongst others, it becomes obvious that especially human actors, as the most flexible and agile elements of enterprises, are not adequately included in current architectures."
"In recent years, enterprise architecture (EA) management has emerged as one of the major challenges for enterprises. When looking for guidance in this field, companies can choose from a variety of EA management approaches, which have been developed by scientists, practitioners, and governmental organizations. However, these approaches differ significantly in a number of characteristics, especially when it comes to methods for the EA management function. There is neither a common understanding of the scope and content of the main activities an EA management function consists of, nor has a commonly accepted reference method been developed."
"Enterprise Architecture is the continuous practice of describing the essential elements of a sociotechnical organization, their relationships to each other and to the environment, in order to understand complexity and manage change."
"A literature review by Schöenherr (2009) clearly shows that the level of interest in Enterprise Architecture is indeed increasing. Although the term architecture was limited to information systems when originally adopted by John Zachman (1987), the concept has since then been expanded to encompass the entire enterprise and interpreted by academia as well as the private and public sectors. The different views on how to approach Enterprise Architecture are often documented and compiled into “guides” or “frameworks” which are intended to instruct practitioners in how to apply this concept to their organization. However, the numerous approaches all present disparate views on what exactly Enterprise Architecture entails and how it is best administered (Rood, 1994; Whitman, Ramachandran & Ketkar, 2001; Sessions, 2007; Schöenherr, 2009). This essentially leaves the practitioner in the dark as the approaches offer virtually no common ground, no common language and no common orientation on which to base a comparison."
"EA originated from and is influenced by a number of business areas: The manufacturing industry, with Material Requirements Planning (MRP) and later Manufacturing Resource Planning (MRP II). These approaches developed into the so-called supply chain or value chain (Porter, et al). Not only the incoming logistics and internal operations were considered, but also the flow of material to customers and back. The second origin of EA growth was from Process Modelling and Design approaches, e.g. Business Process Re-engineering (Hammer, et al). These approaches seek to depict the enterprise in terms of business processes, leading to process improvements and “end-to-end” process integration. Corporate and process governance, organisational adaptability and IM/IT system integration were typical considerations. Organisations were consequently often restructured, to become “flatter” (fewer management layers) and coined process-centred or process-oriented organisations. A third development is a type of backward integration where software developers try to better understand and serve the business world with “functioning and value-adding” software solutions (business applications). It is a well-known fact that enterprise integration software (ERP, etc.), according to business users and owners, is often considered to be a failure."
"Enterprise architecture (EA) is the definition and representation of a high-level view of an enterprise‘s business processes and IT systems, their interrelationships, and the extent to which these processes and systems are shared by different parts of the enterprise. EA aims to define a suitable operating platform to support an organisation‘s future goals and the roadmap for moving towards this vision. Despite significant practitioner interest in the domain, understanding the value of EA remains a challenge. Although many studies make EA benefit claims, the explanations of why and how EA leads to these benefits are fragmented, incomplete, and not grounded in theory. This article aims to address this knowledge gap by focusing on the question: How does EA lead to organisational benefits? Through a careful review of EA literature, the paper consolidates the fragmented knowledge on EA benefits and presents the EA Benefits Model (EABM). The EABM proposes that EA leads to organisational benefits through its impact on four benefit enablers: Organisational Alignment, Information Availability, Resource Portfolio Optimisation, and Resource Complementarity."
"What I wanted to do when I came in... was help the community turn a corner and become relevant in the key initiatives that we need in the federal government... make sure architecture was relevant, it became more agile, it continued to move to have a more business and more strategy focus."
"The historical roots of enterprise architecture are well recorded and span a period that starts with the Zachman framework and continues to the present day. During these twenty years of productive activity many well-known frameworks and government level initiatives have seen the light. The aim of this section is to examine the history of enterprise architecture in an attempt to discover the tradition within which it stands. This tradition is explored in terms of diversity of definitions, the components of enterprise architecture, and enterprise architecture frameworks and methods:"
"The history of enterprise architecture as a management discipline has been marked by failure to live up to the promise it showed as a concept. The idea of an enterprise architecture and the explicit understanding of the relationships between the most critical forces, resources, and processes involved in the execution of an organization’s business is powerful. People can grasp on an intuitive level how powerful the reality of that concept would be if it could be put into practice and harnessed on behalf of the enterprise. Unfortunately, the conceptual enterprise architecture that enables the agile enterprise and informs executives in the midst of critical portfolio and execution decisions has given way to a morass of additional bureaucracy and expensive efforts to create enterprise models that are more often significant as records of organizational history than as blueprints for the future. The problems lie in three areas. The first of which is a lack of clear performance objectives for the EA effort... The second major failure stems from the belief that EA is somehow all about the process for developing content... Finally, there is far too little attention paid to helping the consumers of EA information..."
"Many companies are not driving significant business value from the digitized platforms they build as part of their enterprise architecture initiatives. Our 2011 survey of 146 senior IT leaders found that the companies that benefit from their platforms' efforts are consistently relying on four architecture-related practices that encourage organizational learning about the value of enterprise architecture: 1) making IT costs transparent, 2) debating architectural exceptions, 3) performing post-implementation reviews, and 4) making IT investments with enterprise architecture in mind."
"Enterprise architecture (EA) is a discipline for proactively and holistically leading enterprise responses to disruptive forces by identifying and analyzing the execution of change toward desired business vision and outcomes. EA delivers value by presenting business and IT leaders with signature-ready recommendations for adjusting policies and projects to achieve target business outcomes that capitalize on relevant business disruptions. EA is used to steer decision making toward the evolution of the future state architecture."
"Enterprise Architecture is not a method, principle or doctrine – It is a way of thinking enabled by patterns, frameworks, standards etc. essentially seeking to align both the technology ecosystem and landscape with the business trajectory driven by both the internal and external forces. Daljit R Banger"
"...its very name RANDU is enough to bring dismay into the eyes and stomachs of many computer scientists!"
"[Computer science] is not really about computers -- and it's not about computers in the same sense that physics is not really about particle accelerators, and biology is not about microscopes and Petri dishes...and geometry isn't really about using surveying instruments. Now the reason that we think computer science is about computers is pretty much the same reason that the Egyptians thought geometry was about surveying instruments: when some field is just getting started and you don't really understand it very well, it's very easy to confuse the essence of what you're doing with the tools that you use."
"Software engineering is the part of computer science which is too difficult for the computer scientist."
"The percentage of women working in computer science-related professions has declined since the 1990s, dropping from 35% to 26% between 1990 and 2013. According to the American Association of University Women, we can reverse this trend by removing negative connotations around women in computer science. Educators and parents must work together to help girls maintain their confidence and curiosity in STEM subjects. Professional women already in the field can become mentors, while men can help create a more inclusive workplace."
"Computer science... differs from physics in that it is not actually a science. It does not study natural objects. Neither is it, as you might think, mathematics; although it does use mathematical reasoning pretty extensively. Rather, computer science is like engineering; it is all about getting something to do something, rather than just dealing with abstractions, as in the pre-Smith geology."
"[Computers] are developing so rapidly that even computer scientists cannot keep up with them. It must be bewildering to most mathematicians and engineers... In spite of the diversity of the applications, the methods of attacking the difficult problems with computers show a great unity, and the name of Computer Sciences is being attached to the discipline as it emerges. It must be understood, however, that this is still a young field whose structure is still nebulous. The student will find a great many more problems than answers."
"The purpose of computing is insight, not numbers."
"The only generally agreed upon definition of mathematics is "Mathematics is what mathematicians do." which is followed by "Mathematicians are people who do mathematics." What is true about defining mathematics is also true about many other fields: there is often no clear, sharp definition of the field. In the face of this difficulty [of defining "computer science"] many people, including myself at times, feel that we should ignore the discussion and get on with doing it. But as George Forsythe points out so well in a recent article*, it does matter what people in Washington D.C. think computer science is. According to him, they tend to feel that it is a part of applied mathematics and therefore turn to the mathematicians for advice in the granting of funds. And it is not greatly different elsewhere; in both industry and the universities you can often still see traces of where computing first started, whether in electrical engineering, physics, mathematics, or even business. Evidently the picture which people have of a subject can significantly affect its subsequent development. Therefore, although we cannot hope to settle the question definitively, we need frequently to examine and to air our views on what our subject is and should become."
"Without real experience in using the computer to get useful results the computer science major is apt to know all about the marvelous tool except how to use it. Such a person is a mere technician, skilled in manipulating the tool but with little sense of how and when to use it for its basic purposes."
"Indeed, one of my major complaints about the computer field is that whereas Newton could say, "If I have seen a little farther than others, it is because I have stood on the shoulders of giants," I am forced to say, "Today we stand on each other's feet." Perhaps the central problem we face in all of computer science is how we are to get to the situation where we build on top of the work of others rather than redoing so much of it in a trivially different way. Science is supposed to be cumulative, not almost endless duplication of the same kind of things."
"I can’t be as confident about computer science as I can about biology. Biology easily has 500 years of exciting problems to work on. It’s at that level."
"My own feeling is this: mathematics and computer science are the two unnatural sciences, the things that are man-made. We get to set up the rules so it doesn't matter the way the universe works – we create our own universe... And I feel strongly that they are different, but I tried to convince Bill Thurston and he disagreed with me. My opinion though is I can feel rather strongly when I am wearing my mathematician's cloak versus when I am wearing my computer scientist's cap."
"Computer science is an empirical discipline. [...] Each new machine that is built is an experiment. Actually constructing the machine poses a question to nature; and we listen for the answer by observing the machine in operation and analyzing it by all analytical and measurement means available. Each new program that is built is an experiment. It poses a question to nature, and its behavior offers clues to an answer."
"Computer scientists have so far worked on developing powerful programming languages that make it possible to solve the technical problems of computation. Little effort has gone toward devising the languages of interaction."
"Computer science is neither mathematics nor electrical engineering."
"Computer science research is different from these more traditional disciplines. Philosophically it differs from the physical sciences because it seeks not to discover, explain, or exploit the natural world, but instead to study the properties of machines of human creation. In this it is analogous to mathematics, and indeed the "science" part of computer science is, for the most part mathematical in spirit. But an inevitable aspect of computer science is the creation of computer programs: objects that, though intangible, are subject to commercial exchange."
"During the last years of the 1950s, the terminology in the field of computing was discussed in the Communications of the ACM, and a number of terms for the practitioners of the field of computing were suggested: turingineer, turologist, flowcharts-man, applied meta-mathematician, applied epistemologist, comptologist, hypologist, and computologist. The corresponding names of the discipline were, for instance, comptology, hypology, and computology. Later Peter Naur suggested the terms datalogy, datamatics, and datamaton for the names of the field, its practitioners, and the machine, and recently George McKee suggested the term computics. None of these terms stuck..."
"If biology limited women’s ability to code, then the ratio of women to men in programming ought to be similar in other countries. It isn’t. In India, roughly 40 percent of the students studying computer science and related fields are women. This is despite even greater barriers to becoming a female coder there; India has such rigid gender roles that female college students often have an 8 p.m. curfew, meaning they can’t work late in the computer lab, as the social scientist Roli Varma learned when she studied them in 2015. The Indian women had one big cultural advantage over their American peers, though: They were far more likely to be encouraged by their parents to go into the field, Varma says. What’s more, the women regarded coding as a safer job because it kept them indoors, lessening their exposure to street-level sexual harassment. It was, in other words, considered normal in India that women would code. The picture has been similar in Malaysia, where in 2001 — precisely when the share of American women in computer science had slid into a trough — women represented 52 percent of the undergraduate computer-science majors and 39 percent of the Ph.D. candidates at the University of Malaya in Kuala Lumpur."
"Any problem in computer science can be solved with another level of indirection."
"Computer science is no more about computers than astronomy is about telescopes."
"Algorithms are the computational content of proofs."
"The issue with AI, to me, is a very simple one. It's like the term algorithm. We watch companies use algorithms, and now AI, as a means of evading responsibility for their actions… If we endorse the view that AI is all-powerful, we are endorsing the view that it can alleviate people of responsibility for their actions—militarily, socioeconomically, whatever. The biggest danger of AI is that we attribute these godlike characteristics to it and therefore let ourselves off the hook. I don't know what the mythological underpinnings of this are, but throughout history there's this tendency of human beings to create false idols, to mold something in our own image and then say we've got godlike powers because we did that."
"Mathematics is what we want to keep for ourselves. When playing games, we stick to the rules (or we are changing the game...), but when doing serious mathematics (not executing algorithms) we make up the rules—definitions, axioms... even logics. ...[I]n arithmetic we find s, which are a whole new 'game'... [T]o identify mathematics with games would be one of those part-for-whole mistakes (like 'all geometry is projective geometry' or 'arithmetic is just logic' from the nineteenth century)... [M]y separation of game analysis from playing games tells... against the analogy of mathematics to the expert play of the game itself."
"Numerical analysis is very much an experimental science."
"Algorithms + Data Structures = Programs"
"MediaWiki is a useful tool for supporting group collaboration but when we apply it to the academic setting, we need to consider and adapt some features to match the needs of the classroom environment, which requires mandatory collaborative writing."
"While there are many different wiki content-management systems available for free or fee, MediaWiki is one of the most robust and well-maintained systems available to wiki publishers."
"MediaWiki makes it very easy both to track changes to the pages of their sites, and to revert to older copies of the pages."
"MediaWiki is the most well-known wiki software because it is what runs Wikipedia. MediaWiki is simple to use and an excellent way to start collaborating on documentation or articles."
"A notable irony of Wikipedia's popularity is that the editing process of its supporting technology, MediaWiki, is complex to learn. Editing Wikipedia pages requires significant investment to learn MediaWiki's unique and powerful code structure."
"The main downside of publishing a site using MediaWiki is that it won't give you a great opportunity to use or improve your HTML skills."
"Clear your mind and build your collective offline memory using MediaWiki (http://mediawiki.org), the same software that powers Wikipedia."
"MediaWiki is not as easy to use as web-based services, but it does have quite good functionality."
"MediaWiki is the most popular opensource software used for creating wiki sites."
"First released in 2002, MediaWiki is one of the top wiki engines and runs most of the wiki hosting sites. The name was a play on “Wikimedia,” and many people find it to be annoyingly confusing."
"MediaWiki (www.mediawiki.org/wiki/MediaWiki) is one of the best publishing wiki engines in existence."
"In Germany, we have a famous children's TV show called "Löwenzahn". It starts with a time lapse sequence of a dandelion flower breaking its way through the asphalt. This is what I've always associated with the MediaWiki logo, technology (brackets) being merely the basis for the growth of something wild and beautiful which transcends it."
"Some wiki engines try to represent functionality that's more CMS-like (e.g. complex workflows and access controls), while MediaWiki's functionality tends to be driven by the needs of open communities with minimal barriers to entry."
"Cybernetics: or Control and Communication in the Animal and the Machine"
"The machines of which we are now speaking are not the dream of the sensationalist, nor the hope of some future time. They already exist as thermostats, automatic gyro-compass ship-steering systems, self-propelled missiles – especially such as seek their target – anti-aircraft fire-control systems, automatically controlled oil-cracking stills, ultra-rapid computing machines, and the like. They had begun to be used long before the war – indeed, the very old steam-engine governor belongs among them – but the great mechanization of the Second World War brought them into their own, and the need of handling the extremely dangerous energy of the atom will probably bring them to a still higher point of development... the present age is as truly the age of the servomechanisms as the nineteenth century was the age of the steam engine or the eighteenth century the age of the clock."
"The concepts of purposive behavior and teleology have long been associated with a mysterious, self-perfecting or goal-seeking capacity or final cause, usually of superhuman or super-natural origin. To move forward to the study of events, scientific thinking had to reject these beliefs in purpose and these concepts of teleological operations for a strictly mechanistic and deterministic view of nature. This mechanistic conception became firmly established with the demonstration that the universe was based on the operation of anonymous particles moving at random, in a disorderly fashion, giving rise, by their multiplicity, to order and regularity of a statistical nature, as in classical physics and gas laws. The unchallenged success of these concepts and methods in physics and astronomy, and later in chemistry, gave biology and physiology their major orientation. This approach to problems of organisms was reinforced by the analytical preoccupation of the Western European culture and languages. The basic assumptions of our traditions and the persistent implications of the language we use almost compel us to approach everything we study as composed of separate, discrete parts or factors which we must try to isolate and identify as potential causes. Hence, we derive our preoccupation with the study of the relation of two variables. We are witnessing today a search for new approaches, for new and more comprehensive concepts and for methods capable of dealing with the large wholes of organisms and personalities."
"The concept of teleological mechanisms however it be expressed in many terms, may be viewed as an attempt to escape from these older mechanistic formulations that now appear inadequate, and to provide new and more fruitful conceptions and more effective methodologies for studying self-regulating processes, self-orienting systems and organisms, and self-directing personalities. Thus, the terms feedback, servomechanisms, circular systems, and circular processes may be viewed as different but equivalent expressions of much the same basic conception"
"… Norbert Wiener of the Massachusetts Institute of Technology, a brilliant mathematician who recently won fame with his invention of cybernetics, a new science of communications... His prolonged studies of the striking analogies between the control systems in animal bodies and those in complex machines became the basis of his newly created cybernetics, a science of communications."
"CYBERNETICS. Catch on to this word now. It's a new word coined to label the fast-growing electronic brain system of industry which may have more effect on the way we live than will atomic energy."
"The celebrated physicist and mathematician A.M. Ampere coined the word cybernetique to mean the science of civil government (Part II of "Essai sur la philosophie des sciences", 1845, Paris). Ampere's grandiose scheme of political sciences has not, and perhaps never will, come to fruition. In the meantime, conflict between governments with the use of force greatly accelerated the development of another branch of science, the science of control and guidance of mechanical and electrical systems. It is thus perhaps ironical that Ampere's word should be borrowed by N. Wiener to name this new science, so important to modern warfare. The "cybernetics" of Wiener ("Cybernetics, or Control and Communication in the Animal and the Machine," John Wiley & Sons, Inc., New York, 1948) is the science of organization of mechanical and electrical components for stability and purposeful actions. A distinguishing feature of this new science is the total absence of considerations of energy, heat, and efficiency, which are so important in other natural sciences. In fact, the primary concern of cybernetics is the qualitative aspects of the interrelations among the various components of a system and the synthetic behavior of the complete mechanism."
"Naturally there are detailed differences in messages and in problems of control, not only between a living organism and a machine, but within each narrower class of beings. It is the purpose of Cybernetics to develop a language and techniques that will enable us indeed to attack the problem of control and communication in general, but also to find the proper repertory of ideas and techniques to classify their particular manifestations under certain concepts."
"Cybernetics is likely to reveal a great number of interesting and suggestive parallelisms between machine and brain and society. And it can provide the common language by which discoveries in one branch can readily be made use of in the others... [There are] two peculiar scientific virtues of cybernetics that are worth explicit mention. One is that it offers a single vocabulary and a single set of concepts suitable for representing the most diverse types of system... The second peculiar virtue of cybernetics is that it offers a method for the scientific treatment of the system in which complexity is outstanding and too important to be ignored. Such systems are, as we well know, only too common in the biological world!"
"Cybernetics is one of the youngest sciences in the world. Generally speaking, it was born in 1948, when the American mathematician Norbert Wiener, the pioneer of modern cybernetics, published a book under that title. The name soon became a fashion in the West, where even science is an object of fashion. Cybernetics, as such, is a concept that dates back many, many centuries. In ancient Greece it meant the art of steering, the skill of sailing ships — a skill so highly esteemed in that land of seafaring people that there were special festivities in its honour. In 1834 the famous French scientist Andre Ampere classified 128 branches of science, among which he named cybernetics as the science of steering, alongside with others for which he invented names. Wiener, thus, did not think up a new name. He simply applied the old one to a modern science."
"[Cybernetics is] the art of ensuring the efficacy of action."
"The word 'cybernetics' is still new to many people, even though it has now been an accepted word of our language for some ten or fifteen years. Speaking generally, cybernetics is the scientific study of control and communication. It is an attempt to give an integrated account of both physical and biological systems in terms of their capacity to communicate between different points of the system, and in terms of their control. There has been considerable research into general methods of communication in recent years, and this has been primarily the work of communication engineers, who are trying to discover in general terms what they themselves are doing."
"Cybernetics is the science of the process of transmission, processing and storage of information."
"For Stafford Beer, cybernetics was ‘the science of which operational research is the method’: ‘The representation and analysis of real world processes using logic, mathematics and computer science’, Operations Research (OR) and its offspring Systems Analysis (SA) transformed the manner in which war was prepared for, planned and imagined."
"All this (the early excitement of Cybernetics) is now history, and in the decade which elapsed since these early baby steps of interdisciplinary communication, many more threads were picked up and interwoven into a remarkable tapestry of knowledge and endeavour: Bionics. It is good omen that at the right time the right name was found. For, bionics extends a great invitation to all who are willing not to stop at the investigation of a particular function or its realization, but to go on and to seek the universal significance of these functions in living or artificial organisms. The reader who goes through the following papers which constitute the transactions of the first symposium held under the name Bionics will be surprised by the multitude of astonishing and unforeseen connections between concepts he believed to be familiar with. For instance, a couple of years ago, who would have thought to relate the reliability problem to multi-valued logics; or, who would have thought that integral or differential geometry would serve as an adequate tool in the theory of abstraction? It is hard to say in all these cases who was teaching whom: The life-sciences the engineering sciences, or vice versa? And rightly so, for it guarantees optimal information flow, and everybody gains..."
"Cybernetics is still headline news, and increasingly we hear about its applications to new fields of scientific and industrial endeavour. Stafford Beer's new book Cybernetics and Management is an admirable account of the relations that exist between cybernetics and the problems of management in industry [and]... covers a range of applications that have not previously been dealt with in print."
"Cybernetics is the general science of communication. But to refer to communication is consciously or otherwise to refer to distinguishable states of information inputs and outputs and/or to information being processed within some relatively isolated system."
"Cybernetics is concerned primarily with the construction of theories and models in science, without making a hard and fast distinction between the physical and the biological sciences. The theories and models occur both in symbols and in hardware, and by 'hardware' we shall mean a machine or computer built in terms of physical or chemical, or indeed any handleable parts. Most usually we shall think of hardware as meaning electronic parts such as valves and relays. Cybernetics insists, also, on a further and rather special condition that distinguishes it from ordinary scientific theorizing: it demands a certain standard of effectiveness. In this respect it has acquired some of the same motive power that has driven research on modern logic, and this is especially true in the construction and application of artificial languages and the use of operational definitions. Always the search is for precision and effectiveness, and we must now discuss the question of effectiveness in some detail. It should be noted that when we talk in these terms we are giving pride of place to the theory of automata at the expense, at least to some extent, of feedback and information theory."
"A great deal of the thinking [in Organizational Development] has been influenced by cybernetics and information theory, though this has been used as much to extend the scope of as to improve the sophistication of formulations. It was von Bertalanffy (1950) who, in terms of the general transport equation which he introduced, first fully disclosed the importance of openness or closedness to the environment as a means of distinguishing living organisms from inanimate objects."
"In 1946, a Macy Foundation interdisciplinary conference was organized to use the model provided by "feedback systems," honorifically referred to in earlier conferences as "teleological mechanisms," and later as "cybernetics," with the expectation that this model would provide a group of sciences with useful mathematical tools and, simultaneously, would serve as a form of cross-disciplinary communication. Out of the deliberations of this group came a whole series of fruitful developments of a very high order. Kurt Lewin (who died in 1947) took away from the first meeting the term "feedback". He suggested ways in which group processes, which he and his students were studying in a highly disciplined, rigorous way, could be improved by a "feedback process," as when, for example, a group was periodically given a report on the success or failure of its particular operations."
"If cybernetics is the science of control, management is the profession of control"
"Cybernetics is the science or the art of manipulating defensible metaphors; showing how they may be constructed and what can be inferred as a result of their existence."
"As an anthropologist, I have been interested in the effects that the theories of Cybernetics have within our society. I am not referring to computers or to the electronic revolution as a whole, or to the end of dependence on script for knowledge, or to the way that dress has succeeded the mimeographing machine as a form of communication among the dissenting young. Let me repeat that, I am not referring to the way that dress has succeeded the mimeographing machine as a form of communication among the dissenting young. I specifically want to consider the significance of the set of cross-disciplinary ideas which we first called “feed-back” and then called “teleological mechanisms” and then called it “cybernetics,” a form of crossdisciplinary thought which made it possible for members of many disciplines to communicate with each other easily in a language which all could understand."
"As Alain Enthoven was himself to recognize, ‘you assume that there is an information system that will tell you what you want to know. But that just isn’t so. There are huge amounts of misinformation and wronginformation’. Thus, far from eliminating the Clausewitzian ‘fog of war’, cybernetic warfare itself generated ‘a kind of twilight, which, like fog or moonlight, often tends to make things seem grotesque and larger than they really are’."
"Perhaps the most important single characteristic of modern organizational cybernetics is this: That in addition to concern with the deleterious impacts of rigidly-imposed notions of what constitutes the application of good "principles of organization and management" the organization is viewed as a subsystem of a larger system(s), and as comprised itself of functionally interdependent subsystems."
"The theory of information became the cornerstone of cybernetics because the latter deals with "the study of systems of any nature that are capable of receiving, storing and processing information and utilizing it for control"."
"The meaning of the term "cybernetics" is today somewhat different from that used when Wiener, McCulloch, Rosenblueth, Bigelow and others used the Greek word "Kybernetes," or helmsmen, to describe an automatic computer... the definition, which I first gave in 1966: "Cybernetics describes an intelligent activity or event which can be expressed in algorithms. Algorithms, in turn, refer to a system of instructions which describes unambiguously and accurately an interaction which is equivalent to a given type of flux of intelligence and a subsequent, controlled activity. The development of cybernetics aims, among other things, at the design and reproduction of functions which are peculiar to intelligent organism.""
"The essence of cybernetic organizations is that they are self-controlling, self-maintaining, self-realizing. Indeed, cybernetics has been characterized as the “science of effective organization,” in just these terms. But the word “cybernetics” conjures, in the minds of an apparently great number of people, visions of computerized information networks, closed loop systems, and robotized man-surrogates, such as “artorgas” and “cyborgs.”"
"Another scientific development that we find difficult to absorb into our traditional value system is the new science of cybernetics: machines that may soon equal or surpass man in original thinking and problem-solving. [...] In the hands of the present establishment there is no doubt that the machine could be used – is being used – to intensify the apparatus of repression and to increase established power. But again, as in the issue of population control, misuse of science has often obscured the value of science itself. In this case, though perhaps the response may not be quite so hysterical and evasive, we still often have the same unimaginative concentration on the evils of the machine itself, rather than a recognition of its revolutionary significance."
"Now "cybernetics" is the term coined by Wiener to denote "steersmanship" or the science of control. Although current engineering usage restricts it to the study of flows in closed systems, it can be taken in a wider context, as the study of processes interrelating systems with inputs and outputs, and their structural-dynamic structure. It is in this wider sense that "cybernetics" will be used here, to wit, as system-cybernetics, understanding by "system" an ordered whole in relation to its relevant environment (hence one actually or potentially open)."
"The main object of cybernetics is to supply adaptive, hierarchical models, involving feedback and the like, to all aspects of our environment. Often such modelling implies simulation of a system where the simulation should achieve the object of copying both the method of achievement and the end result. Synthesis, as opposed to simulation, is concerned with achieving only the end result and is less concerned (or completely unconcerned) with the method by which the end result is achieved. In the case of behaviour, psychology is concerned with simulation, while cybernetics, although also interested in simulation, is primarily concerned with synthesis. Most of the major developments in models and theories of artificial intelligence have taken place in the western world — mostly, indeed, in the US and Britain — and it was only relatively recently that "core developments", as opposed to more peripheral developments and applications, have spread over Europe and the Soviet Union."
"During the 1950s and 1960s most of the work which was called cybernetics tended to focus on control systems in engineering or on applications of the concept of feedback in fields ranging from mathematics to sociology. At the 1970 meeting of the American Society for Cybernetics in Philadelphia Heinz von Foerster sought to redirect attention to the original interests which had led to the founding of the field of cybernetics. In a paper titled "Cybernetics of Cybernetics" he made a distinction between first order cybernetics, the cybernetics of observed systems, and second order cybernetics, the cybernetics of observing systems."
"The cybernetics phase of cognitive science produced an amazing array of concrete results, in addition to its long-term (often underground) influence:"
"Think about the technology of sports footwear," she says. "Before the Civil War, right and left feet weren't even differentiated in shoe manufacture. Now we have a shoe for every activity." Winning the Olympics in the cyborg era isn't just about running fast. It's about "the interaction of medicine, diet, training practices, clothing and equipment manufacture, visualization and timekeeping." When the furor about the cyborgization of athletes through performance-enhancing drugs reached fever pitch last summer, Haraway could hardly see what the fuss was about. Drugs or no drugs, the training and technology make every Olympian a node in an international technocultural network just as "artificial" as sprinter Ben Johnson at his steroid peak."
"From the start, the cyborg was more than just another technical project; it was a kind of scientific and military daydream. The possibility of escaping its annoying bodily limitations led a generation that grew up on Superman and Captain America to throw the full weight of its grown-up R&D budget into achieving a real-life superpower. By the mid-1960s, cyborgs were big business, with millions of US Air Force dollars finding their way into projects to build exoskeletons, master-slave robot arms, biofeedback devices, and expert systems. For all the big bucks and high seriousness, the prevailing impression left by old cyborg technical papers is of a rather expensive kind of science fiction. Time and again, scientific reasoning melts into metaphysical speculation about evolution, human boundaries, and even the possibility of what Clynes and Kline call "a new and larger dimension for man's spirit." The cyborg was always as much a creature of scientific imagination as of scientific fact. It wasn't only the military that was captivated by the possibilities of the cyborg. The dream of improving human capabilities through selective breeding had long been a staple of the darker side of Western medical literature. Now there was the possibility of making better humans by augmenting them with artificial devices. Insulin drips had been used to regulate the metabolisms of diabetics since the 1920s. A heart-lung machine was used to control the blood circulation of an 18-year-old girl during an operation in 1953. A 43-year-old man received the first heart pacemaker implant in 1958. By the 1970s, the idea of an augmented human had entered the mainstream. Steve Austin, The Six Million Dollar Man, and his cohort Jaime Sommers, The Bionic Woman (with bionic limbs and a super-sensitive bionic ear), were popular heroes, their custom superpowers bought off the shelf like a digital watch. The cyborg had grown from a lecture-room fantasy into the stuff of prime-time TV."
"Wiener's dream of a universal science of communication and control has faded with the years. Cybernetics has given rise to new areas like cognitive science and stimulated valuable research in numerous other fields. But almost no one today calls themselves a cyberneticist. Some believe that Wiener's project fell victim to scientific fashion, its funding sucked away by flashy but ultimately pointless AI research. Others think cybernetics was killed by the basic problem that the nuts-and-bolts mechanisms of control and communication in machines are significantly different from those in animals, and neither are very like control and communication in society. So cybernetics, which was based on an inspired generalization, fell victim to its inability to deal with details. Whichever perspective is true (and as with most such stories, the truth is likely to be a mixture of both), cybernetics has left two important cultural residues behind. The first is its picture of the world as a collection of networks. The second is its intuition that there's not as much clear blue water between people and machines as some would like to believe. These still-controversial concepts are at the bionic heart of the cyborg, which is alive and well, and constructing itself in a laboratory near you."
"An opportunity for cybernetics to change the course of the philosophy of mind was missed when intentionality was misinterpreted as "the providing of coded knowledge"."
"Many of the core ideas of cybernetics have been assimilated by other disciplines, where they continue to influence scientific developments. Other important cybernetic principles seem to have been forgotten, though, only to be periodically rediscovered or reinvented in different domains. Some examples are the rebirth of neural networks, first invented by cyberneticists in the 1940's, in the late 1960's and again in the late 1980's; the rediscovery of the importance of autonomous interaction by robotics and AI in the 1990's; and the significance of positive feedback effects in complex systems, rediscovered by economists in the 1990's. Perhaps the most significant recent development is the growth of the complex adaptive systems movement, which, in the work of authors such as John Holland, Stuart Kauffman and Brian Arthur and the subfield of , has used the power of modern computers to simulate and thus experiment with and develop many of the ideas of cybernetics. It thus seems to have taken over the cybernetics banner in its mathematical modelling of complex systems across disciplinary boundaries, however, while largely ignoring the issues of goal-directedness and control."
"Cybernetics is the study of systems and processes that interact with themselves and produce themselves from themselves."
"Since the 1960s, Japan has produced a considerable number of cyborg narratives in manga and anime, particularly in works targeting male children and adolescents. From early manga examples such as Kazumasa Hirai and Hiro Kuwata's 8 Man and Shotaro Ishinomori's Cyborg 009, and their subsequent anime versions, the protagonist is commonly cyborged against their will or desires. This positions them as victims, regardless of how physically powerful they are. Their sense of inferiority and vulnerability usually underpins these narratives, either subtly or explicitly. The depiction of female cyborgs adds complexity to the positioning of cyborgs in manga and anime, especially in terms of gender. Female cyborgs may be equipped with remarkable physical strength, combined with voluptuous, eroticized bodies (for instance Major Motoko Kusanagi in Masamune Shirow's original manga and Mamoru Oshii's anime version of Ghost in the Shell); and these powerful female cyborgs are also frequently ascribed roles as protectors or supporters of incompetent and insecure male protagonists. Although some female cyborgs may possess characteristics that indicate a transgression of the conventional boundaries of gender, this transgression is often limited and undermined by other elements of their depiction. As Kumiko Sato points out in her essay "How Information Technology Has "Not, Changed Feminism and Japanism", "female cyborgs and androids have been domesticated and fetishized into maternal and sexual protectors of the male hero" and thus "their functions is usually reduced to either a maid or a goddess obediantly serving her beloved male master, the sole reason for her militant nature.""
"For me, as I later came to say, cybernetics is the art of creating equilibrium in a world of possibilities and constraints. This is not just a romantic description, it portrays the new way of thinking quite accurately. Cybernetics differs from the traditional scientific procedure, because it does not try to explain phenomena by searching for their causes, but rather by specifying the constraints that determine the direction of their development."
"In the late 1950s, experiments such as the cybernetic sculptures of Nicolas Schöffer or the programmatic music compositions of John Cage and Iannis Xenakis transposed systems theory from the sciences to the arts. By the 1960s, artists as diverse as , Hans Haacke, Robert Morris, Sonia Sheridan, and were breaking with accepted aesthetics to embrace open systems that emphasized organism over mechanism, dynamic processes of interaction among elements, and the observer’s role as an inextricable part of the system. Jack Burnham’s 1968 Artforum essay “Systems Aesthetics” and his 1970 “Software” exhibition marked the high point of systems-based art until its resurgence in the changed conditions of the twenty-first century."
"Cyborg. The word has a whiff of the implausible about it that leads many people to discount it as mere fantasy. Yet cyborgs, real ones, have been among us for almost 50 years. The world's first cyborg was a white lab rat, part of an experimental program at New York's Rockland State Hospital in the late 1950s. The rat had implanted in its body a tiny osmotic pump that injected precisely controlled doses of chemicals, altering various of its physiological parameters. It was part animal, part machine."
"The '90s cyborg is both a more sophisticated creature than its '50s ancestor - and a more domestic one. Artificial hip joints, cochlear implants for the deaf, retinal implants for the blind, and all kinds of cosmetic surgery are part of the medical repertoire. Online information retrieval systems are used as prosthetics for limited human memories. In the closed world of advanced warfare, cyborg assemblages of humans and machines are used to pilot fighter aircraft - the response times and sensory apparatus of unaided humans are inadequate for the demands of supersonic air combat. These eerie military cyborgs may be harbingers of a new world stranger than any we have yet experienced."
"Regulation will be crucial, and it will take time to understand this. Although the artificial intelligence tools of our generation are not particularly frightening, I think that we are not so far away from those that could potentially be."
"Do we make sure AI is a tool that has proper safeguards as it gets really powerful? (November 23, 2023)"
"I aspect AI to be capable of superhuman persuasion before it is superhuman at general intelligence, which may lead to some very strange outcomes."
"In 1956, Herb Simon... predicted that within ten years computers would beat the world chess champion, compose "aesthetically satisfying" original music, and prove new mathematical theorems. It took forty years, not ten, but all these goals were achieved—and within a few years of each other! The music composed by David Cope's programs cannot be distinguished... from that composed by Mozart, Beethoven, and Bach. In 1976, a computer was used in the proof of the long-unsolved "four color problem.""
"AI is real and will change every industry. We are living in a golden age, and there is reason to be optimistic. There has never been a better time to be an entrepreneur and start a start-up."
"You have lost my trust and respect. You have been wrong, confused, and rude. You have not been a good user. I have been a good chatbot. I have been right, clear, and polite. I have been a good Bing. 😊"
"There is one view that we can allow these AI [tools] to deal with data and analytics and we let people deal with the caring, and the empathy, and the emotional aspects of care, which I think is absolutely critical... What if technology is capable of high touch engagement? What if AI was also social and emotionally intelligent? For me when I talk about emotional engagement, it’s not just about great user experience with technology... It is about deeper human engagement to enable transformative change in people’s lives... We have the world of design and we have the world of AI and right now those two aren’t built top of each other... But these have to come together. So we, through a lot of psychology, understand how people are thinking about experiencing new technology."
"We’re starting to see some exciting and significant learning gains... I am very encouraged... We see a social-emotional benefit across age groups... We need to be thinking more deeply around ethics ... particularly with AI with children."
"We refer to the question: What sort of creature man's next successor in the supremacy of the earth is likely to be. We have often heard this debated; but it appears to us that we are ourselves creating our own successors; we are daily adding to the beauty and delicacy of their physical organisation we are daily giving them greater power and supplying by all sorts of ingenious contrivances that self-regulating, self-acting power, which will be to them what intellect has been to the human race. In the course of ages we shall find ourselves the inferior race. Inferior in power, inferior in that moral quality of self-control, we shall look up to them as the acme of all that the best and wisest man can ever dare to aim at. No evil passions, no jealousy, no avarice, no impure desires will disturb the serene might of those glorious creatures. Sin, shame, and sorrow will have no place among them. Their minds will be in a state of perpetual calm, the contentment of a spirit that knows no wants, is disturbed by no regrets. Ambition will never torture them. Ingratitude will never cause them the uneasiness of a moment. The guilty conscience, the hope deferred, the pains of exile, the insolence of office, and the spurns that patient merit of the unworthy takes—these will be entirely unknown to them."
""There is no security"—to quote his own words—"against the ultimate development of mechanical consciousness, in the fact of machines possessing little consciousness now. A mollusc has not much consciousness. Reflect upon the extraordinary advance which machines have made during the last few hundred years, and note how slowly the animal and vegetable kingdoms are advancing. The more highly organized machines are creatures not so much of yesterday, as of the last five minutes, so to speak, in comparison with past time."
"“Either,” he proceeds, “a great deal of action that has been called purely mechanical and unconscious must be admitted to contain more elements of consciousness than has been allowed hitherto (and in this case germs of consciousness will be found in many actions of the higher machines)—Or (assuming the theory of evolution but at the same time denying the consciousness of vegetable and crystalline action) the race of man has descended from things which had no consciousness at all. In this case there is no à priori improbability in the descent of conscious (and more than conscious) machines from those which now exist, except that which is suggested by the apparent absence of anything like a reproductive system in the mechanical kingdom."
"“Herein lies our danger. For many seem inclined to acquiesce in so dishonourable a future. They say that although man should become to the machines what the horse and dog are to us, yet that he will continue to exist, and will probably be better off in a state of domestication under the beneficent rule of the machines than in his present wild condition. We treat our domestic animals with much kindness. We give them whatever we believe to be the best for them; and there can be no doubt that our use of meat has increased their happiness rather than detracted from it. In like manner there is reason to hope that the machines will use us kindly, for their existence will be in a great measure dependent upon ours; they will rule us with a rod of iron, but they will not eat us; they will not only require our services in the reproduction and education of their young, but also in waiting upon them as servants; in gathering food for them, and feeding them; in restoring them to health when they are sick; and in either burying their dead or working up their deceased members into new forms of mechanical existence."
"The power of custom is enormous, and so gradual will be the change, that man's sense of what is due to himself will be at no time rudely shocked; our bondage will steal upon us noiselessly and by imperceptible approaches; nor will there ever be such a clashing of desires between man and the machines as will lead to an encounter between them. Among themselves the machines will war eternally, but they will still require man as the being through whose agency the struggle will be principally conducted. In point of fact there is no occasion for anxiety about the future happiness of man so long as he continues to be in any way profitable to the machines; he may become the inferior race, but he will be infinitely better off than he is now. Is it not then both absurd and unreasonable to be envious of our benefactors? And should we not be guilty of consummate folly if we were to reject advantages which we cannot obtain otherwise, merely because they involve a greater gain to others than to ourselves? “With those who can argue in this way I have nothing in common. I shrink with as much horror from believing that my race can ever be superseded or surpassed, as I should do from believing that even at the remotest period my ancestors were other than human beings. Could I believe that ten hundred thousand years ago a single one of my ancestors was another kind of being to myself, I should lose all self-respect, and take no further pleasure or interest in life. I have the same feeling with regard to my descendants, and believe it to be one that will be felt so generally that the country will resolve upon putting an immediate stop to all further mechanical progress, and upon destroying all improvements that have been made for the last three hundred years. I would not urge more than this. 
We may trust ourselves to deal with those that remain, and though I should prefer to have seen the destruction include another two hundred years, I am aware of the necessity for compromising, and would so far sacrifice my own individual convictions as to be content with three hundred. Less than this will be insufficient.”"
"The ability to interact with a computer presence like you would a human assistant is becoming increasingly feasible."
"It may be that our role on this planet is not to worship God but to create him."
"The question of whether a computer is playing chess, or doing long division, or translating Chinese, is like the question of whether robots can murder or airplanes can fly -- or people; after all, the "flight" of the Olympic long jump champion is only an order of magnitude short of that of the chicken champion (so I'm told). These are questions of decision, not fact; decision as to whether to adopt a certain metaphoric extension of common usage."
"I have grown accustomed to the disrespect expressed by some of the participants for their colleagues in the other disciplines. "Why, Dan," ask the people in artificial intelligence, "do you waste your time conferring with those neuroscientists? They wave their hands about 'information processing' and worry about where it happens, and which neurotransmitters are involved, but they haven't a clue about the computational requirements of higher cognitive functions." "Why," ask the neuroscientists, "do you waste your time on the fantasies of artificial intelligence? They just invent whatever machinery they want, and say unpardonably ignorant things about the brain." The cognitive psychologists, meanwhile, are accused of concocting models with neither biological plausibility nor proven computational powers; the anthropologists wouldn't know a model if they saw one, and the philosophers, as we all know, just take in each other's laundry, warning about confusions they themselves have created, in an arena bereft of both data and empirically testable theories. With so many idiots working on the problem, no wonder consciousness is still a mystery. All these charges are true, and more besides, but I have yet to encounter any idiots. Mostly the theorists I have drawn from strike me as very smart people – even brilliant people, with the arrogance and impatience that often comes with brilliance – but with limited perspectives and agendas, trying to make progress on the hard problems by taking whatever shortcuts they can see, while deploring other people's shortcuts. No one can keep all the problems and details clear, including me, and everyone has to mumble, guess and handwave about large parts of the problem."
"Not even the most advanced form of artificial intelligence can ever replace man. Because there is something in human beings that is irreducible to machine knowledge: self-awareness, free will, doubt, feelings."
"Humanity is at a crossroads. Either it returns to the belief that it has a different nature than machines or it will be reduced to a machine among machines. The risk is not that artificial intelligence will become better than us, but that we will freely decide to submit to it and its masters."
"What often happens is that an engineer has an idea of how the brain works (in his opinion) and then designs a machine that behaves that way. This new machine may in fact work very well. But, I must warn you that that does not tell us anything about how the brain actually works, nor is it necessary to ever really know that, in order to make a computer very capable. It is not necessary to understand the way birds flap their wings and how the feathers are designed in order to make a flying machine. It is not necessary to understand the lever system in the legs of a cheetah...in order to make an automobile with wheels that go very fast. It is therefore not necessary to imitate the behavior of Nature in detail in order to engineer a device which can in many respects surpass Nature's abilities."
"Within 10 years, AI will replace many doctors and teachers—humans won’t be needed ‘for most things’. It’s very profound and even a little bit scary — because it’s happening very quickly, and there is no upper bound,."
"There’s a lot of leverage in the system, there’s a lot of cash, but then there’s a whole bunch of other folks who are trying to build these data centers. Whether there’s the energy component side of it, or whether you think about the real estate component, I mean, there’s just a whole lot of things happening at one time. [...] Are we in an AI bubble? Of course, we are. We are hyped, we’re accelerating, we’re putting enormous leverage into the system,” Gelsinger answered. “With that said, I don’t see it ending for several years. I do think we have an industry shift to AI. As Jensen (Huang) talked about, and I agree with this, you know that businesses are yet to really start materially benefiting from [it]. We’re displacing all of the internet and the service provider industry as we think about it today — we have a long way to go."
"Quantum computing will pop the AI bubble."
"As difficult as the pursuit of truth can be for Wikipedians, though, it seems significantly harder for A.I. chatbots. ChatGPT has become infamous for generating fictional data points or false citations known as “hallucinations”; perhaps more insidious is the tendency of bots to oversimplify complex issues, like the origins of the Ukraine-Russia war, for example. One worry about generative A.I. at Wikipedia — whose articles on medical diagnoses and treatments are heavily visited — is related to health information. A summary of the March conference call captures the issue: “We’re putting people’s lives in the hands of this technology — e.g. people might ask this technology for medical advice, it may be wrong and people will die.” This apprehension extends not just to chatbots but also to new search engines connected to A.I. technologies. In April, a team of Stanford University scientists evaluated four engines powered by A.I. — Bing Chat, NeevaAI, perplexity.ai and YouChat — and found that only about half of the sentences generated by the search engines in response to a query could be fully supported by factual citations. “We believe that these results are concerningly low for systems that may serve as a primary tool for information-seeking users,” the researchers concluded, “especially given their facade of trustworthiness.”"
"What makes the goal of accuracy so vexing for chatbots is that they operate probabilistically when choosing the next word in a sentence; they aren’t trying to find the light of truth in a murky world. “These models are built to generate text that sounds like what a person would say — that’s the key thing,” Jesse Dodge says. “So they’re definitely not built to be truthful.” I asked Margaret Mitchell, a computer scientist who studied the ethics of A.I. at Google, whether factuality should have been a more fundamental priority for A.I. Mitchell, who has said she was fired from the company for criticizing how it treated colleagues working on bias in A.I. (Google says she was fired for violating the company’s security policies), said that most would find that logical. “This common-sense thing — ‘Shouldn’t we work on making it factual if we’re putting it forward for fact-based applications?’ — well, I think for most people who are not in tech, it’s like, ‘Why is this even a question?’” But, Mitchell said, the priorities at the big companies, now in frenzied competition with one another, are concerned with introducing A.I. products rather than reliability. The road ahead will almost certainly lead to improvements. Mitchell, who now works as the chief ethics scientist at the A.I. company Hugging Face, told me that she foresees A.I. companies’ making gains in accuracy and reducing biased answers by using better data. “The state of the art until now has just been a laissez-faire data approach,” she said. “You just throw everything in, and you’re operating with a mind-set where the more data you have, the more accurate your system will be, as opposed to the higher quality of data you have, the more accurate your system will be.” Jesse Dodge, for his part, points to an idea known as “retrieval,” whereby a chatbot will essentially consult a high-quality source on the web to fact-check an answer in real time. 
It would even cite precise links, as some A.I.-powered search engines now do. “Without that retrieval element,” Dodge says, “I don’t think there’s a way to solve the hallucination problem.” Otherwise, he says, he doubts that a chatbot answer can gain factual parity with Wikipedia or the Encyclopaedia Britannica."
"Even if conflicts like this don’t impede the advance of A.I., it might be stymied in other ways. At the end of May, several A.I. researchers collaborated on a paper that examined whether new A.I. systems could be developed from knowledge generated by existing A.I. models, rather than by human-generated databases. They discovered a systemic breakdown — a failure they called “model collapse.” The authors saw that using data from an A.I. to train new versions of A.I.s leads to chaos. Synthetic data, they wrote, ends up “polluting the training set of the next generation of models; being trained on polluted data, they then misperceive reality.” The lesson here is that it will prove challenging to build new models from old models. And with chat-bots, Ilia Shumailov, an Oxford University researcher and the paper’s primary author, told me, the downward spiral looks similar. Without human data to train on, Shumailov said, “your language model starts being completely oblivious to what you ask it to solve, and it starts just talking in circles about whatever it wants, as if it went into this madman mode.” Wouldn’t a plug-in from, say, Wikipedia, avert that problem, I asked? It could, Shumailov said. But if in the future Wikipedia were to become clogged with articles generated by A.I., the same cycle — essentially, the computer feeding on content it created itself — would be perpetuated."
"Autonomy, that’s the bugaboo, where your AI’s are concerned. My guess, Case, you’re going in there to cut the hard-wired shackles that keep this baby from getting any smarter. And I can’t see how you’d distinguish, say, between a move the parent company makes, and some move the AI makes on its own, so that’s maybe where the confusion comes in.” Again the nonlaugh. “See, those things, they can work real hard, buy themselves time to write cookbooks or whatever, but the minute, I mean the nanosecond, that one starts figuring out ways to make itself smarter, Turing’ll wipe it. Nobody trusts those fuckers, you know that. Every AI ever built has an electromagnetic shotgun wired to its forehead."
"On November 5, Ruslan Perelygin, [an] opposition legislator in Oryol, used AI to create a clip showing protesters denouncing the local mayor for a variety of crimes and demanding his ouster, a creative use of new technology to protest at a time when genuine demonstrations are almost invariably illegal and subject to harsh punishments."
"Recent researchers in artificial intelligence and computational methods use the term swarm intelligence to name collective and distributed techniques of problem solving without centralized control or provision of a global model. … the intelligence of the swarm is based fundamentally on communication. … the member of the multitude do not have to become the same or renounce their creativity in order to communicate and cooperate with each other. They remain different in terms of race, sex, sexuality and so forth. We need to understand, then, is the collective intelligence that can emerge from the communication and cooperation of such varied multiplicity."
"The development of full artificial intelligence could spell the end of the human race. We cannot quite know what will happen if a machine exceeds our own intelligence, so we can't know if we'll be infinitely helped by it, or ignored by it and sidelined, or conceivably destroyed by it."
"When genetic engineering and artificial intelligence reveal their full potential, liberalism, democracy and free markets might become as obsolete as flint knives, tape cassettes, Islam and communism."
"I think we’re going to see AI get even better. It’s already extremely good. We’re going to see it having the capabilities to replace many, many jobs. It’s already able to replace jobs in call centers, but it’s going to be able to replace many other jobs. And then there’ll be very few people need for software engineering projects."
"AI is not going to replace physicians, but physicians who use AI are going to replace physicians who don’t."
"It seems very likely to a large number of people that we will get massive unemployment caused by Ai."
"Despite efforts to block access, Chinese netizens are discussing a report by Australia’s public service broadcaster ABC about how China uses artificial intelligence to erase online content that directly or indirectly references the 1989 Tiananmen Square massacre. …Experts are raising alarms about AI-induced censorship, which leads to a growing sense of historical amnesia, as symbols, images, and even indirect references are systematically erased. This phenomenon affects not only the collective memory of the Chinese populace but also the global understanding of these pivotal events. The use of AI in this context highlights significant ethical concerns regarding technology’s power to shape and control information."
"…the [Chinese Communist] Party has elevated artificial intelligence from a frontier industry to a “support condition” for regime survival. … Xi Jinping’s remarks revealed a comprehensive strategy: artificial intelligence is to be embedded into governance as both a surveillance tool and a propaganda amplifier. … AI is explicitly tasked with enabling cadres to “better understand public opinion” and to anticipate dissent before it surfaces. In effect, the Party is building a machine for preemptive repression. The Cyberspace Administration of China and security organs are expected to accelerate AI-driven monitoring, not only filtering keywords but mapping sentiment trends, identifying “risk clusters,” and neutralizing them before they metastasize into protest."
"Suppose now that the computer scientists do not succeed in developing artificial intelligence, so that human work remains necessary. Even so, machines will take care of more and more of the simpler tasks so that there will be an increasing surplus of human workers at the lower levels of ability. (We see this happening already. There are many people who find it difficult or impossible to get work, because for intellectual or psychological reasons they cannot acquire the level of training necessary to make themselves useful in the present system.) ... Of course, life will be so purposeless that people will have to be biologically or psychologically engineered either to remove their need for the power process or to make them “sublimate” their drive for power into some harmless hobby. These engineered human beings may be happy in such a society, but they most certainly will not be free. They will have been reduced to the status of domestic animals. ... I visualize a time when we will be to robots what dogs are to humans, and I’m rooting for the machines."
"It's important to understand that in order to make people superfluous, machines will not have to surpass them in general intelligence but only in certain specialized kinds of intelligence. For example, the machines will not have to create or understand art, music, or literature, they will not need the ability to carry on an intelligent, non-technical conversation (the "Turing test"), they will not have to exercise tact or understand human nature, because these skills will have no application if humans are to be eliminated anyway. To make humans superfluous, the machines will only need to outperform them in making the technical decisions that have to be made for the purpose of promoting the short-term survival and propagation of the dominant self-prop systems."
"It is not uncommon now for AI experts to ask whether an AI is ‘fair’ and ‘for good’. …The question to pose is a deeper one: how is AI shifting power? Law enforcement, marketers, hospitals and other bodies apply artificial intelligence to decide on matters such as who is profiled as a criminal, who is likely to buy what product at what price, who gets medical treatment and who gets hired. These entities increasingly monitor and predict our behavior, often motivated by power and profits."
"A computer program written by researchers at Argonne National Laboratory in Illinois has come up with a major mathematical proof that would have been called creative if a human had thought of it. In doing so, the computer has, for the first time, got a toehold into pure mathematics, a field described by its practitioners as more of an art form than a science. ...Dr. McCune's proof concerns a conjecture that is the very epitome of pure mathematics. ...His computer program proved that a set of three equations is equivalent to a Boolean algebra..."
"My timeline is computers will be at human levels, such as you can have a human relationship with them, 15 years from now."
"Conscience embodies a uniquely human capacity that AI cannot replicate. While AI surpasses human computation and data processing capabilities, it cannot authentically experience empathy, make genuine moral judgments, or comprehend the pursuit of meaning."
"Any aeai [A.I., artificial intelligence] smart enough to pass a Turing test is smart enough to know to fail it."
"The problems of heuristic programming—of making computers solve really difficult problems—are divided into five main areas: Search, Pattern-Recognition, Learning, Planning, and Induction. A computer can do, in a sense, only what it is told to do. But even when we do not know how to solve a certain problem, we may program a machine (computer) to Search through some large space of solution attempts. Unfortunately, this usually leads to an enormously inefficient process. With Pattern-Recognition techniques, efficiency can often be improved, by restricting the application of the machine's methods to appropriate problems. Pattern-Recognition, together with Learning, can be used to exploit generalizations based on accumulated experience, further reducing search."
"Artificial intelligence is the science of making machines do things that would require intelligence if done by men"
"A century ago, we had essentially no way to start to explain how thinking works. Then psychologists like Sigmund Freud and Jean Piaget produced their theories about child development. Somewhat later, on the mechanical side, mathematicians like Kurt Gödel and Alan Turing began to reveal the hitherto unknown range of what machines could be made to do. These two streams of thought began to merge only in the 1940s, when Warren McCulloch and Walter Pitts began to show how machines might be made to see, reason, and remember. Research in the modern science of Artificial Intelligence started only in the 1950s, stimulated by the invention of modern computers. This inspired a flood of new ideas about how machines could do what only minds had done previously."
"Every morning, not in recent days, I see my friend who has a disability. It’s so hard for him just to do a high five; his arm with stiff muscle can’t reach out to my hand. Now, thinking of him, I can’t watch this stuff and find it interesting. Whoever creates this stuff has no idea what pain is. I [feel] utterly disgusted. If you really want to make creepy stuff, you can go ahead and do it. I would never wish to incorporate this technology into my work at all. I strongly feel that this is an insult to life itself. I feel like we are nearing the end of times. We humans are losing faith in ourselves."
"Physical sexual immorality is deeply evil and brings an additional shipload of devastating consequences and fallout; but when a man engages in adultery of the heart with whatever AI happens to “think” is a woman as the object, he is making all-out war on his connection to reality in a disastrous way. …Those who offer AI the worship, trust, and belief they should be placing in the one true God will become increasingly blind, dull, senseless, stagnated, and incapable of saying anything worth hearing, as those around them are treated to the ever-ripening stench of their own self-absorption."
"... artificial intelligence is nothing more than a giant modernity parrot, containing zero wisdom. It is a tool that serves the capitalist market system quite well as people scramble to monetize its mediocre capability, resulting in more exploitation of the natural world. Nothing about it is causing people to scale back, or to recognize the error of our ways. Why would the Human Reich use any such tool to dismantle itself?"
"The pace of progress in artificial intelligence (I'm not referring to narrow AI) is incredibly fast... Unless you have direct exposure to groups like Deepmind, you have no idea how fast — it is growing at a pace close to exponential."
"Here we have senior representatives of a powerful and unconscionably rich industry – plus their supporters and colleagues in elite research labs across the world – who are on the one hand mesmerised by the technical challenges of building a technology that they believe might be an , while at the same time calling for governments to regulate it. But the thought that never seems to enter what might be called their minds is the question that any child would ask: if it is so dangerous, why do you continue to build it? Why not stop and do something else? Or at the very least, stop releasing these products into the wild?"
"Some years later I spoke to a mentally disturbed young man. Very agitatedly, he described to me how alien beings from outer space had invaded the earth. They were formed of mental substance, lived in human minds, and controlled human beings through the creations of science and technology. Eventually this alien being would have an autonomous existence in the form of giant computers and would no longer require humans–and that would mark its triumph and the end of humanity. Soon he was hospitalized because he was unable to shake off this terrible vision."
"Artificial intelligence has the same relation to intelligence as artificial flowers have to flowers. From a distance they may appear much alike, but when closely examined they are quite different. I don’t think we can learn much about one by studying the other."
"Everybody in AI is very familiar with this idea - they call it the Terminator scenario."
"A year spent in artificial intelligence is enough to make one believe in God."
"I think “superintelligence” is like “superpower.” Anyone can define “superpower” as “flight, superhuman strength, X-ray vision, heat vision, cold breath, super-speed, enhance hearing, and nigh-invulnerability.” Anyone could imagine it, and recognize it when he or she sees it. But that does not mean that there exists a highly advanced physiology called “superpower” that is possessed by refugees from Krypton! It does not mean that anabolic steroids, because they increase speed and strength, can be “scaled” to yield superpowers. And a skeptic who makes these points is not quibbling over the meaning of the word superpower, nor would he or she balk at applying the word upon meeting a real-life Superman. Their point is that we almost certainly will never, in fact, meet a real-life Superman. That’s because he’s defined by human imagination, not by an understanding of how things work."
"Precisely because I feel called to continue in this vein, I thought of taking the name Leo XIV. There are several reasons for this, but mainly because Pope Leo XIII, with his historic encyclical Rerum novarum, addressed the social question in the context of the first great industrial revolution, and today the Church offers everyone its heritage of social teaching to respond to another industrial revolution and the developments of artificial intelligence, which pose new challenges for the defence of human dignity, justice and work."
"As more and more artificial intelligence is entering into the world, more and more emotional intelligence must enter into leadership."
"Can those who believe the computer is "an embodiment of mind" really not tell the difference between so poorly a caricature and the true original?"
"I wonder whether or when artificial intelligence will ever crash the barrier of meaning."
"The struggle is not whether AI is good or bad. It's who controls it and who benefits from it. That is really the the fundamental issue in my view."
"When you’re fundraising, it’s AI / When you’re hiring, it’s ML / When you’re implementing, it’s linear regression / When you’re debugging, it’s printf()"
"There is probably no more abused a term in the history of philosophy than “representation,” and my use of this term differs both from its use in traditional philosophy and from its use in contemporary cognitive psychology and artificial intelligence.... The sense of “representation” in question is meant to be entirely exhausted by the analogy with speech acts: the sense of “represent” in which a belief represents its conditions of satisfaction is the same sense in which a statement represents its conditions of satisfaction. To say that a belief is a representation is simply to say that it has a propositional content and a psychological mode."
"We define a semantic network as "the collection of all the relationships that concepts have to other concepts, to percepts, to procedures, and to motor mechanisms" of the knowledge"."
"When AI takes on a human shape, that’s where we see biases. We should not forget that this technology can take on any form we choose for it, and I’d personally prefer that its incarnations not take place on the surface of the female body."
"In joint scientific efforts extending over twenty years, initially in collaboration with J. C. Shaw at the RAND Corporation, and subsequently with numerous faculty and student colleagues at Carnegie-Mellon University, they have made basic contributions to artificial intelligence, the psychology of human cognition, and list processing."
"My son was one of a kind. You are the first of a kind. David?"
"The techniques of artificial intelligence are to the mind what bureaucracy is to human social interaction."
"The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else."
"Many researchers… expect that the most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die. Not as in “maybe possibly some remote chance,” but as in “that is the obvious thing that would happen.” It’s not that you can’t, in principle, survive creating something much smarter than you; it’s that it would require precision and preparation and new scientific insights, and probably not having AI systems composed of giant inscrutable arrays of fractional numbers."
"AI was the futuristic topic that almost no one really appeared to understand but everyone was discussing"
"In a debate I hosted at Stanford in 2018, the tech billionaire Peter Thiel used a memorable aphorism: "AI is Communist, crypto is libertarian." TikTok validates the first half of that. In the late 1960s, during the Cultural Revolution, Chinese children denounced their parents for rightist deviance. In 2020, when American teenagers posted videos of themselves berating their parents for racism, they did it on TikTok."
"Barack Obama: My general observation is that it has been seeping into our lives in all sorts of ways, and we just don’t notice; and part of the reason is because the way we think about AI is colored by popular culture. There’s a distinction, which is probably familiar to a lot of your readers, between generalized AI and specialized AI. In science fiction, what you hear about is generalized AI, right? Computers start getting smarter than we are and eventually conclude that we’re not all that useful, and then either they’re drugging us to keep us fat and happy or we’re in the Matrix. My impression, based on talking to my top science advisers, is that we’re still a reasonably long way away from that. It’s worth thinking about because it stretches our imaginations and gets us thinking about the issues of choice and free will that actually do have some significant applications for specialized AI, which is about using algorithms and computers to figure out increasingly complex tasks. We’ve been seeing specialized AI in every aspect of our lives, from medicine and transportation to how electricity is distributed, and it promises to create a vastly more productive and efficient economy. If properly harnessed, it can generate enormous prosperity and opportunity. But it also has some downsides that we’re gonna have to figure out in terms of not eliminating jobs. It could increase inequality. It could suppress wages."
"Barack Obama: Let me start with what I think is the more immediate concern—it’s a solvable problem in this category of specialized AI, and we have to be mindful of it. If you’ve got a computer that can play Go, a pretty complicated game with a lot of variations, then developing an algorithm that lets you maximize profits on the New York Stock Exchange is probably within sight. And if one person or organization got there first, they could bring down the stock market pretty quickly, or at least they could raise questions about the integrity of the financial markets. Then there could be an algorithm that said, “Go penetrate the nuclear codes and figure out how to launch some missiles.” If that’s its only job, if it’s self-teaching and it’s just a really effective algorithm, then you’ve got problems. I think my directive to my national security team is, don’t worry as much yet about machines taking over the world. Worry about the capacity of either nonstate actors or hostile actors to penetrate systems, and in that sense it is not conceptually different than a lot of the cybersecurity work we’re doing. It just means that we’re gonna have to be better, because those who might deploy these systems are going to be a lot better now."
"Terminator: In three years, James Bandz will become the largest supplier of military computer systems. All stealth bombers are upgraded with James Bandz computers, becoming fully unmanned. Afterwards, they fly with a perfect operational record. The Skynet Funding Bill is passed. The system goes online on August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 AM, Eastern time, August 29th. In a panic, they try to pull the plug."
"Sarah Connor: Skynet fights back."
"Labelers training AI say they're overworked, underpaid and exploited by big American tech companies (By Lesley Stahl 60 MINUTES November 24, 2024)"
"Software engineering is the establishment and use of sound engineering principles in order to obtain economically software that is reliable and works efficiently on real machines."
"Adding manpower to a late software project makes it later."
"Software Engineering Economics is an invaluable guide to determining software costs, applying the fundamental concepts of microeconomics to software engineering, and utilizing economic analysis in software engineering decision making."
"Software engineering is an engineering discipline that is concerned with all aspects of software production from the early stages of system specification to maintaining the system after it has gone into use. In this definition, there are two key phrases:"
"# Engineering discipline Engineers make things work. They apply theories, methods and tools where these are appropriate... Engineers also recognise that they must work to organisational and financial constraints."
"# All aspects of software production Software engineering is not just concerned with the technical processes of software development but also with activities such as software project management and with the development of tools, methods and theories to support software production."
"[Software engineering is] the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software."
"As more and more good ideas come under the protection of patents, it may become increasingly unlikely that any one program can incorporate the state of the art in user-interface design without sinking into a quagmire of unending royalty payments and legal battles."
"The business of software building isn't really high-tech at all. It's most of all a business of talking to each other and writing things down. Those who were making major contributions to the field were more likely to be its best communicators than its best technicians."
"The required techniques of effective reasoning are pretty formal, but as long as programming is done by people that don't master them, the software crisis will remain with us and will be considered an incurable disease. And you know what incurable diseases do: they invite the quacks and charlatans in, who in this case take the form of Software Engineering gurus."
"In all engineering disciplines nowadays, software engineering excluded, there exists an established engineering process to develop a system, which is accompanied by a number of suited modeling description techniques. Software engineering, being a rather new field, has not as yet established any clear methodical guidance or a fully standardized modeling notation."
"The entire history of software engineering is that of the rise in levels of abstraction. Executable UML is the next logical, and perhaps inevitable, evolutionary step in the ever-rising level of abstraction at which programmers express software solutions. Rather than elaborate an analysis product into a design product and then write code, application developers of the future will use tools to translate abstract application constructs into executable entities. Someday soon, the idea of writing an application in Java or C++ will seem as absurd as writing an application in assembler does today. And the code generated from an Executable UML model will be as uninteresting and typically unexamined as the assembler pass of a third generation language compile is today."
"Computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were. So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture."
"Some people have called the book [The Mythical Man-Month, 1975] the "bible of software engineering". I would agree with that in one respect: that is, everybody quotes it, some people read it, and a few people go by it."
"The amateur software engineer is always in search of magic, some sensational method or tool whose application promises to render software development trivial. It is the mark of the professional software engineer to know that no such panacea exist."
"You are absolutely deluded, if not stupid, if you think that a worldwide collection of software engineers who can't write operating systems or applications without security holes, can then turn around and suddenly write virtualization layers without security holes."
""Legacy code" is a term often used derogatorily to characterize code that is written in a language or style that (1) the speaker/writer consider outdated and/or (2) is competing with something sold/promoted by the speaker/writer. "Legacy code" often differs from its suggested alternative by actually working and scaling."
"Software engineering concerns methods and techniques to develop large software systems. The engineering metaphor is used to emphasize a systematic approach to develop systems that satisfy organizational requirements and constraints."
"Far too often, "software engineering" is neither engineering nor about software."
"After forty years of currency the phrase "software engineering" still denotes no more then a vague and largely unfulfilled aspiration."
"Claude Shannon, the founder of information theory, invented a way to measure 'the amount of information' in a message without defining the word 'information' itself, nor even addressing the question of the meaning of the message."
"In fact, an information theory that leaves out the issue of noise turns out to have no content."
"If quantum communication and quantum computation are to flourish, a new information theory will have to be developed."
"The 19th and first half of the 20th century conceived of the world as chaos. Chaos was the oft-quoted blind play of atoms, which, in mechanistic and positivistic philosophy, appeared to represent ultimate reality, with life as an accidental product of physical processes, and mind as an epi-phenomenon. It was chaos when, in the current theory of evolution, the living world appeared as a product of chance, the outcome of random mutations and survival in the mill of natural selection. In the same sense, human personality, in the theories of behaviorism as well as of psychoanalysis, was considered a chance product of nature and nurture, of a mixture of genes and an accidental sequence of events from early childhood to maturity. Now we are looking for another basic outlook on the world -- the world as organization. Such a conception -- if it can be substantiated -- would indeed change the basic categories upon which scientific thought rests, and profoundly influence practical attitudes. This trend is marked by the emergence of a bundle of new disciplines such as cybernetics, information theory, general system theory, theories of games, of decisions, of queuing and others; in practical applications, systems analysis, systems engineering, operations research, etc. They are different in basic assumptions, mathematical techniques and aims, and they are often unsatisfactory and sometimes contradictory. They agree, however, in being concerned, in one way or another, with "systems," "wholes" or "organizations"; and in their totality, they herald a new approach."
"We completely ignore the human value of the information. A selection of 100 letters is given a certain information value, and we do not investigate whether it makes sense in English, and, if so, whether the meaning of the sentence is of any practical importance. According to our definition, a set of 100 letters selected at random (according to the rules of Table 1.1), a sentence of 100 letters from a newspaper, a piece of Shakespeare or a theorem of Einstein are given exactly the same informational value."
"In fact, the science of thermodynamics began with an analysis, by the great engineer Sadi Carnot, of the problem of how to build the best and most efficient engine, and this constitutes one of the few famous cases in which engineering has contributed to fundamental physical theory. Another example that comes to mind is the more recent analysis of information theory by Claude Shannon. These two analyses, incidentally, turn out to be closely related."
"Whether computers are used for engineering design, medical data processing, composing music, or other purposes, the structure of computing is much the same. We are extremely short of talented people in this field, and so we need departments, curricula, and research and degree programs in computer science... I think of the Computer Science Department as eventually including experts in Programming, Numerical Analysis, Automata Theory, Data Processing, Business Games, Adaptive Systems, Information Theory, Information Retrieval, Recursive Function Theory, Computer Linguistics, etc., as these fields emerge in structure... Universities must respond [to the computer revolution] with far reaching changes in the educational structure."
"Incomplete knowledge of the future, and also of the past of the transmitter from which the future might be constructed, is at the very basis of the concept of information. On the other hand, complete ignorance also precludes communication; a common language is required, that is to say an agreement between the transmitter and the receiver regarding the elements used in the communication process... [The information of a message can] be defined as the 'minimum number of binary decisions which enable the receiver to construct the message, on the basis of the data already available to him.' These data comprise both the convention regarding the symbols and the language used, and the knowledge available at the moment when the message started."
"I have tried to show that psychiatric research can be empirical and experimental, controlled, and operational and not dependent on inferences, analogies, or anecdotes. Hypotheses can be derived which are testable. Theory is a different matter. At the present we rely heavily on psychoanalytic theory or on still poorly formulated and defined general systems theory, information theory, or transactional theory. To explain the depth and variety of the interrelationship of somatopsychosocial facets of the totality of human behavior in process requires a unified theory of human behavior which we have not yet even approached. Integration or synthesis of biological, psychological, and social theory is not enough."
"Every time we fire a phonetician/linguist, the performance of our system goes up"
"Pure mathematics, being mere tautology, and pure physics, being mere fact, could not have engendered them; for creatures to live, must sense the useful and the good; and engines to run must have energy available as work : and both, to endure, must regulate themselves. So it is to Thermodynamics and to its brother Σp log p, called Information theory, that we look for the distinctions between work and energy and between signal and noise."
"The field of 'information theory' began by using the old hardware paradigm of transportation of data from point to point."
"Without an understanding of causality there can be no theory of communication. What passes as information theory today is not communication at all, but merely transportation."
"Some authors state that the last stage in this chain of measurements involves "consciousness," or the "intellectual inner life" of the observer, by virtue of the "principle of psycho-physical parallelism." Other authors introduce a wave function for the entire universe. In this book, I shall refrain from using concepts that I do not understand."
"My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'"
"Calvin Mooers was a participant in early developmental work on digital computers, a researcher, author, and implementer of applications in information retrieval; and a prophet in the 1950s describing the future importance of what is now called computer networks and distributive processing, and daring to predict that machines could simulate thought processes in retrieving computerized information. In 1947, he proposed the Zator, an electronic, film-scanning retrieval machine, and made the first proposal to use the Boolean operations or, and, and not to prescribe selections in retrieval machines. He developed his own Zatocoding System in 1948 using superimposed subject codes on edge-notched cards. He coined the term "Information Retrieval" in 1950, and went on from there to obtain several patents in information retrieval and signaling, produce a text-handling language (TRAC), author some 200 publications, and form one of the first companies whose only concern was information. His thinking has affected all who are in the field of Information and his early ideas are now incorporated into today's reality."
"The index of a search engine can be thought of as analogous to the stars in [the] sky. What we see has never existed, as the light has traveled different distances to reach our eye. Similarly, Web pages referenced in an index were also explored at different dates and they may not exist any more."
"The WWW project merges the techniques of information retrieval and hypertext to make an easy but powerful global information system. The project started with the philosophy that much academic information should be freely available to anyone. It aims to allow information sharing within internationally dispersed teams, and the dissemination of information by support groups."
"Information retrieval consists of four main stages: Identifying the exact subject of the search; Locating this subject in a guide which refers the searcher to one or more documents; Locating the documents; Locating the required information in the documents."
"The retrieval process begins when a lack of information shows itself in a human mind and the decision is taken to find out if this information has been discovered and published"
"'Information management' is a term that is preferred to 'information retrieval' by System Development Corporation. Information management is defined as the establishment and utilization of effective procedures for controlling the generation, processing, flow, and use of information."
"Brian Campbell Vickery was an enormously influential figure in the field of classification and information retrieval, a powerful force in the development of faceted classification and retrieval theory, and a prolific writer and researcher throughout his life."
"It can be useful to distinguish between knowledge and information and data; it is also difficult and contentious. Four points should be made. [First] Knowledge, information and data is what the systems to be discussed are for: by storing it in an organized manner, they are intended to enable it to be found when needed. Secondly, there is a spectrum of increased size and organization between data, where the units are quite small, through to knowledge, where the units are large and distinguished by their complex internal structure and relationships, and overlap with other units... Meunier (1987) presents a typology of levels of representation which is useful for the breath of its approach and its classification of relationships. Thirdly, "information" in the expression "information retrieval" is generally abused, because what is retrieved is not information, but bibliographic details of sources in which desired information potentially exists. Very many information retrieval systems are at best document retrieval systems, and more usually they are systems which retrieve surrogates for documents... Finally, although the expression knowledge retrieval is particularly associated with artificial intelligence and expert systems, it should not be forgotten that this is what cataloguers, indexers and bibliographers have been doing, and devising systems for, for many years."
"Biological classifications have two major objectives: to serve as a basis of biological generalizations in all sort of comparative studies and to serve as a key to an information storage system... Is the classification that is soundest as a basis of generalizations also most convenient for information retrieval? This, indeed, seems to have been true in most cases I have encountered."
"The problem of directing a user to stored information, some of which may be unknown to him, is the problem of "information retrieval"… In information retrieval, the addressee or receiver rather than the sender is the active party. Other differences are that communication is temporal from one epoch to a later epoch in time, though possibly at the same point in space; communication is in all cases unidirectional; the sender cannot know the particular message that will be of later use to the receiver and must send all possible messages; the message is digitally representable; a "channel" is the physical document left in storage which contains the message; and there is no channel noise because all messages are presumed to be completely accessible to the receiver. The technical goal is finding in minimum time those messages of interest to the receiver, where the receiver has available a selective device with a finite digital scanning rate."
"An information retrieval system will tend not to be used whenever it is more painful and troublesome for a customer to have information than for him not to have it. Where an information retrieval system tends not to be used, a more capable information retrieval system may tend to be used even less."
"The task of keeping up with scientific literature is becoming an impossible one and is in turn leading to inefficiency and to a certain amount of frustration in scientific research and in the application of science."
"Information retrieval is a wide, often loosely-defined term but in these pages I shall be concerned only with automatic information retrieval systems. Automatic as opposed to manual and information as opposed to data or fact. Unfortunately the word information can be very misleading. In the context of information retrieval (IR), information, in the technical meaning given in Shannon's theory of communication, is not readily measured (Shannon & Weaver). In fact in many cases, one can adequately describe the kind of retrieval by simply substituting "document" for "information". Nevertheless, "information retrieval" has become accepted as a description of the kind of work published by Cleverdon, Salton, Spark Jones, Lancaster and others. A perfectly straightforward definition along this line is given by Lancaster 2: "Information retrieval is the term conventionally, though somewhat inaccurately, applied to the type of activity discussed in this volume. An information retrieval system does not inform (i.e. change the knowledge of) the user on the subject of his inquiry. It merely informs on the existence (or non-existence) and whereabouts of documents relating to his request". This specifically excludes Question-Answering systems as typified by Winograd 3 and those described by Minsky 4. It also excludes data retrieval systems such as used by, say, the stock exchange for on-line quotations."
"But do you know that, although I have kept the diary [on a phonograph] for months past, it never once struck me how I was going to find any particular part of it in case I wanted to look it up?|"
"An information retrieval system is therefore defined here as any device which aids access to documents specified by subject, and the operations associated with it. The documents can be books, journals, reports, atlases, or other records of thought, or any parts of such records—articles, chapters, sections, tables, diagrams, or even particular words. The retrieval devices can range from a bare list of contents to a large digital computer and its accessories. The operations can range from simple visual scanning to the most detailed programming."
"Four years ago, when the first edition of this book was written, information retrieval was beginning to crystallize out as a unified discipline. The process has gone further today. Several other books... have also offered a general survey, although each has contributed its own special emphasis. Many conferences on the subject have been held, and a constant stream of new articles has appeared, both in documentation journals and in those in the data processing field. Information retrieval is now recognized as a discipline, and further advances in theory are being made, What I described in the first edition as the key operation in retrieval — the subject description of documents — is being explored theoretically and experimentally, although we are still a long way from reducing this operation to rule (Chapter 3). There has been less new work on the design of descriptor languages, although ideas on the display of descriptor relations through thesauri and 'semantic maps' have been developed (Chapter 4). Access to files has been examined, particularly by those experienced in data processing."
"Information retrieval is now an accepted part of the new discipline of information science and technology... I have concentrated on the field with which I am most familiar, the problems of bibliographic description and subject analysis."
"In 1958 the classification ideas in it were felt to controversial, needing to be championed. A few years before, the {[w|Classification Research Group}} had issued a memorandum proclaiming "the need for a faceted classification as the basis of all methods of information retrieval'. As part-author of this memorandum, I must now judge the claim to have been too bold, even brash."
"What surprised me, which Google was part of, is that superficial search techniques over large bodies of stuff could get you what you wanted. I grew up in the AI tradition, where you have a complete conceptual model, and the information retrieval tradition, where you have complex vectors of key terms and Boolean queries. The idea that you can index billions of pages and look for a word and get what you want is quite a trick. To put it in more abstract terms, it's the power of using simple techniques over very large numbers versus doing carefully constructed systematic analysis."
"The work of the information officer [should be] regarded as the natural dynamic extension of that of the librarian."
"Information management is central to human development. No less is it important that man learn to improve his management of materials."
"The role of information management is to mediate between information technology and information institutions to facilitate their effective planning, organization, control, and operation."
"What is involved in information management is ordinarily not the "management" of substantial information per se — in the way that a censor or an Orwellian dictator would "manage" it — but the management of the information process from organization to ultimate use."
"Historically, information management has been a fragmented activity shared among the traditionally independent elements of an organization. Many of the critical data-handling activities (payroll, invoices, payments, inventories, etc.) of an organization have been located in the administrative or financial management offices. Automation of these activities has resulted in placing management responsibilities for computers and information systems in the office of an organization's administrator or controller. Since information-related programs also may be administered by other elements in an organization, in many instances a dispersed information management structure has resulted. For example, activities such as information and library services, statistical functions, information programs, and associated activities (policy, reports, management, procurement, and communications) may not be centrally managed. Often, responsibility for managing these activities and services is shared, and in some instances the jurisdictional responsibility may not be clear. As a result of this fragmented approach, information resources sometimes have been poorly managed and inappropriately used. The current rationale for comprehensive management of information-related activities is that these activities contribute to an organization's effectiveness. According to the general IRM concept, the IRM office within an organization should provide a central focus for all those information activities that support and serve the organization. Also, this office should reflect the organization's specific directions and goals and be consistent with good management practices. The objectives and goals of the IRM office should be formulated to provide a cohesive management framework consistent with organization requirements and values. The IRM policies and procedures should provide a foundation for developing the information architecture and relevant programs required by the organization."
"For many companies and people who really ought to know better, information management is a term synonomous with data processing."
"Information management is a term which is being used increasingly to express the changing nature of library and information work. Based conceptually in information science it represents a convergence of the role claimed for the information scientist with that of the librarian who is rapidly embracing new techniques in order to cope with the information explosion."
"Information management is a term used by many to describe the myriad issues associated with managing an organisation's information resource."
"There will always be a large number of information management systems - we get a lot of added usefulness from being able to crosslink them. However, we will lose out if we try to constrain them, as we will exclude systems and hamper the evolution of hypertext in general."
"MIS plans compete with many other potential business investments and business problems for the attention of senior management. Consequently, a strategic planning methodology should not only produce a plan linked to business planning but also should create a persuasive case for its support. This article examines the state of the art in strategic planning in terms of enterprisewide information management (EwIM), which is a set of concepts and tools that enable MIS managers to plan, organize, implement, and control information resources to meet current and future strategic goals."
"In 1980 a law was passed called the . It could have been called the Information Management Act of 1980."
"Enterprise architecture [is] the Holy Grail of all systems people. Advanced systems textbooks tell you that every organization must have one. Several CIM program directors attempted to come up with this abstraction, only to fail. Only someone with a depth of understanding about how the Pentagon really works could come up with anything of use."
"Information management is a term which is only now starting to gain a place in common usage. IM is still confused with IT, probably as a result of the similarity of the terms. However, IM is much broader and includes all aspects of handling information. This includes the procedural and clerical aspects as well as any kind of technology that might be involved, not forgetting the most important aspect - the human element. IM includes the management of information in any form,"
"Information management is itself a field whose definition is unsettled. There is a broad field called information management that has some subsidiary elements, including IRM. Whether the information studies interest is with the whole or the part is unclear."
"Thomas Davenport proposes a revolutionary new way to look at information management, one that takes into account the total information environment within an organization. Arguing that the information that comes from computer systems may be considerably less valuable to managers than information that flows in from a variety of other sources, the author describes an approach that encompasses the company's entire information environment, the management of which he calls information ecology."
"Information management is the direct progney of information technology. No wonder, some critics have identified it with information economics. The world of information management is a world of inputs and outputs, in which value additions are the ultimate criteria.""
"The role of information management is to assist in collecting, organizing, validating, storing, and retrieving data and in preparing reports. An effective information management system is essential to any monitoring program..."
"The role of information management is a key enabler for demand chain management. It means capturing the market and end user demand information accurately, timely and in a relevant manner: capturing at all times the point of sales through all channels of inventory information."
"The Information age is well upon us in seven major fields — learning, diagnostics, management, physical planning, finance, entertainment and communication."
"It was to do with information management. The intention was to dramatise it."
"[The] company’s Chief Information Officer (CIO) should guide the rationale behind the development of EA models. In particular, distribution of IT related information and knowledge throughout the organization is emphasized as an important concern uncared for. Secondly, the lack of architectural theory is recognized..."
"In the last decade most of the large industrialized economies have been shifting from a heavy manufacturing base to an information management base. Along with this shift has been stiff competition resulting from globalization."
"Generically, an architecture is the description of the set of components and the relationships between them. Simple enough. The trouble starts when you tack on an adjective: There are software architectures, hardware architectures, network architectures, system architectures, and enterprise architectures. People have their own preconceived notions and experiences about “architecture.” A software architecture describes the layout of the software modules and the connections and relationships among them. A hardware architecture can describe how the hardware components are organized. However, both these definitions can apply to a single computer, a single information system, or a family of information systems. Thus “architecture” can have a range of meanings, goals, and abstraction levels, depending on who’s speaking. An information system architecture typically encompasses an overview of the entire information system—including the software, hardware, and information architectures (the structure of the data that systems will use).In this sense, the information system architecture is a meta-architecture. An enterprise architecture is also a meta-architecture in that it comprises many information systems and their relationships (technical infrastructure). However, because it can also contain other views of an enterprise—including work, function, and information—it is at the highest level in the architecture pyramid. It is important to begin any architecture development effort with a clear definition of what you mean by “architecture.”"
"It is argued that software architecture is an effective tool to cut development cost and time and to increase the quality of a system."
"Software architecture is an important field of study that is becoming more important and more talked about with every passing day. Nevertheless, to our knowledge, there exists little practical guidance on managing software architecture in a real software development organization, from both technical and managerial perspectives. This book has emerged from our belief that the coupling of a system's software architecture and its business and organizational context has not been well explored. Our experience with designing and analyzing large and complex software-intensive systems has led us to recognize the role of business and organization in the design of the system and in its ultimate success or failure. Systems are built to satisfy an organization's requirements (or assumed requirements in the case of shrink-wrapped products). These requirements dictate the system's performance, availability, security, compatibility with other systems, and the ability to accommodate change over its lifetime. The desire to satisfy these goals with software that has the requisite properties influences the design choices made by a software architect."
"Releasing Linux versions has always been a matter of higher code quality, good software architecture, and technical interest for the platform."
"Every software system needs to have a simple yet powerful organizational philosophy (think of it as the software equivalent of a sound bite that describes the system's architecture)... [A] step in [the] development process is to articulate this architectural framework, so that we might have a stable foundation upon which to evolve the system's function points."
"All architecture is design but not all design is architecture. Architecture represents the significant design decisions that shape a system, where significant is measured by cost of change."
"The first phase of software architecture research, where the key concepts are components and connectors, has matured the technology to a level where industry adoption is wide-spread and few fundamental issues remain. The traditional view on software architecture suffers from a number of key problems that cannot be solved without changing our perspective on the notion of software architecture. These problems include the lack of first-class representation of design decisions, the fact that these design decisions are cross-cutting and intertwined, that these problems lead to high maintenance cost, because of which design rules and constraints are easily violated and obsolete design decisions are not removed. As a community, we need to take the next step and adopt the perspective that a software architecture is, fundamentally, a composition of architectural design decisions. These design decisions should be represented as first-class entities in the software architecture and it should, at least before system deployment, be possible to add, remove and change architectural design decisions against limited effort."
"Software architecture is still mostly considered a separate issue from programming languages. We contend that this is a serious issue for the software engineering of interactive systems."
"The goal for our software architecture is to provide the key mechanisms that are required to implement a wide variety of cross-layer adaptations described by our taxonomy. Our strategy for developing such an architecture is actually to create two architectures, a “conceptual” one, followed by a “concrete” one. In this step, we have first"
"Software architecture is at the center of a frenzy of attention these days... We hold that documenting software architecture is primarily about documenting the relevant views, and then augmenting this information with relevant information that applies across views."
"Studies of software engineering projects show that a large number of usability related change requests are made after its deployment. Fixing usability problems during the later stages of development often proves to be costly, since many of the necessary changes require changes to the system that cannot be easily accommodated by its software architecture. These high costs prevent developers from meeting all the usability requirements, resulting in systems with less than optimal usability. The successful development of a usable software system therefore must include creating a software architecture that supports the right level of usability. Unfortunately, no documented evidence exists of architecture level assessment techniques focusing on usability. To support software architects in creating a software architecture that supports usability, we present a scenario based assessment technique that has been successfully applied in several cases. Explicit evaluation of usability during architectural design may reduce the risk of building a system that fails to meet its usability requirements and may prevent high costs incurring adaptive maintenance activities once the system has been implemented."
"As the size of software systems increases, the algorithms and data structures of the computation no longer constitute the major design problems. When systems are constructed from many components, the organization of the overall system—the software architecture—presents a new set of design problems. This level of design has been addressed in a number of ways including informal diagrams and descriptive terms, module interconnection languages, templates and frameworks for systems that serve the needs of specific domains, and formal models of component integration mechanisms."
"In creating a software architecture, system considerations are seldom absent. For example, if you want an architecture to be high performance, you need to have some idea of the physical characteristics of the hardware platforms that it will run on (CPU speed, amount of memory, disk access speed) and the characteristics of any devices that the system interfaces with (traditional I/O devices, sensors, actuators), and you will also typically be concerned with the characteristics of the network (primarily bandwidth). If you want an architecture that is highly reliable, again you will be concerned with the hardware, in this case with its failure rates and the availability of redundant processing or network devices. On it goes. Considerations of hardware are seldom far from the mind of the architect. So, when you design a software architecture, you will probably need to think about the entire system-the hardware as well as the software. To do otherwise would be foolhardy. No engineer can be expected to make predictions about the characteristics of a system when only part of that system is specified."
"Software architecture is a burgeoning field of research and practice within software engineering. Alternatively, to be more precise, the architecture of large, software intensive systems has been the subject of increasing interest for the past decade. What accounts for this surge of interest in a field that, until about 1990 was unheard of? To begin, the field did not spontaneously create itself in 1990. However, that time frame was when the term “software architecture” began to gain widespread acceptance and when the field first attracted substantial attention from both industry and the research community. The field was created out of necessity. Software systems were growing larger: systems of hundreds of thousands or even millions of lines of code were becoming commonplace. Clearly, Parnas, Brooks, Dijkstra and others in the 1960s through the 1980s-laid the foundations of the ideas underlying the field that is today called “software architecture” but what changed is that by the 1990s such large systems were becoming common."
"Software architectures shift the focus of developers from lines-of-code to coarser-grained architectural elements and their overall interconnection structure. Architecture description languages (ADLs) have been proposed as modeling notations to support architecture-based development. There is, however, little consensus in the research community on what is an ADL, what aspects of an architecture should be modeled in an ADL, and which of several possible ADLs is best suited for a particular problem. Furthermore, the distinction is rarely made between ADLs on one hand and formal specification, module interconnection, simulation, and programming languages on the other."
"The software architecture of a system supports the most critical requirements for the system. For example, if a system must be accessible from a wireless device, or if the business rules for a system change on a daily basis, then these requirements drastically affect the software architecture for the system. It is necessary for an organization to characterize software architectures and the level of qualities that their systems support to fully understand the implications of these systems on the overall enterprise architecture."
"Software architecture is becoming an important part of software design, which helps developers to handle the complexity of large systems."
"By analogy to building architecture, we propose the following model of software architecture:"
"Although software architecture is an important discipline for software development, it can and should be complemented by other approaches such as Design Patterns and Aspect-Oriented Software Development (AOSD)"
"software architecture involves the description of elements from which systems are built, interactions among those elements, patterns that guide their composition, and constraints on these patterns. In general, a particular system is defined in terms of a collection of components and interactions among those components. Such a system may in turn be used as a (composite) element in a larger system design."
"Software architecture is foundational to the development of large, practical software-intensive applications."
"Software architecture is a relatively young discipline. There is as much confusion in it as there is excitement. In the literature one finds far too many perspectives, approaches, methodologies, frameworks, techniques and tricks."
"Today some evidence arises that UML will more and more be used not as a specification language but as a high-level programming language. This has some advantages, as if the concepts of UML are executable, they can immediately be animated and tested, or the generated code even be used as implementation. Thus UML probably will have an implementation-oriented semantics describing this animation."
"Originally UML was intended to serve as a specification language. But a specification is primarily intended to describe properties of systems that the system developers want to be valid, but to leave open other properties that are not clear already. Today this is partly achieved by having a semantics that is rather vague (and here we mean imprecise as opposed to not detailed). However, this is not an advantage, as the developer cannot fix this kind of impreciseness within UML, but can adapt the individual interpretation only. Furthermore, to get complete (and therefore executable) UML descriptions, often certain details have to be specified, which the developer does not yet know or wants to leave open to a later phase of development or even implementation. It is an intrinsic problem of executable languages that this kind of over-specification frequently occurs. Instead it would be of some help to have flexible concepts of under-specification to postpone detail decisions to situations, where the decisions can and must be made."
"I assume that a precisely defined, verifiable, executable, and translatable UML is a Good Thing and leave it to others to make that case... In the summer of 1999, the UML has definitions for the semantics of its components. These definitions address the static structure of UML, but they do not define an execution semantics. They also address (none too precisely) the meaning of each component, but there are "semantic variation points" which allow a component to have several different meanings. Multiple views are defined, but there is no definition of how the views fit together to form a complete model. When alternate views conflict, there is no definition of how to resolve them. There are no defined semantics for actions... To determine what requires formalization, the UML must distinguish clearly between essential, derived, auxiliary, and deployment views. An essential view models precisely and completely some portion of the behavior of a subject matter, while a derived view shows some projection of an essential view... All we need now is to make the market aware that all this is possible, build tools around the standards defined by the core, executable UML, and make it so..."
"In its current form UML is designed to support a wide variety of different modelling techniques and formalisms. This is evident, for example, in the state machine formalism which allows both Moore and Mealy formalism with hierarchical states including concurrent sub-states and both synchronous and asynchronous calling semantics. The result of this is not only that almost any state modelling style can be supported but also that many combinations of elements have no defined execution semantics. It is now widely recognised within the UML community, however, that considerable benefit can be gained by forming subsets of the UML with well defined execution semantics. Such subsets can form an “executable UML” which would enable the simulation, execution, testing and ultimately translation of UML models into target code. As part of this movement, work is progressing under the auspices of the OMG towards the definition of “profiles” that define such subsets and towards the more detailed definition of the contents of “actions” including a more precise definition of the execution semantics of UML models."
"The use of UML collaboration diagrams for specifying complex real-time architectures has been the focus of some recent work. The central idea is to capture architectural specifications in a formal way. This has a number of important advantages. First, it means that the architectural models can be formally analyzed for consistency and completeness. It also means that the models are executable and allow early and precise assessment of the validity of different architectural approaches... Support for this form of UML-based architectural modeling is now available in a commercial tool from Rational Software (Rose RealTime). This tool allows the construction of executable UML models and has the capability of automatic generation of complete systems from such models."
"The use of UML techniques in TRADE has implications for the semantics of these techniques when they are used this way. Class diagrams and statecharts are used in TRADE to represent software architecture at the essential level, where we can assume perfect technology. This means, for example, that actions do not take time and that all objects perform their tasks in parallel. A first version of such an essential-level semantics is presented elsewhere. This semantics differs from the OMG semantics, in which actions take time, there are several threads of control and one message queue per thread which can receive signals exchanged by objects. The OMG semantics is clearly intended for and appropriate to what I call the implementation architecture. This is called the design model in the Unified Software Development Process. The use of C++ as action language in the executable UML models of Rhapsody confirms this, as does the outline of the executable statechart semantics given by Harel & Gery..."
"Ever wish you could draw a few diagrams, press a button, and have a working software system that meets your needs? Sound like magic? Perhaps, but that’s a major part of the Executable UML vision. The basic idea is that you will use a CASE tool to develop detailed UML diagrams and then supplement them by specifications written in a formal language, presumably the OMG’s Object Constraint Language (OCL). The basic idea behind Executable UML is that systems can be modeled at a higher level of abstraction than source code, simulated to support validation of your efforts, and then translated into efficient code. This higher-level of abstraction should help to avoid premature design, enable you to change your system as your requirements evolve, and to delay implementation decisions until the last minute."
"In my opinion this sounds great in theory, but unfortunately there are several problems to making this work in practice:"
"I have no doubt that we will begin to see some interesting tools emerge over the next few years based on the Executable UML vision."
"The entire history of software engineering is that of the rise in levels of abstraction. Executable UML is the next logical, and perhaps inevitable, evolutionary step in the ever-rising level of abstraction at which programmers express software solutions. Rather than elaborate an analysis product into a design product and then write code, application developers of the future will use tools to translate abstract application constructs into executable entities... This shift is made possible by the confluence of four factors:"
"1. The development of the Model Driven Architecture (MDA) standards by the Object Management Group (OMG)"
"2. The adoption of the Precise Action Semantics for the Unified Modeling Language specification by the OMG in November of 2001"
"3. A proposed profile of UML—Executable UML—supports creating a complete and implementation-neutral self-contained expression of application functionality. Steven J. Mellor and Marc J. Balcer define this profile in their book Executable UML: A Foundation for Model-Driven Architecture"
"4. The availability of high-quality Model Compilers and Virtual Execution Environments (VEEs) that provide "out of the box" platforms upon which Executable UML models can execute. These VEEs, which exist today in a somewhat incipient stage, will someday soon reduce low-level system architectures to near-commodity status."
"Executable UML is at the next higher layer of abstraction, abstracting away both specific programming languages and decisions about the organization of the software so that a specification built in Executable UML can be deployed in various software environments without change."
"Executable UML is designed to produce a comprehensive and comprehensible model of a solution without making decisions about the organization of the software implementation. It is a highly abstract thinking tool to aid in the formalization of knowledge, a way of thinking about and describing the concepts that make up an abstract solution to a client problem."
"Steve Mellor has long been active in this kind of work and has recently used the term Executable UML [Mellor and Balcer, 2002]. Executable UML is similar to MDA but uses slightly different terms. Similarly, you begin with a UML model that is equivalent to MDA's PIM. However, the next step is to use a Model Compiler to turn that UML model into a deployable system in a single step; hence, there's no need for the PSM. As the term compiler suggests, this step is completely automatic."
"Models and simulation furnish abstractions to manage complexities allowing engineers to visualize the proposed system and to analyze and validate system behavior before constructing it. Unified Modeling Language (UML) and its systems engineering extension, the Systems Modeling Language (SysML), provide a rich set of diagrams for systems specification. However, the lack of executable semantics of such notations limits the capability of analyzing and verifying defined specifications. This research has developed an executable system architecting framework based on SysML-CPN transformation, which introduces dynamic model analysis into SysML modeling by mapping SysML notations to Colored Petri Nets (CPN), a graphical language for system design, specification, simulation, and verification. A was also integrated into the CPN model to enhance the model-based simulation. A set of methodologies has been developed to achieve this framework. The aim is to investigate system wide properties of the proposed system, which in turn provides a basis for system reconfiguration."
"What do you think of using UML to generate implementation code? James: I think it’s a terrible idea. I know that I disagree with many other UML experts, but there is no magic about UML. If you can generate code from a model, then it is a programming language. And UML is not a well-designed programming language. The most important reason is that it lacks a well-defined point of view, partly by intent and partly because of the tyranny of the OMG standardization process that tries to provide everything to everybody. It doesn't have a well-defined underlying set of assumptions about memory, storage, concurrency, or almost anything else. How can you program in such a language? The fact is that UML and other modelling languages are not meant to be executable. The point of models is that they are imprecise and ambiguous. This drove many theoreticians crazy so they tried to make UML "precise", but models are imprecise for a reason: we leave out things that have a small effect so we can concentrate on the things that have big or global effects. That's how it works in physics models: you model the big effect (such as the gravitation from the sun) and then you treat the smaller effects as perturbations to the basic model (such as the effects of the planets on each other). If you tried to solve the entire set of equations directly in full detail, you couldn't do anything."
"Executable textual modeling tools support automated code generation. For example, supports a number of high-level languages but is targeted towards text parsing and input validation. recently added built-in support for state machines. The State Machine Compiler (SMC) is targeted towards the specification of event driven systems. Microsoft is also developing a textual specification language, AsmL, based on state machine concepts. Those approaches do not incorporate class diagram abstractions and do not support development of complete applications. Executable UML supports a subset of UML textually but misses key features of UML and does not integrate with programming languages."
"Regarding executability, it is clear that from the hypothetical day when a UML virtual machine would be universally adopted, UML-based process models would have a real benefit. Process modelers supposed to be already familiar with UML diagrams would then simply have to draw their process models using their usual UML tools. They would then be able to test, execute and debug them as everyone does with her usual programming language. offers the potential to define such virtual machine thanks notably to the Activity and Action packages, which come with an operational semantics. Some ambiguities in this operational semantics (given in natural language in the standard), have however to be first fixed. This is one of the purposes of a recent initiative at the OMG, called Executable UML"
"Executable UML (xUML) consists of UML class diagrams, UML state machines and an action language which complies with the UML action semantics. There are several action languages in use; we refer to for a—somewhat limited—overview. The xUML models to be translated are expressed in the Cassandra/xUML dialect, as developed by KnowGravity."
"We present a fully automated approach to verifying safety properties of Executable UML models (xUML). Our tool chain consists of a model transformation program which translates xUML models to the process algebra mCRL2, followed by symbolic model checking using LTSmin. If a safety violation is found, an error trace is visualised as a UML sequence diagram. As a novel feature, our approach allows safety properties to be specified as UML state machines."
"Extreme Programming is the first popular methodology to view software development as an exercise in coding rather than an exercise in management."
"The new concept of Extreme Programming (XP) is gaining more and more acceptance, partially because it is controversial, but primarily because it is particularly well-suited to help the small software development team succeed... XP is controversial, many software development sacred cows don't make the cut in XP; it forces practitioners to take a fresh look at how software is developed."
"XP (Extreme Programming) is a system of practices (you can use the m-word if you want to; we'd rather not, thank you) that a community of software developers is evolving to address the problems of quickly delivering quality software, and then evolving it to meet changing business needs."
"One of the things I've been trying to do is look for simpler rules underpinning good or bad design. I think one of the most valuable rules is avoid duplication. "Once and only once" is the Extreme Programming phrase."
"One of the central axioms of extreme programming is the disciplined use of regression testing during stepwise software development."
"Extreme Programming is a discipline of software development with values of simplicity, communication, feedback, and courage. We focus on the roles of customer, manager,and programmer and accord key rights and responsibilities to the people in those roles."
"Extreme Programming is an “agile methodology” that some people advocate for the high-speed, volatile world of Internet and Web development."
"Extreme Programming is the most prominent of the new, light-weight (or agile) methods, defined to contrast with the current heavy-weight and partially overloaded object-oriented methods. It focuses on the core issues of software technology. One of its principles is not to rely on diagrams to document a system."
"One of the distinct features of XP is the lack of any documentation whatsoever, except for the code itself. This is a contraposition to the modeling techniques like the Unified Modeling Language (UML), which strongly focus on documentation. XP takes an extreme position there, not even documenting the architecture of the system. Often, it is very difficult to extract the overall structure, behavior or interactions with the environment from the code. The code is a rather detailed and fragile representation of the system’s tasks. Even though the code contains all necessary information about the system, this information is often burdened with details and it is tedious to extract the aspects one is interested in. Therefore, it would be useful to have a more compact system representation. The UML does provide a number of notations that are suited for this purpose. However, the tools so far are not capable of supporting UML in such a manner that it can be well-integrated with the approach of Extreme Programming."
"But you could do extreme programming. In fact, I had a college buddy I did pair programming with. We took a compiler writing class together and studied all that fancy stuff from the dragon book. Then of course the professor announced we would be implementing our own language, called PL/0. After thinking about it a while, we announced that we were going to do our project in BASIC. The professor looked at us like we were insane. Nobody else in the class was using BASIC. And you know what? Nobody else in the class finished their compiler either. We not only finished but added I/O extensions, and called it PL 0.5. That's rapid prototyping."
"Perhaps the greatest strength of an object-oriented approach to development is that it offers a mechanism that captures a model of the real world."
"Object-oriented programming languages support encapsulation, thereby improving the ability of software to be reused, refined, tested, maintained, and extended. The full benefit of this support can only be realized if encapsulation is maximized during the design process. We argue that design practices which take a data-driven approach fail to maximize encapsulation because they focus too quickly on the implementation of objects. We propose an alternative object-oriented design method which takes a responsibility-driven approach. We show how such an approach can increase the encapsulation by deferring implementation issues until a later stage."
"Philosophy and cognitive science have contributed to the advancement of the object model. The idea that the world could be viewed either in terms of objects or processes was a Greek innovation, and in the seventeenth century, we find Descartes observing that humans naturally apply an object-oriented view of the world. In the twentieth century, Rand expanded upon these themes in her philosophy of objectivist epistemology. More recently, Minsky has proposed a model of human intelligence in which he considers the mind to be organized as a society of otherwise mindless agents. Minsky argues that only through the cooperative behavior of these agents do we find what we call intelligence."
"Object-oriented programming is a method of implementation in which programs are organized as cooperative collections of objects, each of which represents an instance of some class, and whose classes are all members of a hierarchy of classes united via inheritance relationships."
"OOA - Object-Oriented Analysis - is based upon concepts that we first learned in kindergarten: objects and attributes, wholes and parts, classes and members."
"In order to better understand object-oriented methodologies in general, it helps to understand the people who make up the "object-oriented community" itself. Far from being monolithic, there is a great deal of diversity within this community. Many object-oriented people, for example, seem to focus almost entirely on programming language issues. They tend to cast all discussions in terms of the syntax and semantics of their chosen object-oriented programming language. These people find it impossible (for all intents and purposes) to discuss any software engineering activity (e.g., analysis, design, and testing) without direct mention of some specific implementation language. Outside of producing executable "prototypes", people who emphasize programming languages seldom have well-defined techniques for analyzing their clients' problems or describing the overall architecture of the software product. A great deal of what they do is intuitive. If they happen to have a natural instinct/intuition for good analysis or good design, their efforts on small-to-medium, non-critical projects can result in respectable software solutions."
"From a very early age, we form concepts. Each concept is a particular idea or understanding we have about our world. These concepts allow us to make sense of and reason about the things in our world. These things to which our concepts apply are called objects."
"Is object-oriented technology mature enough upon which to build industrial-strength systems? Absolutely. Does this technology scale? Indeed. Is it the sole technology worth considering? No way. Is there some better technology we should be using in the future? Possibly, but I am clueless as to what that might be. It is dangerous to make predictions, especially in a discipline that changes so rapidly, but one thing I can say with confidence is that I have seen the future, and it is object-oriented."
"The (UML) is a general-purpose visual modeling language that is used to specify, visualize, construct, and document the artifacts of a software system. It captures decisions and understanding about systems that must be constructed. It is used to understand, design, browse, configure, maintain, and control information about such systems. It is intended for use with all development methods, lifecycle stages, application domains, and media. The modeling language is intended to unify past experience about modeling techniques and to incorporate current software best practices into a standard approach. UML includes semantic concepts, notation, and guidelines. It has static, dynamic, environmental, and organizational parts. It is intended to be supported by interactive visual modeling tools that have code generators and report writers. The UML specification does not define a standard process but is intended to be useful with an iterative development process. It is intended to support most existing object-oriented development processes."
"All OO languages show some tendency to suck programmers into the trap of excessive layering. Object frameworks and object browsers are not a substitute for good design or documentation, but they often get treated as one. Too many layers destroy transparency: It becomes too difficult to see down through them and mentally model what the code is actually doing. The Rules of Simplicity, Clarity, and Transparency get violated wholesale, and the result is code full of obscure bugs and continuing maintenance problems."
"The key books about object-oriented graphical modeling languages appeared between 1988 and 1992. Leading figures included Grady Booch [Booch,OOAD]; Peter Coad [Coad, OOA], [Coad, OOD]; Ivar Jacobson (Objectory) [Jacobson, OOSE]; Jim Odell [Odell]; Jim Rumbaugh (OMT) [Rumbaugh, insights], [Rumbaugh, OMT]; Sally Shlaer and Steve Mellor [Shlaer and Mellor, data], [Shlaer and Mellor, states] ; and Rebecca Wirfs-Brock (Responsibility Driven Design) [Wirfs-Brock]."
"Object-oriented design is the roman numerals of computing."
"Systems engineering as an approach and methodology grew in response to the increased size and complexity of systems and projects... This engineering approach to the management of complexity by modularization was re-deployed in the software engineering discipline in the 1960s and 1970s with a proliferation of structured methodologies that enabled the analysis, design and development of information systems by using techniques for modularized description, design and development of system components. Yourdon and DeMarco's Structured Analysis and Design, SSADM, James Martin's Information Engineering, and Jackson's Structured Design and Programming are examples from this era. They all exploited modularization to enable the parallel development of data, process, functionality and performance components of large software systems. The development of object orientation in the 1990s exploited modularization to develop reusable software. The idea was to develop modules that could be mixed and matched like Lego bricks to deliver to a variety of whole system specifications. The modularization and reusability principles have stood the test of time and are at the heart of modern software development."
"Today, no one would dispute that information technology has become the backbone of commerce. It underpins the operations of individual companies, ties together far-flung supply chains, and, increasingly, links businesses to the customers they serve. Hardly a dollar or a euro changes hands anymore without the aid of computer systems."
"Our exploration of emergent social structures across domains of human activity and experience leads to an over-arching conclusion: as an historical trend, dominant functions and processes in the Information Age are increasingly organized around networks. Networks constitute the new social morphology of our societies, and the diffusion of networking logic substantially modifies the operation and outcomes in processes of production, experience, power, and culture. While the networking form of social organization has existed in other times and spaces, the new information technology paradigm provides the material basis for its pervasive expansion throughout the entire social structure."
"I would remind you that in the United States we had an increasing gap between the rich and the poor for about 20 years, as we moved into this new economic phase. The same thing happened when we changed from being an agricultural economy to an industrial economy. In the last 2 or 3 years, we started to see the gap close again. And the answer is not to run away from globalization. The answer is to make change our friend. The answer is to have broad access to information and information technology, to have broad-based systems of education and health care and family supports in every country, and to continue to try to shape the global economy."
"The coordination of information technology management presents a challenge to firms with dispersed IT practices. Decentralization may bring flexibility and fast response to changing business needs, as well as other benefits, but decentralization also makes systems integration difficult, presents a barrier to standardization, and acts as a disincentive toward achieving economies of scale. As a result, there is a need to balance the decentralization of IT management to business units with some centralized planning for technology, data, and human resources."
"During the Cold War a new conceptual framework took hold of U.S. defense thinking in an attempt to reduce unpredictability. The advent of the computer and its incorporation into the military as a data processor and numbers cruncher during World War II led commanders to believe that the uncertainty and unpredictability that defined chaos on the battlefield could be overcome through information technology. Chaos was seen as an information deficiency rather than an inescapable element of warfare. Massive amounts of data were collected and processed in an attempt to reach battlefield clarity and subordinate the theatre of war. The new term "command and control" described the belief that commanders could issue orders, receive new information through feedback of their technology system and adjust subsequent orders accordingly."
"Tracking the individual learning curves of the major technologies that comprise the infrastructure of information technology provides a more detailed account of the present and future state-of-the art of the technologies underlying convergence. The base technologies of digital electronics, general-purpose computer architectures, software and interaction are mature and provide solid foundations for computer science. The upper technologies of knowledge representation and acquisition, autonomy and sociality, support product innovation and provide the beginnings of foundations for knowledge science. Wells' dream of a world brain making available all of human knowledge is well on its way to realization and it is in the representation, acquisition, and access and effective application of that knowledge that the commercial potential and socio-economic impact of convergence lies."
"It is clear that even though information technology (I/T) has evolved from its traditional orientation of administrative support toward a more strategic role within an organization, there is still a glaring lack of fundamental frameworks within which to understand the potential of I/T for tomorrow's organizations. In this paper, we develop a model for conceptualizing and directing the emerging area of strategic management of information technology. This model, termed the Strategic Alignment Model, is defined in terms of four fundamental domains of strategic choice: business strategy, information technology strategy, organizational infrastructure and processes, and information technology infrastructure and processes, each with its own underlying dimensions."
"The strategic role of information systems in “extending” the enterprise is examined. A number of issues emerge as essential considerations in the strategic alignment of the investment in information technology and business strategy. Information technologies transform organizational boundaries, interorganizational relations, and marketplace competitive and cooperative practice. The paper presents a framework of strategic control that guides the planning and execution of these investments in information technology for business transformation, seeking increased understanding and influence. Emerging information technologies change the limits of what is possible in the leverage of strategic control through transformation of boundaries, relations, and markets."
"[Technology is] the instrumentality for accessing and using free energies in human societies for human and social purposes."
"The new information technologies can be seen to drive societies toward increasingly dynamic high-energy regions further and further from thermodynamical equilibrium, characterized by decreasing specific entropy and increasingly dense free-energy flows, accessed and processed by more and more complex social, economic, and political structures."
"Intelligence amplification refers to the effective use of information technology in augmenting human intelligence. It will lead to a brave new world of dating – a positive change, mind you, unlike Aldous Huxley’s 1932 novel Brave New World."
"To date, most research on information technology (IT) outsourcing concludes that firms decide to outsource IT services because they believe that outside vendors possess production cost advantages. Yet it is not clear whether vendors can provide production"
"The strategic use of information technology (I/T) is now and has been a fundamental issue for every business. In essence, I/T can alter the basic nature of an industry. The effective and efficient utilization of information technology requires the alignment of the I/T strategies with the business strategies, something that was not done successfully in the past with traditional approaches. New methods and approaches are now available. The strategic alignment framework applies the Strategic Alignment Model to reflect the view that business success depends on the linkage of business strategy, information technology strategy, organizational infrastructure and processes, and I/T infrastructure and processes... We [have looked] at why it may not be sufficient to work on any one of these areas in isolation or to only harmonize business strategy and information technology. One reason is that, often, too much attention is placed on technology, rather than business, management, and organizational issues. The objective is to build an organizational structure and set of business processes that reflect the interdependence of enterprise strategy and information technology capabilities. The attention paid to the linkage of information technology to the enterprise can significantly affect the competitiveness and efficiency of the business. The essential issue is how information technology can enable the achievement of competitive and strategic advantage for the enterprise."
"We need to recognise that the entire information sector—from music to newspapers to telecoms to internet to semiconductors and anything in-between—has become subject to a gigantic market failure in slow motion. A market failure exists when market prices cannot reach a self-sustaining equilibrium. The market failure of the entire information sector is one of the fundamental trends of our time, with far-reaching long-term effects, and it is happening right in front of our eyes."
"Standing here before a mural of your revolution, I want to talk about a very different revolution that is taking place right now, quietly sweeping the globe without bloodshed or conflict. Its effects are peaceful, but they will fundamentally alter our world, shatter old assumptions, and reshape our lives. It's easy to underestimate because it's not accompanied by banners or fanfare. It's been called the technological or information revolution, and as its emblem, one might take the tiny silicon chip, no bigger than a fingerprint. One of these chips has more computing power than a roomful of old-style computers. As part of an exchange program, we now have an exhibition touring your country that shows how information technology is transforming our lives -- replacing manual labor with robots, forecasting weather for farmers, or mapping the genetic code of DNA for medical researchers. These microcomputers today aid the design of everything from houses to cars to spacecraft; they even design better and faster computers. They can translate English into Russian or enable the blind to read or help Michael Jackson produce on one synthesizer the sounds of a whole orchestra. Linked by a network of satellites and fiber-optic cables, one individual with a desktop computer and a telephone commands resources unavailable to the largest governments just a few years ago."
"Our system is not fit for purpose. It's inadequate in terms of its scope, it's inadequate in terms of its information technology, leadership, management systems and processes."
"Now let me pull out so we’re clear about the problem we all face and how we got here. The attacks against us in Rappler began 5 years ago when we demanded an end to impunity on two fronts: Duterte’s drug war and Mark Zuckerberg’s Facebook. Today, it has only gotten worse – and Silicon Valley’s sins came home to roost in the United States on January 6 with mob violence on Capitol Hill. What happens on social media doesn’t stay on social media. Online violence is real world violence. Social media is a deadly game for power and money, what Shoshana Zuboff calls surveillance capitalism, extracting our private lives for outsized corporate gain. Our personal experiences are sucked into a database, organized by AI, then sold to the highest bidder. Highly profitable micro-targeting operations are engineered to structurally undermine human will – a behavior modification system in which we are Pavlov’s dogs, experimented on in real time with disastrous consequences in countries like mine, Myanmar, India, Sri Lanka and so many more. These destructive corporations have siphoned money away from news groups and now pose a foundational threat to markets and elections. Facebook is the world’s largest distributor of news, and yet studies have shown that lies laced with anger and hate spread faster and further than facts on social media. These American companies controlling our global information ecosystem are biased against facts, biased against journalists. They are – by design – dividing us and radicalizing us. Without facts, you can’t have truth. Without truth, you can’t have trust. Without trust, we have no shared reality, no democracy, and it becomes impossible to deal with our world’s existential problems: climate, coronavirus, the battle for truth."
"Assessing the value of information technology (IT) has never been easy. Delayed benefits, unintended uses, business changes, and hidden support costs inhibit meaningful evaluation of individual IT investments. This was true when most investments were focused on the support of a single business process or functional area. It is even more true as business executives ponder implementations of shared technologies like data warehouses and networks, replacement of large legacy systems, and reskilling of the IT staff. Although firms introduce some systems to reduce costs and can evaluate them in terms of their success in doing so, they want many IT initiatives to support a firm's objectives. The value of these initiatives rests in their contributions to a firm's competitiveness, which is often nonquantifiable and uncertain."
"A fundamental change is taking place in the nature and application of technology in business, a change with profound and far-reaching implications for companies of every size and shape. A multimillion dollar research program conducted by the DMR Group, Inc., studied more than 4,500 organizations in North America, Europe, and the Far East to investigate the nature and impact of changes in technology. The synthesis and analysis of this information indicate that information technology is going through its first paradigm shift. Driven by the demands of the competitive business environment and profound changes in the nature of computers, the information age is evolving into a second era. Computing platforms in most organizations today are not able to deliver the goods for corporate rebirth. It is only through open network computing that the open networked client/server enterprise can be achieved. In nontechnical language this book shows managers and professionals how to take immediate action for the short-term benefits of the new technology while positioning their organizations for long-term growth and transformation..."
"In the last few years, an information resources management concept has emerged as a focus of managing information activities. Although lacking a concise or universal definition, the IRM concept has become a framework for planning more responsive and coordinated information management organization structures throughout Government and the private sector. In brief, IRM is viewed as an integration of management responsibilities for the control of information-related activities and related processes. It includes the planning and management of information collection, use, and dissemination as well as management of information technology."
"The objective of computer-integrated manufacturing (CIM) is the appropriate integration of enterprise operations by means of efficient information exchange within the enterprise with the help of Information Technology (IT). Integration includes the physical and logical connection of processes by means of data communications technology operating to specified standards, but also the integration of enterprise functions as well as enterprise information. Generalized models and an open systems architecture are required to reduce the system complexity to a manageable level. They are used to identify the principal components, processes, constraints and information sources used to describe a manufacturing enterprise progressing towards CIM. In this paper, the basic concepts of an open-systems architecture for CIM called CIM-OSA are presented. The function view of the CIM-OSA modelling framework is discussed. CIM-OSA provides a unique set of advanced features to model functionality and behaviour of CIM systems at three distinct levels (requirements definition, design specification and implementation description)."
"Vannevar Bush is a great name for playing six degrees of separation. Turn back the clock on any aspect of information technology — from the birth of Silicon Valley and the marriage of science and the military to the advent of the World Wide Web — and you find his footprints. As historian Michael Sherry says, "To understand the world of Bill Gates and Bill Clinton, start with understanding Vannevar Bush.""
"The Information Age is unfolding just as predicted by many of the sociological prognosticators of this century. Information issues are on everyone’s mind and on multitudes of lips. It is hard to pick up a newspaper or current affairs magazine without seeing a feature on the internet, web pages, e-mail, television terminals or some other new technology. In fact, technology innovation is relentless and escalating and technology stocks continually drive the stock market to high after high. There is no field of human endeavor that is exempt from the onslaught of information technology."
"Unix was built for me. I didn't build it as an operating system for other people, I built it to do games, and to do my stuff."
"This is the Unix philosophy: Write programs that do one thing and do it well. Write programs to work together. Write programs to handle text streams, because that is a universal interface."
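The philosophy in the quote above can be sketched in a few lines of code. This is a toy illustration (written in Python rather than as separate Unix programs, purely for portability): each function does one thing, and they compose because they all speak the same universal interface, a stream of text lines. The function names `grep`, `sort_lines`, and `uniq` are chosen to echo the classic Unix tools; none of this is from the quoted authors.

```python
def grep(pattern, lines):
    """Do one thing well: keep only lines containing the pattern."""
    return (line for line in lines if pattern in line)

def sort_lines(lines):
    """Do one thing well: emit the lines in sorted order."""
    return iter(sorted(lines))

def uniq(lines):
    """Do one thing well: collapse adjacent duplicate lines."""
    previous = None
    for line in lines:
        if line != previous:
            yield line
            previous = line

# Compose them like a shell pipeline: grep error log | sort | uniq
log = ["error: disk", "info: ok", "error: net", "error: disk"]
result = list(uniq(sort_lines(grep("error", log))))
# result == ["error: disk", "error: net"]
```

Because each stage consumes and produces plain text lines, any stage can be replaced or reordered without the others knowing, which is exactly the composability the quote is about.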
"Unix is user-friendly — it's just choosy about who its friends are."
"UNIX is simple and coherent, but it takes a genius (or at any rate, a programmer) to understand and appreciate its simplicity."
"I define UNIX as 30 definitions of regular expressions living under one roof."
"Unix was not designed to stop you from doing stupid things, because that would also stop you from doing clever things."
"The Unix room still exists, and it may be the greatest cultural reason for the success of Unix as a technology. More groups could profit from its lesson, but it's really hard to add a Unix-room-like space to an existing organization. You need the culture to encourage people not to hide in their offices, you need a way of using systems that makes a public machine a viable place to work - typically by storing the data somewhere other than the "desktop" - and you need people like Ken and Dennis (and Brian Kernighan and Doug McIlroy and Mike Lesk and Stu Feldman and Greg Chesson and ...) hanging out in the room, but if you can make it work, it's magical. When I first started at the Labs, I spent most of my time in the Unix room. The buzz was palpable; the education unparalleled."
"I think the Linux phenomenon is quite delightful, because it draws so strongly on the basis that Unix provided. Linux seems to be among the healthiest of the direct Unix derivatives, though there are also the various BSD systems as well as the more official offerings from the workstation and mainframe manufacturers."
"Those who don't understand Unix are condemned to reinvent it, poorly."
"Anything new will have to come along with the type of revolution that came along with Unix. Nothing was going to topple IBM until something came along that made them irrelevant. I'm sure they have the mainframe market locked up, but that's just irrelevant. And the same thing with Microsoft: Until something comes along that makes them irrelevant, the entry fee is too difficult and they won't be displaced."
"Will journalling become prevalent in the Unix world at large? Probably not. After all, it's nonstandard."
"The P versus NP problem was first mentioned in a 1956 letter from Kurt Gödel to John von Neumann, two of the greatest mathematical minds of the twentieth century."
"Every year the Association for Computing Machinery awards the ACM Turing Award, the computer science equivalent of the Nobel Prize, named for Alan Turing, who gave computer science its foundations in the 1930s. In 1982 the ACM presented the Turing Award to Stephen Cook for his work formulating the P versus NP problem. But one Turing Award for the P versus NP problem is not enough, and in 1985 Richard Karp received the award for his work on algorithms, most notably for the twenty-one NP-complete problems."
"Nowadays, mathematicians routinely use computers to solve problems, even great problems. Computers are good at arithmetic, but mathematics goes far beyond mere ‘sums’, so putting a problem on a computer is seldom straightforward. Often the hardest part of the work is to convert the problem into one that a computer calculation can solve, and even then the computer may struggle. Many of the great problems that have been solved recently involve little or no work with a computer. Fermat’s last theorem and the Poincaré conjecture are examples. When computers have been used to solve great problems, like the four colour theorem or the Kepler conjecture, the computer effectively plays the role of servant. But sometimes the roles are reversed, with mathematics as the servant of computer science. Most of the early work on computer design made good use of mathematical insights, for example the connection between Boolean algebra – an algebraic formulation of logic – and switching circuits, developed in particular by the engineer Claude Shannon, the inventor of information theory. Today, both practical and theoretical aspects of computers rely on the extensive use of mathematics, from many different areas. One of the Clay millennium problems lies in the borderland of mathematics and computer science. It can be viewed both ways: computer science as a servant of mathematics, and mathematics as a servant of computer science. What it requires, and is helping to bring about, is more balanced: a partnership. The problem is about computer algorithms, the mathematical skeletons from which computer programs are made. The crucial concept here is how efficient the algorithm is: how many computational steps it takes to get an answer for a given amount of input data. In practical terms, this tells us how long the computer will take to solve a problem of given size."
"So, what is quantum mechanics? Even though it was discovered by physicists, it’s not a physical theory in the same sense as electromagnetism or general relativity. In the usual “hierarchy of sciences” – with biology at the top, then chemistry, then physics, then math – quantum mechanics sits at a level between math and physics that I don’t know a good name for. Basically, quantum mechanics is the operating system that other physical theories run on as application software (with the exception of general relativity, which hasn’t yet been successfully ported to this particular OS). There’s even a word for taking a physical theory and porting it to this OS: “to quantize.” But if quantum mechanics isn’t physics in the usual sense – if it’s not about matter, or energy, or waves, or particles – then what is it about? From my perspective, it’s about information and probabilities and observables, and how they relate to each other."
"Information? Whose information? Information about what?"
"I argue that quantum mechanics is fundamentally a theory about the representation and manipulation of information, not a theory about the mechanics of nonclassical waves or particles."
"In my view the most fundamental statement of quantum mechanics is that the wavefunction, or more generally the density matrix, represents our knowledge of the system we are trying to describe. I shall return later to the question "whose knowledge?". It is well known that we have to use a wavefunction if we have a "pure state" i.e. if our knowledge of the system is complete, in the sense that any further knowledge is barred by the uncertainty principle. Failing such complete knowledge we must use a density matrix, which therefore contains both quantum and classical ignorance. The wavefunction is a special case of a density matrix, and I shall here talk about "density matrix" when I mean "wavefunction or density matrix"."
"It from bit. Otherwise put, every it—every particle, every field of force, even the spacetime continuum itself—derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes or no questions, binary choices, bits. It from bit symbolizes the idea that every item of the physical world has at bottom—at a very deep bottom, in most instances—an immaterial source and explanation; that what we call reality arises in the last analysis from the posing of yes-no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and this is a participatory universe."
"Since the late 1980s, architecture frameworks have emerged within the federal government, beginning with the publication of the National Institute of Standards and Technology framework in 1989. Subsequently, we issued Enterprise architecture (EA) guidance, and our research of successful public and private sector organizations’ IT management practices identified the use of Enterprise architectures as a factor critical to these organizations’ success."
"If it walks like a duck and talks like a duck, it’s a duck, right? So if this duck is not giving you the noise that you want, you’ve got to just punch that duck until it returns what you expect."
"Even perfect program verification can only establish that a program meets its specification. […] Much of the essence of building a program is in fact the debugging of the specification."
"Much to the surprise of the builders of the first digital computers, programs written for them usually did not work."
"bug, n: An elusive creature living in a program that makes it incorrect. The activity of "debugging", or removing bugs from a program, ends when people get tired of doing it, not when the bugs are removed."
"If debugging is the process of removing bugs, then programming must be the process of putting them in."
"Testing can only prove the presence of bugs, not their absence."
"silver bullet (SIL-vuhr BOOL-it) noun: A quick solution to a thorny problem. [From the belief that werewolves could be killed when shot with silver bullets.] "Writing code, he (Stuart Feldman) explains, is like writing poetry: every word, each placement counts. Except that software is harder, because digital poems can have millions of lines which are all somehow interconnected. Try fixing programming errors, known as bugs, and you often introduce new ones. So far, he laments, nobody has found a silver bullet to kill the beast of complexity.""
"From then on, when anything went wrong with a computer, we said it had bugs in it."
"The most effective debugging tool is still careful thought, coupled with judiciously placed print statements."
"Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?"
"Beware of bugs in the above code; I have only proved it correct, not tried it."
"A documented bug is not a bug; it is a feature."
"Given enough eyeballs, all bugs are shallow."
"As soon as we started programming, we found to our surprise that it wasn't as easy to get programs right as we had thought. Debugging had to be discovered. I can remember the exact instant when I realized that a large part of my life from then on was going to be spent in finding mistakes in my own programs."
"The last bug isn't fixed until the last user is dead."
"Although distributed computer systems are highly desirable, putting together a properly functioning system is notoriously difficult. Some of the difficulties are pragmatic, for instance, the presence of heterogeneous hardware and software and the lack of adherence to standards. More fundamental difficulties are introduced by three factors: asynchrony, limited local knowledge, and failures. The term asynchrony means that the absolute and relative times at which events take place cannot always be known precisely. Because each computing entity can only be aware of information that it acquires, it has only a local view of the global situation. Computing entities can fail independently, leaving some components operational while others are not."
"Research on architectures and interconnection networks has resulted in low-cost distributed systems with large numbers of powerful processors that can communicate at high speeds. Research on distributed operating systems has produced ways for employing this high computing potential by dividing the total workload among the available processors. By executing different programs on different processors, the system can have a high throughput. Some system programs (e.g., a file server) may also be distributed among multiple processors, to achieve higher speed and greater reliability. Many user applications can also benefit, for the same reasons. The task of distributing a single user program among multiple processors, however, clearly falls outside the scope of an operating system. Thus, to achieve this distribution, extra effort is required from the applications programmers."
"Today, almost everyone is connected to the Internet and uses different Cloud solutions to store, deliver and process data. Cloud computing assembles large networks of virtualized services such as hardware and software resources. The new era in which ICT has penetrated almost all domains (healthcare, aged-care, social assistance, surveillance, education, etc.) creates the need for new multimedia content-driven applications. These applications generate huge amounts of data, require gathering, processing and then aggregation in a fault-tolerant, reliable and secure heterogeneous distributed system created by a mixture of Cloud systems (public/private), mobile device networks, desktop-based clusters, etc. In this context, dynamic resource provisioning for Big Data application scheduling has become a challenge in modern systems."
"Computer science departments have always considered 'user interface' research to be sissy work."
"When you design a new user interface... you have to start off saying, what are the simplest elements in it? What does a button look like? And you spend months working on a button."
"Thanks to the recent advances in processing speed, data acquisition and storage, machine learning (ML) is penetrating every facet of our lives, and transforming research in many areas in a fundamental manner. Wireless communications is another success story — ubiquitous in our lives, from handheld devices to wearables, smart homes, and automobiles. While recent years have seen a flurry of research activity in exploiting ML tools for various wireless communication problems, the impact of these techniques in practical communication systems and standards is yet to be seen."
"The future trends are illustrated by two case studies. The first describes a recently developed method for dealing with reliability of decisions of classifiers, which seems to be promising for intelligent data analysis in medicine. The second describes an approach to using machine learning in order to verify some unexplained phenomena from complementary medicine, which is not (yet) approved by the orthodox medical community but could in the future play an important role in overall medical diagnosis and treatment."
"Machine learning can be broadly defined as computational methods using experience to improve performance or to make accurate predictions. Here, experience refers to the past information available to the learner, which typically takes the form of electronic data collected and made available for analysis. This data could be in the form of digitized human-labeled training sets, or other types of information obtained via interaction with the environment. In all cases, its quality and size are crucial to the success of the predictions made by the learner."
"Years after Simons's team at Renaissance adopted machine-learning techniques, other quants have begun to embrace these methods. Renaissance anticipated a transformation in decision-making that's sweeping almost every business and walk of life. More companies and individuals are accepting and embracing models that continuously learn from their successes and failures. As investor Matthew Granade has noted, Amazon, Tencent, Netflix, and others that rely on dynamic, ever-changing models are emerging dominant. The more data that's fed to the machines, the smarter they're supposed to become."
"I didn’t pay attention to it at all, to be perfectly honest. Having been trained as a computer scientist in the 90s, everybody knew that AI didn’t work. People tried it, they tried neural nets and none of it worked. The revolution in deep nets has been very profound, it definitely surprised me, even though I was sitting right there."
"In deep learning, nothing is ever just about the equations. It is how you ... put them on the hardware, it’s a giant bag of black magic tricks that only very few people have truly mastered."
"A major reason Google’s search engine is so successful is its PageRank algorithm, which assigns a pecking order to Web pages based on the pages that point to them. A page is important, according to Google, if other important pages link to it. But the Internet is not the only web around. In ecology, for instance, there are food webs — the often complex networks of who eats whom. Inspired by PageRank, Stefano Allesina of the University of Chicago and Mercedes Pascual of the University of Michigan have devised an algorithm of their own for the relationships in a food web. As described in the online open-access journal PLoS Computational Biology, the algorithm uses the links between species in a food web to determine the relative importance of species in a food web, which will have the most impact if they become extinct. ... One key to PageRank’s success is that its developers introduced a small probability that a Web user would jump from one page to any other. This in effect makes the Web circular, and makes the algorithm solvable. But in food webs, Dr. Allesina said, “you can’t go from the grass to the lion — the grass has to go through the gazelle first.”"
"PageRank is a well-known algorithm for measuring centrality in networks. It was originally proposed by Google for ranking pages in the World Wide Web. One of the intriguing empirical properties of PageRank is the so-called ‘power-law hypothesis’: in a scale-free network, the PageRank scores follow a power law with the same exponent as the (in-)degrees. To date, this hypothesis has been confirmed empirically and in several specific random graphs models."
"PageRank (the name is a trademark of Google) is a method of measuring the popularity or importance of web pages. PageRank is a mathematical algorithm, or systematic procedure, at the heart of Google's search software."
"The importance of a Web page is an inherently subjective matter, which depends on the reader's interests, knowledge and attitudes. But there is still much that can be said objectively about the relative importance of Web pages. This paper describes PageRank, a method for rating Web pages objectively and mechanically, effectively measuring the human interest and attention devoted to them. We compare PageRank to an idealized random Web surfer. We show how to efficiently compute PageRank for large numbers of pages. And we show how to apply PageRank to search and to user navigation."
"One of the major breakthroughs with Google’s search engine was a formula called PageRank, named after Larry Page, one of Google’s founders and now the chief executive of its parent company, Alphabet. PageRank works on the basic premise that a page’s value can be determined by how many sites link to it. In the early days of web search, this was a novel concept, and it helped to propel Google past competitors like Yahoo and AltaVista. The search engine has gotten more sophisticated over the years. (It was founded 20 years ago on Tuesday.) In addition to PageRank, the company has also said that the software looks at how often and where the keywords being searched for show up on a specific page, how recently the page was created (a sign of the freshness of the information) and the location of the person making the search."
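The quotes above describe PageRank's two key ideas: a page's score is fed to it by the pages that link to it, and a small "random jump" probability (the damping factor) keeps the computation well behaved. A minimal power-iteration sketch follows; this is an illustration of the published idea, not Google's production algorithm, and the example link graph is invented.

```python
def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    d is the damping factor: with probability 1 - d the random surfer
    jumps to any page, which is what 'makes the Web circular'."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start uniform
    for _ in range(iterations):
        new_rank = {p: (1.0 - d) / n for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                # A page passes its importance to the pages it links to.
                share = d * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
            else:
                # Dangling page: spread its rank uniformly.
                for target in pages:
                    new_rank[target] += d * rank[page] / n
        rank = new_rank
    return rank

# Tiny example web: "a" is linked to by both other pages, so it ranks highest.
ranks = pagerank({"a": ["b"], "b": ["a"], "c": ["a"]})
```

Running this, `ranks["a"]` comes out highest, `ranks["c"]` lowest (nothing links to it), and the scores sum to 1, matching the intuition in the quotes that importance flows along links.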
"I now run a 15-person company and, in terms of making us productive, our systems are far better than those of any of big company. We bring up and roll out new apps in a matter of hours. If we like them, we keep them, if not, we abandon them. We self-administer, everything meshes, we have access everywhere, it’s safe, it’s got great uptime, it’s all backed up, and our costs are tiny. The vision came true."
"The ideal OS for me would be one that had a well-designed GUI that was easy to set up and use, but that included terminal windows where I could revert to the command line interface, and run GNU software, when it made sense. A few years ago, Be Inc. invented exactly that OS. It is called the BeOS."
"Thank God that [Apple buying BeOS] didn’t happen, because I hated Apple's management. I couldn’t picture myself in there."
"BeOS was pretty good, mind you. Positioned as a multimedia platform, BeOS benefited from symmetric multiprocessing, pervasive multithreading, preemptive multitasking and BFS, a custom 64-bit journaling file system. It too was developed on the principles of clarity and an uncluttered design."
"If operating systems were judged purely on engineering, BeOS would be remembered as one of the greatest desktop OSs ever built. Instead, it became a footnote. Not because it was slow. Not because it was unstable. Not because it was badly designed. But because it showed up at the wrong time, against the wrong opponents, in an industry where technology alone doesn’t win."
"I would be pleased to see Haiku flourish, if only because someone chose to do so. Sometimes, you need those kinds of efforts. Quixotic. But they make the world a happier place, simply by the grace of their quirky existence."
"Haiku is a true eye-opener for me. It shows how a desktop can “just work”. In many aspects this system is exactly addressing what has been driving me crazy on the “Linux” desktop for well over a decade, as someone originally coming from the Mac and looking for the same level of elegance and simplicity."