April 10, 2026
"I would be pleased to see Haiku flourish, if only because someone chose to do so. Sometimes, you need those kind of efforts. Quixotic. But they make the world a happier place, simply by the grace of their quirky existence."
"Haiku is a true eye-opener for me. It shows how a desktop can "just work". In many aspects this system is exactly addressing what has been driving me crazy on the "Linux" desktop for well over a decade, as someone originally coming from the Mac and looking for the same level of elegance and simplicity."
"Thank God that [Apple buying BeOS] didn't happen, because I hated Apple's management. I couldn't picture myself in there."
"The ideal OS for me would be one that had a well-designed GUI that was easy to set up and use, but that included terminal windows where I could revert to the command line interface, and run GNU software, when it made sense. A few years ago, Be Inc. invented exactly that OS. It is called the BeOS."
"BeOS was pretty good, mind you. Positioned as a multimedia platform, BeOS benefited from symmetric multiprocessing, pervasive multithreading, preemptive multitasking and a custom 64-bit journaling file system known as BFS. It too was developed on the principles of clarity and an uncluttered design."
"If operating systems were judged purely on engineering, BeOS would be remembered as one of the greatest desktop OSs ever built. Instead, it became a footnote. Not because it was slow. Not because it was unstable. Not because it was badly designed. But because it showed up at the wrong time, against the wrong opponents, in an industry where technology alone doesn't win."
"I now run a 15-person company and, in terms of making us productive, our systems are far better than those of any big company. We bring up and roll out new apps in a matter of hours. If we like them, we keep them, if not, we abandon them. We self-administer, everything meshes, we have access everywhere, it's safe, it's got great uptime, it's all backed up, and our costs are tiny. The vision came true."
"A major reason Google's search engine is so successful is its PageRank algorithm, which assigns a pecking order to Web pages based on the pages that point to them. A page is important, according to Google, if other important pages link to it. But the Internet is not the only web around. In ecology, for instance, there are food webs — the often complex networks of who eats whom. Inspired by PageRank, Stefano Allesina of the University of Chicago and Mercedes Pascual of the University of Michigan have devised an algorithm of their own for the relationships in a food web. As described in the online open-access journal PLoS Computational Biology, the algorithm uses the links between species in a food web to determine the relative importance of species — which will have the most impact if they become extinct. ... One key to PageRank's success is that its developers introduced a small probability that a Web user would jump from one page to any other. This in effect makes the Web circular, and makes the algorithm solvable. But in food webs, Dr. Allesina said, "you can't go from the grass to the lion — the grass has to go through the gazelle first.""
"PageRank is a well-known algorithm for measuring centrality in networks. It was originally proposed by Google for ranking pages in the World Wide Web. One of the intriguing empirical properties of PageRank is the so-called "power-law hypothesis": in a scale-free network, the PageRank scores follow a power law with the same exponent as the (in-)degrees. To date, this hypothesis has been confirmed empirically and in several specific random graph models."
"One of the major breakthroughs with Google's search engine was a formula called PageRank, named after Larry Page, one of Google's founders and now the chief executive of its parent company, Alphabet. PageRank works on the basic premise that a page's value can be determined by how many sites link to it. In the early days of web search, this was a novel concept, and it helped to propel Google past competitors like Yahoo and AltaVista. The search engine has gotten more sophisticated over the years. (It was founded 20 years ago on Tuesday.) In addition to PageRank, the company has also said that the software looks at how often and where the keywords being searched for show up on a specific page, how recently the page was created (a sign of the freshness of the information) and the location of the person making the search."
"PageRank (the name is a trademark of Google) is a method of measuring the popularity or importance of web pages. PageRank is a mathematical algorithm, or systematic procedure, at the heart of Google's search software."
"The importance of a Web page is an inherently subjective matter, which depends on the reader's interests, knowledge and attitudes. But there is still much that can be said objectively about the relative importance of Web pages. This paper describes PageRank, a method for rating Web pages objectively and mechanically, effectively measuring the human interest and attention devoted to them. We compare PageRank to an idealized random Web surfer. We show how to efficiently compute PageRank for large numbers of pages. And we show how to apply PageRank to search and to user navigation."
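The quotes above all describe the same mechanism: a page's score is the sum of the scores flowing in from pages that link to it, plus a small probability of jumping to a random page, which makes the iteration converge. A minimal sketch of that "random surfer" power iteration follows; the toy graph, the 0.85 damping factor, and the `pagerank` helper are illustrative assumptions, not code from any of the quoted sources.

```python
def pagerank(links, damping=0.85, tol=1e-9):
    """Compute PageRank scores by power iteration.

    links maps each page to the list of pages it links to.
    The (1 - damping) term is the small probability of jumping to a
    random page; it makes the chain irreducible, so the iteration
    converges to a unique ranking.
    """
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}
    while True:
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # Each page splits its current rank evenly over its outlinks.
                share = damping * ranks[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for target in pages:
                    new[target] += damping * ranks[page] / n
        if max(abs(new[p] - ranks[p]) for p in pages) < tol:
            return new
        ranks = new

# Toy graph: "a" is linked to by both other pages, so it ranks highest.
scores = pagerank({"a": ["b"], "b": ["a"], "c": ["a"]})
```

Note how the jump term plays exactly the role the food-web quote attributes to it: without it, rank could get trapped in one part of the graph and the iteration would not settle on a single answer.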
"Machine learning can be broadly defined as computational methods using experience to improve performance or to make accurate predictions. Here, experience refers to the past information available to the learner, which typically takes the form of electronic data collected and made available for analysis. This data could be in the form of digitized human-labeled training sets, or other types of information obtained via interaction with the environment. In all cases, its quality and size are crucial to the success of the predictions made by the learner."
"In deep learning, nothing is ever just about the equations. It is how you ... put them on the hardware, it's a giant bag of black magic tricks that only very few people have truly mastered."
"Thanks to the recent advances in processing speed, data acquisition and storage, machine learning (ML) is penetrating every facet of our lives, and transforming research in many areas in a fundamental manner. Wireless communications is another success story — ubiquitous in our lives, from handheld devices to wearables, smart homes, and automobiles. While recent years have seen a flurry of research activity in exploiting ML tools for various wireless communication problems, the impact of these techniques in practical communication systems and standards is yet to be seen."
"The future trends are illustrated by two case studies. The first describes a recently developed method for dealing with reliability of decisions of classifiers, which seems to be promising for intelligent data analysis in medicine. The second describes an approach to using machine learning in order to verify some unexplained phenomena from complementary medicine, which is not (yet) approved by the orthodox medical community but could in the future play an important role in overall medical diagnosis and treatment."
"I didn't pay attention to it at all, to be perfectly honest. Having been trained as a computer scientist in the 90s, everybody knew that AI didn't work. People tried it, they tried neural nets and none of it worked. The revolution in deep nets has been very profound, it definitely surprised me, even though I was sitting right there."
"Years after Simons's team at Renaissance adopted machine-learning techniques, other quants have begun to embrace these methods. Renaissance anticipated a transformation in decision-making that's sweeping almost every business and walk of life. More companies and individuals are accepting and embracing models that continuously learn from their successes and failures. As investor Matthew Granade has noted, Amazon, Tencent, Netflix, and others that rely on dynamic, ever-changing models are emerging dominant. The more data that's fed to the machines, the smarter they're supposed to become."
"Computer science departments have always considered 'user interface' research to be sissy work."
"When you design a new user interface... you have to start off saying, what are the simplest elements in it? What does a button look like? And you spend months working on a button."
"Although distributed computer systems are highly desirable, putting together a properly functioning system is notoriously difficult. Some of the difficulties are pragmatic, for instance, the presence of heterogeneous hardware and software and the lack of adherence to standards. More fundamental difficulties are introduced by three factors: asynchrony, limited local knowledge, and failures. The term asynchrony means that the absolute and relative times at which events take place cannot always be known precisely. Because each computing entity can only be aware of information that it acquires, it has only a local view of the global situation. Computing entities can fail independently, leaving some components operational while others are not."
"Research on architectures and interconnection networks has resulted in low-cost distributed systems with large numbers of powerful processors that can communicate at high speeds. Research on distributed operating systems has produced ways for employing this high computing potential by dividing the total workload among the available processors. By executing different programs on different processors, the system can have a high throughput. Some system programs (e.g., a file server) may also be distributed among multiple processors, to achieve higher speed and greater reliability. Many user applications can also benefit, for the same reasons. The task of distributing a single user program among multiple processors, however, clearly falls outside the scope of an operating system. Thus, to achieve this distribution, extra effort is required from the applications programmers."
"Today, almost everyone is connected to the Internet and uses different Cloud solutions to store, deliver and process data. Cloud computing assembles large networks of virtualized services such as hardware and software resources. The new era in which ICT penetrated almost all domains (healthcare, aged-care, social assistance, surveillance, education, etc.) creates the need for new multimedia content-driven applications. These applications generate huge amounts of data, which require gathering, processing and then aggregation in a fault-tolerant, reliable and secure heterogeneous distributed system created by a mixture of Cloud systems (public/private), mobile devices networks, desktop-based clusters, etc. In this context dynamic resource provisioning for Big Data application scheduling became a challenge in modern systems."
"The last bug isn't fixed until the last user is dead."
"Given enough eyeballs, all bugs are shallow."
"As soon as we started programming, we found to our surprise that it wasn't as easy to get programs right as we had thought. Debugging had to be discovered. I can remember the exact instant when I realized that a large part of my life from then on was going to be spent in finding mistakes in my own programs."
"Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?"
"silver bullet (SIL-vuhr BOOL-it) noun: A quick solution to a thorny problem. [From the belief that werewolves could be killed when shot with silver bullets.] "Writing code, he (Stuart Feldman) explains, is like writing poetry: every word, each placement counts. Except that software is harder, because digital poems can have millions of lines which are all somehow interconnected. Try fixing programming errors, known as bugs, and you often introduce new ones. So far, he laments, nobody has found a silver bullet to kill the beast of complexity.""
"Beware of bugs in the above code; I have only proved it correct, not tried it."
"Much to the surprise of the builders of the first digital computers, programs written for them usually did not work."
"A documented bug is not a bug; it is a feature."
"bug, n: An elusive creature living in a program that makes it incorrect. The activity of "debugging", or removing bugs from a program, ends when people get tired of doing it, not when the bugs are removed."
"The most effective debugging tool is still careful thought, coupled with judiciously placed print statements."
"Even perfect program verification can only establish that a program meets its specification. […] Much of the essence of building a program is in fact the debugging of the specification."
"If debugging is the process of removing bugs, then programming must be the process of putting them in."
"Testing can only prove the presence of bugs, not their absence."
"From then on, when anything went wrong with a computer, we said it had bugs in it."
"If it walks like a duck and talks like a duck, it's a duck, right? So if this duck is not giving you the noise that you want, you've got to just punch that duck until it returns what you expect."
"The coordination of information technology management presents a challenge to firms with dispersed IT practices. Decentralization may bring flexibility and fast response to changing business needs, as well as other benefits, but decentralization also makes systems integration difficult, presents a barrier to standardization, and acts as a disincentive toward achieving economies of scale. As a result, there is a need to balance the decentralization of IT management to business units with some centralized planning for technology, data, and human resources."
"Since the late 1980s, architecture frameworks have emerged within the federal government, beginning with the publication of the National Institute of Standards and Technology framework in 1989. Subsequently, we issued Enterprise architecture (EA) guidance, and our research of successful public and private sector organizations' IT management practices identified the use of Enterprise architectures as a factor critical to these organizations' success."
"Information? Whose information? Information about what?"
"So, what is quantum mechanics? Even though it was discovered by physicists, it's not a physical theory in the same sense as electromagnetism or general relativity. In the usual "hierarchy of sciences" — with biology at the top, then chemistry, then physics, then math — quantum mechanics sits at a level between math and physics that I don't know a good name for. Basically, quantum mechanics is the operating system that other physical theories run on as application software (with the exception of general relativity, which hasn't yet been successfully ported to this particular OS). There's even a word for taking a physical theory and porting it to this OS: "to quantize." But if quantum mechanics isn't physics in the usual sense — if it's not about matter, or energy, or waves, or particles — then what is it about? From my perspective, it's about information and probabilities and observables, and how they relate to each other."
"I argue that quantum mechanics is fundamentally a theory about the representation and manipulation of information, not a theory about the mechanics of nonclassical waves or particles."
"In my view the most fundamental statement of quantum mechanics is that the wavefunction, or more generally the density matrix, represents our knowledge of the system we are trying to describe. I shall return later to the question "whose knowledge?". It is well known that we have to use a wavefunction if we have a "pure state" i.e. if our knowledge of the system is complete, in the sense that any further knowledge is barred by the uncertainty principle. Failing such complete knowledge we must use a density matrix, which therefore contains both quantum and classical ignorance. The wavefunction is a special case of a density matrix, and I shall here talk about "density matrix" when I mean "wavefunction or density matrix"."
"It from bit. Otherwise put, every it—every particle, every field of force, even the spacetime continuum itself—derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes or no questions, binary choices, bits. It from bit symbolizes the idea that every item of the physical world has at bottom—at a very deep bottom, in most instances—an immaterial source and explanation; that what we call reality arises in the last analysis from the posing of yes-no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and this is a participatory universe."
"Every year the Association for Computing Machinery awards the ACM Turing Award, the computer science equivalent of the Nobel Prize, named for Alan Turing, who gave computer science its foundations in the 1930s. In 1982 the ACM presented the Turing Award to Stephen Cook for his work formulating the P versus NP problem. But one Turing Award for the P versus NP problem is not enough, and in 1985 Richard Karp received the award for his work on algorithms, most notably for the twenty-one NP-complete problems."
"The P versus NP problem was first mentioned in a 1956 letter from Kurt Gödel to John von Neumann, two of the greatest mathematical minds of the twentieth century."
"Nowadays, mathematicians routinely use computers to solve problems, even great problems. Computers are good at arithmetic, but mathematics goes far beyond mere "sums", so putting a problem on a computer is seldom straightforward. Often the hardest part of the work is to convert the problem into one that a computer calculation can solve, and even then the computer may struggle. Many of the great problems that have been solved recently involve little or no work with a computer. Fermat's last theorem and the Poincaré conjecture are examples. When computers have been used to solve great problems, like the four colour theorem or the Kepler conjecture, the computer effectively plays the role of servant. But sometimes the roles are reversed, with mathematics as the servant of computer science. Most of the early work on computer design made good use of mathematical insights, for example the connection between Boolean algebra — an algebraic formulation of logic — and switching circuits, developed in particular by the engineer Claude Shannon, the inventor of information theory. Today, both practical and theoretical aspects of computers rely on the extensive use of mathematics, from many different areas. One of the Clay millennium problems lies in the borderland of mathematics and computer science. It can be viewed both ways: computer science as a servant of mathematics, and mathematics as a servant of computer science. What it requires, and is helping to bring about, is more balanced: a partnership. The problem is about computer algorithms, the mathematical skeletons from which computer programs are made. The crucial concept here is how efficient the algorithm is: how many computational steps it takes to get an answer for a given amount of input data. In practical terms, this tells us how long the computer will take to solve a problem of given size."
"Those who don't understand Unix are condemned to reinvent it, poorly."
"I think the Linux phenomenon is quite delightful, because it draws so strongly on the basis that Unix provided. Linux seems to be among the healthiest of the direct Unix derivatives, though there are also the various BSD systems as well as the more official offerings from the workstation and mainframe manufacturers."