"The way I and most guitarists produce a pinch harmonic is to grasp the pick close to its pointed tip with your thumb and index finger. You then pick a downstroke, intentionally allowing a bit of the fleshy part of the thumb to graze the string at the same time."
"In short, pick squealing, or pinch harmonics is part of the reason why a lot of people started using copious amounts of hairspray and dressing in very tight clothing during the '80s. However, pick squeals, or, in less cool words, pinch harmonics, can be used in a much broader spectrum of ways than just heavy metal, and the technique was originally probably first used by blues players of old."
"When you pick the string just right, a higher pitch other than the fretted note is sounded. This higher pitch is an overtone, or harmonic, that stems from the overtone series related to that note. Indicated by the abbreviation P.H., the pinch harmonic is a fantastic expressive device to use when playing a solo or melody. Much beloved by rock, blues, country and metal guitarists alike, the pinch harmonic has been used to great effect by such legendary axemen as Roy Buchanan, Billy Gibbons, Eddie Van Halen and Zakk Wylde."
"There has always been a good deal of mystery surrounding the pinch harmonic, or, as hip players like to call it, “pick squeal.” A pick squeal is simply an artificial harmonic, or high-pitched sound, produced by choking up on the pick and allowing the thumb or thumbnail to catch the string in just as it is picked. The result, of course, resembles a squeal. Or a squawk. Or a scream. (It could take several tries before you get the desired "s" word.) Anyhow, what was once the domain of blues-rock string benders is now a staple for most metal guitarists."
"The last few decades have provided abundant evidence for physics beyond the two standard models of particle physics and . As is now known, the by far largest part of our universe's matter/energy content lies in the `dark' and consists of and . Despite intensive efforts on the experimental as well as the theoretical side, the origins of both are still completely unknown. Screened scalar fields have been hypothesized as potential candidates for dark energy or dark matter. Among these, some of the most prominent models are the , , and environment-dependent ."
"2. Are there any new elementary scalars not yet discovered with masses below the mass of the -like Higgs boson? For example, do -like particles exist? ... 8. If additional scalars are discovered, how will these discoveries impact the question of the stability of the ? 9. Do neutral (inert) scalars comprise a significant fraction of the dark matter?"
"On July 4, scientists working with data from ongoing experiments at the Large Hadron Collider (LHC) announced the discovery of a new particle "consistent with" the Higgs boson — a subatomic particle also colloquially referred to as the "God particle." After years of design and construction, the LHC first sent protons around its 27 kilometer (17 mile) underground tunnel in 2008. Four years later, the LHC's role in the discovery of the Higgs boson provides a final missing piece for the Standard Model of Particle Physics — a piece that may explain how otherwise massless subatomic particles can acquire mass. Gathered here are images from the construction of the massive $4-billion-dollar machine that allowed us peer so closely into the subatomic world."
"The fundamental scientific purpose of the LHC is to explore the inner structure of matter and the forces that govern its behavior, and thereby understand better the present content of the Universe and its evolution since the Big Bang, and possibly into the future. The unparalleled high energy of the LHC, which is designed to be 7 TeV per proton in each colliding beam, and its enormous collision rate, which is planned to attain about a billion collisions per second, will enable the LHC to examine rare processes occurring at very small distances inside matter. It will be a microscope able to explore the inner structure of matter on scales an order of magnitude smaller than any previous collider. The energies involved in the proton-proton collisions will be similar to those in particle collisions in the first trillionth of a second of the history of the Universe. By studying these processes in the laboratory, the LHC experiments will, in a sense, be looking further back into time than is possible with any telescope."
"With the discovery of the Higgs boson, the next burning question at the LHC is why its mass is so low. Nobody knows the answer to that question, but it is definitely the next hot topic for LHC physicists ..."
"… The relied not on the detection of photon pairs with a certain energy but on the detection of more of those pairs than expected. That reliance on probabilities is why the L.H.C. and other major collider experiments often have independent teams, working with separate detectors, analyzing the same types of collisions—to avoid biasing each other. It is also the reason for the ."
"Standard cosmological models rely on an approximate treatment of gravity, utilizing solutions of the linearized Einstein equations as well as physical approximations. In an era of precision cosmology, we should ask: are these approximate predictions sufficiently accurate for comparison to observations, and can we draw meaningful conclusions about properties of our Universe from them? In this work we examine the accuracy of linearized gravity in the presence of collisionless matter and a cosmological constant utilizing fully general relativistic simulations. We observe the gauge dependence of corrections to linear theory, and note the amplitude of these corrections. For perturbations whose amplitudes are in line with expectations from the standard , we find that the full, general relativistic metric is well described by linear theory in Newtonian and harmonic gauges, while the metric in comoving-synchronous gauge is not. For the most extreme observed structures in our Universe, such as supervoids, our results suggest that corrections to linear gravitational theory can reach or surpass the percent level in all gauges."
"Both classical and quantum corrections can be analyzed by a perturbative approach based on the so-called weak field approximation. The heart of Einstein’s theory is represented by its ten coupled partial differential equations. The solutions of these equations, i.e., the gravitational potentials, are the metric tensor components. In the weak field approximation, the metric tensor can be decomposed in two terms: the flat Minkowski metric and the small perturbation multiplied by the gravitational constant. The solution of the non-linear equations can be considered the sum of infinite terms, and Newton’s theory emerges in the linear order. ... used this approximation to investigate the infinities that emerge by applying the early QFT techniques to quantize the gravitational interaction (Rosenfeld, 1930). Different kinds of divergent quantities had already appeared in the context of QED at the end of the 1920s: their correct treatment will be clarified only after the Second World War."
"It is a basic fact of life that Nature comes to us in many scales. Galaxies, planets, aardvarks, molecules, atoms and nuclei are very different sizes, and are held together with very different binding energies. Happily enough, it is another fact of life that we don’t need to understand what is going on at all scales at once in order to figure out how Nature works at a particular scale. Like good musicians, good physicists know which scales are relevant for which compositions. The mathematical framework which we use to describe nature — — itself shares this basic feature of Nature: it automatically limits the role which smaller distance scales can play in the description of larger objects. This property has many practical applications, since a systematic identification of how scales enter into calculations provides an important tool for analyzing systems which have two very different scales, m ≪ M. In these systems it is usually profitable to expand quantities in the powers of the small parameter, m/M, and the earlier this is done in a calculation, the more it is simplified."
"What makes a field theory effective? We shall argue in this book that the way s are set up in EFTs makes them the most natural and convenient tools to address multi scale problems. Problems with separated scales often appear in Nature, and we intuitively know that it is most convenient to only work with degrees of freedom that are relevant for a particular scale — otherwise the problem quickly becomes intractable! You never worry about physics of the atoms when designing bridges, nor try to track each and every molecule of a gas through ; you instead define some "macroscopic" variables, and once you know how to relate those variables to the more "fundamental" laws, you can stop thinking about those cases and focus only on the relevant large-scale physics. EFT techniques codify this principle when working with problems in quantum field theory."
"... Imagine you have an image with enormous resolution, but all you really need to know is whether a giant gorilla sits at the center of the image. To that end, a low-resolution picture would suffice. Effective-field theory is the tool a nuclear physicist would use to controllably blur the picture, reduce its complexity, and make the problem computationally tractable. Effective-field theory averages out interactions’ short-range components not relevant to the physics of nuclei, and it provides a form of the force using parameters that can be determined directly from QCD. The resulting NN force can then be used to solve the nuclear many-body problem and calculate all relevant nuclear properties"
"Vacuum expectation values of products of neutral operators are discussed. The properties of these distributions arising from , the absence of states and the of the scalar product are determined. The vacuum expectation values are shown to be s of s. Local commutativity of the field is shown to be equivalent to a symmetry property of the analytic functions. The problem of determining a theory of a neutral scalar field given its vacuum expectation values is posed and solved."
"A new conceptual foundation for Tμν on locally flat —to obtain the so-called Casimir effect—is presented. The Casimir ground state is viewed locally as a (nonvacuum) state on Minkowski space-time and the expectation value of the normal-ordered is taken. The same ideas allow us to treat, for the first time, self-interacting fields for arbitrary mass in —using traditional flat-space-time renormalization theory. First-order results for zero-mass λφ4 theory agree with those recently announced by . We point out the crucial role played by the simple renormalization condition that the vacuum expectation value of Tμν must vanish in Minkowski space-time, and in a critical discussion of other approaches, we clarify the question of renormalization ambiguities for Tμν in curved space-times."
"I remember a lunch in which Schwinger began by saying to Weisskopf, “Now I will make you a world.” The “world” was written down on a few paper napkins, one of which I saved. In any event, one of the things that he said, which has stuck with me ever since, was that scalar particles were the only ones that could have nonvanishing vacuum expectation values. He then went on to say that if you couple one of these to a fermion \Psi by a of the form \Phi \overline \Psi\Psi, then this vacuum expectation value would act like a mass. This sort of coupling is how mass generation is done in principle for the fermions. All particles in this picture would acquire their masses from the vacuum."
"Each solstice shows us that we can choose. We cannot stop the winter or the summer from coming. We cannot stop the spring or the fall or make them other than they are. They are gifts from the Universe that we cannot refuse. But we can choose what we will contribute to Life when each arrives."
"The winter solstice has always been special to me as a barren darkness that gives birth to a verdant future beyond imagination, a time of pain and withdrawal that produces something joyfully inconceivable, like a monarch butterfly masterfully extracting itself from the confines of its cocoon, bursting forth into unexpected glory."
"Bioelectricity is about the electrical phenomena of life processes, and is a parallel to the medical subject electrophysiology. One basic mechanism is the energy consuming cell membrane ion pumps polarising a cell, and the action potential generated if the cell is excited and ion channels open. The dipolarisation process generates current flow also in the extracellular volume, which again results in measurable biopotential differences in the tissue. An important part of the subject is intracellular and extracellular single cell measurements with microelecroeds. Single neuron activity and signal transmission can be studied by recording potentials with multiple microelectrode arrays. In addition to measure on endogenic sources, bioelectricty also comprises the use of active stimulating current carrying (CC) electrodes. Since bioelectricity is about life processes the experiments are per definition in vivo or ex vivo."
"Waste biomass is a cheap and relatively abundant source of electrons for microbes capable of producing electrical current outside the cell. Rapidly developing microbial electrochemical technologies, such as microbial fuel cells, are part of a diverse platform of future sustainable energy and chemical production technologies. We review the key advances that will enable the use of exoelectrogenic microorganisms to generate biofuels, hydrogen gas, methane, and other valuable inorganic and organic chemicals. Moreover, we examine the key challenges for implementing these systems and compare them to similar renewable energy technologies. Although commercial development is already underway in several different applications, ranging from wastewater treatment to industrial chemical production, further research is needed regarding efficiency, scalability, system lifetimes, and reliability."
"… once a healthy cell sort of abandons ship and decides that it's going to just be, like, a ravenous, invasive cancer cell, its voltage changes radically. And what you can do with an ion channel drug is change the electrical state of that cell by messing with the ion channels. And in tadpole experiments — and this is early days, but this is moving really fast. In tadpole experiments, they were able to use ion channel drugs to keep cells that had been genetically engineered to be tumors from changing their electrical voltage, right? And without doing any kind of genetic mucking around, they kept these tumors from forming in tadpoles that had been genetically engineered to express tumors."
"The plasma membrane is a heterogeneous structure whose thickness ia around 75 Å and which bounds the cell. An important constituent is lipid, which often represents as much as 70% of the membrane volume (depending on cell type). The membrane lipid readily excludes the passage of ions; it remains for imbedded proteins to form the channels which permit exchange of ions between intracellular and extracellular space. For nerve and muscle, electrical activation is associated with the movement of sodium and potassium (and other) ions across membranes by means of these channels; the proteins not only facilitate the flow of each ion but they control the flow of each giving rise to the ' of the membrane."
"As Oliver Cromwell said to the General Assembly of the Church of Scotland, "I beseech you, in the bowels of Christ, think it possible that you might be mistaken." Life and the affairs of the living are so tangled, the world not only stranger than we imagine but stranger than we can imagine, that all questions are conundrums, no answers "correct." Is it certain that parallel lines never meet? No. Does water freeze at 32 degrees Fahrenheit? Only probably. Shall I marry? Who can say? And yet the world's work must be done. One Oblomov is enough. Thus we learn a conventional certitude, acting as though all were light by blinking the shadow. A simple proof demonstrates that parallel lines do meet, but, on the assumption that they do not, the architect builds the skyscraper. Despite his knowledge of statistical mechanics, the engineer designs the refrigerator to maintain a constant temperature of 31 degrees. Le cœur a ses raisons que la raison ne connait pas [the heart has its reasons that reason does not know], and families are raised."
"Thermodynamics is more like a mode of reasoning than a body of physical law. ...we can think of thermodynamics as a certain pattern of arrows that occurs again and again in very different physical contexts, but, wherever this pattern of explanation occurs, the arrows can be traced back by the methods of statistical mechanics to deeper laws and ultimately to the principles of elementary particle physics. ...the fact that a scientific theory finds applications to a wide variety of different phenomena does not imply anything about the autonomy of this theory from deeper physical laws."
"Maxwell, and then Boltzmann, and then... J. Willard Gibbs consequently expended enormous intellectual effort in devising... statistical mechanics, or... . The uses... extend far beyond gases... describing electric and magnetic interactions, chemical reactions, phase transitions... and all other manner of exchanges of matter and energy. The success... has driven the belief among many physicists that it could be applied with similar success to society. ...[E]verything from the flow of funds in the stock market to the flow of traffic on interstate highways ..."
"The only important variables of interest must involve averaging over many of the degrees of freedom. Statistical mechanics is the formalization of this intuitive concept. The problems to be addressed... are threefold: under what circumstances can the properties of a physical system be defined by the behavior of an appropriate small set of variables, what are the appropriate sets of relevant variables, and how can one calculate the properties of the system in terms of these variables."
"Carnot's Principle. ...If physical phenomena were due exclusively to the movements of atoms whose mutual attraction depended only on the distance, it seems that all these phenomena should be reversible; if all the initial velocities were reversed, these atoms, always subjected to the same forces, ought to go over their trajectories in the contrary sense, just as the earth would describe in the retrograde sense this same elliptic orbit which it describes in the direct sense, if the initial conditions of its motion had been reversed. On this account, if a physical phenomenon is possible, the inverse phenomenon should be equally so, and one should be able to reascend the course of time. Now, it is not so in nature, and this is precisely what the principle of Carnot teaches us; heat can pass from the warm body to the cold body; it is impossible afterward to make it take the inverse route and to reestablish differences of temperature which have been effaced. Motion can be wholly dissipated and transformed into heat by friction; the contrary transformation can never be made except partially. We have striven to reconcile this apparent contradiction. If the world tends toward uniformity, this is not because its ultimate parts, at first unlike, tend to become less and less different; it is because, shifting at random, they end by blending. For an eye which should distinguish all the elements, the variety would remain always as great; each grain of this dust preserves its originality and does not model itself on its neighbors; but as the blend becomes more and more intimate, our gross senses perceive only the uniformity. This is why for example, temperatures tend to a level, without the possibility of going backwards. 
A drop of wine falls into a glass of water; whatever may be the law of the internal motion of the liquid, we shall soon see it colored of a uniform rosy tint, and however much from this moment one may shake it afterwards, the wine and the water do not seem capable of again separating. Here we have the type of the irreversible physical phenomenon: to hide a grain of barley in a heap of wheat, this is easy; afterwards to find it again and get it out, this is practically impossible. All this Maxwell and Boltzmann have explained; but the one who has seen it most clearly, in a book too little read because it is a little difficult to read, is Gibbs, in his 'Elementary Principles of Statistical Mechanics.’"
"I thought of calling it 'information,' but... Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'"
"The way he taught statistical mechanics and electromagnetic theory, you got the feeling of a growing science that emerged out of conflict and debate. It was alive, like his lectures, which were full of personal references to men like Boltzmann, Klein, Ritz, Abraham, and Einstein. He told us at the beginning that we should teach ourselves in a fortnight—no babying. Ehrenfest's students all acknowledge how much his method of exposition has influenced their own teaching."
"The idea behind the Feynman path integral goes back to a paper by P. A. M. Dirac published in 1933 in Physikalische Zeitschrift der Sowjetunion. It formed the core of Richard Feynman’s space–time approach to quantum mechanics and quantum electrodynamics. Although the path integral was not mathematically well defined, it was widely used in quantum field theory, statistical mechanics, and string theory. Recently, path integrals have been the guide to spectacular developments in pure mathematics."
"As the natural sciences have developed to encompass increasingly s, scientific rationality has become ever more statistical, or probabilistic. The deterministic classical mechanics of the enlightenment was revolutionized by the near-equilibrium statistical mechanics of late 19th century atomists, by quantum mechanics in the early 20th century, and by the far-from-equilibrium complexity theorists of the later 20th century. Mathematical , information theory, and quantitative social sciences compounded the trend. Forces, objects, and natural types were progressively dissolved into statistical distributions: heterogeneous clouds, entropy deviations, s, gene frequencies, noise-signal ratios and redundancies, dissipative structures, and complex systems at the edge of chaos."
"Another crucial point is that MOND as we know it now is arguably only an approximate 'effective field theory' that approximates some more fundamental scheme at a deeper stratum — some 'FUNDAMOND' — conceptually, in a similar way to thermodynamics being an approximation of the statistical-mechanics, microscopic description."
"There is an interesting analogy... with the philosophy of the natural sciences, which has flourished under the combined influence of both general methodology and classical metaphysical questions (realism vs. antirealism, space, time, causation, etc.) interacting with detailed case studies in... (physics, biology, chemistry, etc.)... [C]ase studies both historical (studies of Einstein's relativity, Maxwell's electromagnetic theory, statistical mechanics, etc.). By contrast, with few exceptions, philosophy of mathematics has developed without the corresponding detailed case studies."
"The kinetic theory of gases is a small branch of physics which has passed from the stage of excitement and novelty into staid maturity. ...Formerly it was hoped that the subject of gases would ultimately merge into a general kinetic theory of matter; but the theory of condensed phases... today, involves an elaborate and technical use of wave mechanics, and for this reason it is best treated as a subject in itself. The scope of the present book is, therefore, the traditional kinetic theory of gases. ...[A]n account has been included of the wave-mechanical theory, and especially of the degenerate Fermi-Dirac case... There is also a concise chapter on statistical mechanics, which... may be of use as an introduction... [T]he discussion of electrical phenomena has been abbreviated... the latter voluminous subject is best treated separately. ...[F]undamental parts have been explained... [as] to be within the reach of college juniors and seniors. The... wave mechanics and statistical mechanics... are of graduate grade. ...[A] number of carefully worded theorems have been inserted in the guise of problems, without proof... to give... a chance to apply... lines of attack exemplified in the text. To facilitate use as a reference book, definitions have been repeated freely, I hope not ad nauseam. ...Ideas have been drawn freely from ...books such as ...of Jeans and Loeb..."
"[[Game theory|[G]ame theory]] has already established itself as an essential tool in the , where it is widely regarded as a unifying language for investigating human behavior. Game theory's prominence in evolutionary biology builds a natural bridge between the life sciences and the behavioral sciences. And connections have been established between game theory and the two most prominent pillars of physics: statistical mechanics and quantum theory. ...[M]any physicists, neuroscientists, and social scientists... are... pursuing the dream of a quantitative science of human behavior. Game theory is showing signs of... an increasing important role in that endeavor. It's a story of exploration along the shoreline separating the continent of knowledge from an ocean of ignorance... a story worth telling."
"The path integral is a formulation of quantum mechanics equivalent to the standard formulations, offering a new way of looking at the subject which is, arguably, more intuitive than the usual approaches. Applications of path integrals are as vast as those of quantum mechanics itself, including the quantum mechanics of a single particle, statistical mechanics, condensed matter physics and quantum field theory. ... It is in quantum field theory, both relativistic and nonrelativistic, that path integrals (functional integrals is a more accurate term) play a much more important role, for several reasons. They provide a relatively easy road to quantization and to expressions for s, which are closely related to amplitudes for physical processes such as scattering and decays of particles. The path integral treatment of gauge field theories (non-abelian ones, in particular) is very elegant: and ghosts appear quite effortlessly. Also, there are a whole host of nonperturbative phenomena such as solitons and that are most easily viewed via path integrals. Furthermore, the close relation between statistical mechanics and quantum mechanics, or and quantum field theory, is plainly visible via path integrals."
"The need for a fundamentally different approach to the study of physical processes at the molecular level motivated the development of relevant statistical methods, which turned out to be applicable not only to the study of molecular processes (statistical mechanics), but to a host of other areas such as the actuarial profession, design of large telephone exchanges, and the like. In statistical methods, specific manifestations of microscopic entities (molecules, individual telephone sites, etc.) are replaced with their statistical averages, which are connected with appropriate macroscopic variables. The role played in Newtonian mechanics by the calculus, which involves no uncertainty, is replaced in statistical mechanics by ', a theory whose very purpose is to capture uncertainty of a certain type."
"With the growing importance of models in statistical mechanics and in field theory, the path integral method of Feynman was soon recognized to offer frequently a more general procedure of enforcing the instead of the Schrödinger equation. To what extent the two methods are actually equivalent, has not always been understood... [T]here are few nontrivial models which permit deeper insight into their connection. However, the exactly solvable cases... the Coulomb potential and the harmonic oscillator... point the way: For scattering problems the path integral seems particularly convenient, whereas for the calculation of discrete eigenvalues the Schrödinger equation [is preferable]. ...[P]otentials with degenerate vacua ...arise ...in recently studied models of large spins."
"You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage."
"The rapid development of quantum mechanics stimulated research in and theory. Initiated during the mid-twenties, intensive study of s and their representations led to Haar's discovery of the basic construction of invariant integration on a topological group. Bohr's theory of s influenced the work of Wiener, Bochner and many other analysts. They enriched the technical arsenal of harmonic analysis and the scope of its applications (statistical mechanics, ergodic theory, , etc.) The new notion of the generalized made it possible to consider Plancherel's theory simultaneously with Bohr's theory, the continuous spectrum with the discrete. The Pontrjagin-van Kampen duality opened the way for an unobstructed development of on locally compact s, allowing , Fourier integrals and expansions via numerical characters to be viewed as objects of the same kind. The Peter–Weyl theory made it possible for von Neumann to analyze almost periodic functions on groups by connecting them to group representation theory. Along with the many other discoveries of that period, this led to the inclusion of group theorethical methods into the tool kit of harmonic analysis."
"In the history of Science it is possible to find many cases in which the tendency of Mathematics to express itself in the most abstract forms has proved to be of ultimate service in the physical order of ideas. Perhaps the most striking example is to be found in the development of abstract Dynamics. The greatest treatise which the world has seen, on this subject, is Lagrange's Mécanique Analytique, published in 1788. ...conceived in the purely abstract Mathematical spirit ...Lagrange's idea of reducing the investigation of the motion of a dynamical system to a form dependent upon a single function of the of the system was further developed by Hamilton and Jacobi into forms in which the equations of motion of a system represent the conditions for a stationary value of an of a single function. The extension by Routh and Helmholtz to the case in which "ignored co-ordinates" are taken into account, was a long step in the direction of the desirable unification which would be obtained if the notion of were removed by means of its interpretation as dependent upon the of concealed motions included in the dynamical system. The whole scheme of abstract Dynamics thus developed upon the basis of Lagrange's work has been of immense value in theoretical Physics, and particularly in statistical Mechanics... But the most striking use of Lagrange's conception of generalized co-ordinates was made by Clerk Maxwell, who in this order of ideas, and inspired on the physical side by... Faraday, conceived and developed his dynamical theory of the , and obtained his celebrated equations. The form of Maxwell's equations enabled him to perceive that oscillations could be propagated in the electromagnetic field with the velocity of light, and suggested to him the Electromagnetic theory of light. 
Heinrich Herz, under the direct inspiration of Maxwell's ideas, demonstrated the possibility of setting up electromagnetic waves differing from those of light only in respect of their enormously greater length. We thus see that Lagrange's work... was an essential link in a chain of investigation of which one result... gladdens the heart of the practical man, viz. ."
"[I]n the nineteenth century, even the could be reduced to mechanics by the assumption that heat really consists of a complicated statistical motion of the smallest parts of matter. By combining the concepts of the mathematical theory of probability with the concepts of Newtonian mechanics Clausius, Gibbs and Boltzmann were able to show that the fundamental laws in the theory of heat could be interpreted as statistical laws following from Newton's mechanics when applied to very complicated mechanical systems."
"In the history of sciences, important advances often come from... the recognition that two hitherto separate observations can be viewed from a new angle and seen to represent nothing but different facets of one phenomenon. Thus, terrestrial and celestial mechanisms became a single science with Newton's laws. Thermodynamics and mechanics were unified through statistical mechanics, as were optics and electromagnetism through Maxwell's theory of magnetic field, or chemistry and through quantum mechanics. Similarly different combinations of the same atoms, obeying the same laws, were shown by biochemists to compose both the inanimate and animate worlds. ... Despite such generalizations, however, large gaps remain... Following the line from physics to sociology, one goes from simpler to the more complex objects... from the poorer to the richer empirical content, as well as from the harder to the softer system of hypotheses and experimentation. ...Because of the hierarchy of objects, the problem is always to explain the more complex in terms and concepts applying to the simpler. This is the old problem of reduction, emergence, whole and parts... an understanding of the simple is necessary to understand the more complex, but whether it is sufficient is questionable. ...the appearance of life and later of thought and language—led to phenomena that previously did not exist... To describe and to interpret these phenomena new concepts, meaningless at the previous level, are required. ...At the limit total reductionism results in absurdity. ...explaining democracy in terms of the structure and properties of elementary particles... is clearly nonsense."
"Ludwig Boltzmann, who spent much of his life studying statistical mechanics, died in 1906, by his own hand. Paul Ehrenfest, carrying on the work, died similarly in 1933. Now it is our turn to study statistical mechanics. Perhaps it will be wise to approach the subject cautiously."
"The Schrödinger equation, which is at the heart of quantum theory, is applicable in principle to both microscopic and macroscopic regimes. Thus, it would seem that we already have in hand a non-classical theory of macroscopic dynamics, if only we can apply the Schrödinger equation to the macroscopic realm. However, this possibility has been largely ignored in the literature because the current statistical interpretation of quantum mechanics presumes the classicality of the observed macroscopic world to start with. But the Schrödinger equation does not support this presumption. The state of superposition never collapses under Schrödinger evolution."
"In the consistent-histories approach, the classical limit can be studies by using appropriate subspaces of the quantum as a "coarse graining," analogous to dividing up into nonoverlapping cells in classical statistical mechanics. This coarse graining can then be used to construct quantum histories. It is necessary to show that the resulting family of histories is consistent, so that the probabilities assigned by make good quantum mechanical sense. Finally, one needs to show that the resulting quantum dynamics is well approximated by appropriate classical equations."
"Maurice Goldhaber has emphasized that the situation with respect to possible nuclear resonances in (γ,n) or (γ,fission) reactions was quite unclear at the time of George C. Baldwin and G. Stanley Klaiber’s papers on these reactions. ... This was because the rapid rise of their yield to a prominent peak with increasing energy, followed by a slower fall off was then thought to have been due to the competition between the rapidly rising density of nuclear states and the eventual domination of other reaction channels at higher energies. Goldhaber realized, however, that there could be an analogy between a possible collective nuclear resonance and the restrahl resonance (essentially the transverse optical phonon mode) in polar crystals. Goldhaber sought out Teller because of his paper with Russell Lyddane and Robert Sachs, ... relating the restrahl frequency to the asymptotic behavior of the crystal’s dielectric function. Goldhaber and Teller, in their paper together, went on to predict universal, giant photo-nuclear resonances. ..."
"A powerful method to study the properties of a system is to subject it to a weak external perturbation and to examine its response. For the atomic nucleus subjected to the absorption of a photon or to the scattering of a particle (electron, proton, etc.) the response is ... a function of the energy and linear momentum transferred to the system. ... Up to about 10 MeV the nucleus responds through the excitation of relatively simple states often involving only one or a few particles. In the energy range between 10 and 30 MeV the system response exhibits broad resonances. These are the giant resonances ... Giant resonances correspond to a collective motion involving many if not all the particles in the nucleus. The occurrence of such a collective motion is a common feature of many-body quantum systems. In quantum-mechanical terms the resonance corresponds to a transition between the ground state and the collective state and its strength is described by a transition amplitude. Intuitively it is clear that the strength of the transition will depend on the basic properties of the system such as the number of particles participating in the response and the size of the system. This implies that the total transition strength should be limited by a sum rule which depends 'only' on ground-state properties. If the transition strength of an observed resonance exhausts a major part, say greater than 50%, of the corresponding sum rule we call it a giant resonance."