Friday, March 16, 2012

Fred Kavli and His Institutes

"The curiosity of the human being is what has brought us where we are today, and I have complete confidence that it will take us where we need to be in the future. " ... Fred Kavli

Poster at The M.I.T. Kavli Institute for Astrophysics and Space Science in Cambridge, MA

Fred Kavli (born 1927) is a Norwegian-born, naturalized American physicist, business leader, inventor, and philanthropist. He was born in the village of Eresfjord, Nesset municipality in Møre og Romsdal county, Norway. Today Kavli lives in Santa Barbara, California, in the United States. He established The Kavli Foundation in 2000 to support basic scientific research. He has only recently appeared in the mainstream media for his work, primarily his philanthropic efforts. He is divorced and has two grown children. An avid art collector, Kavli has gathered a large collection of Norwegian oil paintings.



Kavli's life

Kavli grew up on the family farm in the tiny Norwegian village of Eresfjord (pop. 450).
At 14, together with his brother Aslak, he began his first enterprise, creating wood pellet fuel for cars. This was during the Second World War and the Nazi occupation of Norway.
Inspired by his father's 13 years in San Francisco, the young Kavli wanted to move to the US. Three days after he received his engineering physics degree from the Norwegian Institute of Technology (NTH) in Trondheim, he left for America on the SS Stavangerfjord.
Because he had no job or sponsor waiting for him, his visa application was initially rejected, and so in 1955 he immigrated to Montreal, Canada, instead. The following year his visa was approved and he moved to the United States. He found work as an engineer for a Los Angeles business that developed feedback flight controls for Atlas missiles, where he rose to the position of Chief Engineer.
Looking to start his own business he advertised in the Los Angeles Times newspaper soliciting financial backers with the simple but effective text "Engineer seeking financial backing to start own business".
Two years later he had founded the Kavlico Corporation, located in Moorpark, California. Under his leadership, the company became one of the world's largest suppliers of sensors for aeronautic, automotive, and industrial applications, supplying, among others, General Electric and the Ford Motor Company. In 2000 he sold Kavlico for $345 million to C-Mac Industries Inc.; Kavlico is today owned by the French company Schneider Electric. Much of Kavli's wealth is a result of his real estate investments in Southern California. As a philanthropist, Kavli subsequently established The Kavli Foundation and has dedicated much of his wealth to funding research institutions and programs worldwide.
On June 19, 2006, he was appointed Grand Officer, Commander with Star, of the Royal Norwegian Order of Merit by King Harald V of Norway[1] in recognition of his work on behalf of Norway and humanity. In 2008, he was also awarded an honorary doctorate, Doctor Honoris Causa, by the Norwegian University of Science and Technology, in recognition of his work to the benefit and advancement of science and research.[2] Kavli is a Fellow of the American Academy of Arts and Sciences.[3] He is also a former member of the U.S. President’s Council of Advisors on Science and Technology and a former member of the University of California President’s Board on Science and Innovation. In 2009, Mr. Kavli received an honorary Doctor of Science degree from Northwestern University.[4] In 2011 he received the Bower Award for Business Leadership from the Franklin Institute,[5] one of the oldest science education centers in the United States, and the Carnegie Medal of Philanthropy, which is given biennially to one or more individuals who, like Andrew Carnegie, have dedicated their private wealth to the public good and who have sustained impressive careers as philanthropists.[6] In 2011, Mr. Kavli was also conferred the degree of doctor philosophiae honoris causa by the University of Oslo.[7]
Kavli is a Trustee of the University of California, Santa Barbara (UCSB) Foundation. In addition to supporting scientific research and education, his philanthropic activities include the Fred Kavli Theatre for Performing Arts at the Thousand Oaks Civic Arts Plaza in California, as well as other projects.

Kavli Prizes

Through The Kavli Foundation, Kavli established scientific prizes in the fields of Astrophysics, Nanoscience, and Neuroscience. The Kavli Prizes are presented in cooperation with the Norwegian Academy of Science and Letters and the Norwegian Ministry of Education and Research, and have been awarded biennially at a ceremony in Oslo since 2008.[8] Each prize consists of a scroll, a gold medal, and $1,000,000 in cash.
Kavli chose to focus on these three areas of interest – "from the biggest, to the smallest, to the most complex" – because he thinks these fields are the most exciting scientific fields for the 21st century with potentially great benefits.[9] Kavli has also noted his intent that the Prizes distinguish themselves from the Nobel prizes in science.[10] Consequently, one key distinction between the prizes: Kavli Prize laureates are selected by committees composed of distinguished international scientists. These committee members are recommended by the Chinese Academy of Sciences, the French Academy of Sciences, the Max Planck Society, the U.S. National Academy of Sciences and The Royal Society, with committee chairs chosen by the Norwegian Academy of Science and Letters.[11]
The first Kavli Prize winners were announced on May 28, 2008, simultaneously in Oslo and at the opening of the World Science Festival in New York City. The first Kavli Prize for astrophysics was awarded to Maarten Schmidt and Donald Lynden-Bell; Louis E. Brus and Sumio Iijima shared the nanoscience prize, while Pasko Rakic, Thomas Jessell, and Sten Grillner were awarded the neuroscience prize.[12] The four US winners of the Kavli Prize were honored by President George W. Bush and Science Advisor Dr. John Marburger at an Oval Office reception in the White House on November 12, 2008.[13] (See Kavli Prize for laureates in subsequent years.)

The Kavli Foundation

The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.
The Foundation's mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.
The Kavli Foundation was established in December 2000 by its founder and benefactor, Fred Kavli, a prominent California business leader and noted philanthropist. The foundation is currently actively involved in establishing major research institutes at leading universities and institutions in the United States, Europe, and Asia.
The Kavli Foundation has made grants to establish Kavli Institutes on the campuses of the University of California, Santa Barbara; Stanford University; the California Institute of Technology; the University of Chicago; Columbia University; Yale University; New York University; Cornell University; the University of California, San Diego; Delft University of Technology in the Netherlands; the Massachusetts Institute of Technology; Peking University; the Chinese Academy of Sciences; Harvard University; the University of Cambridge; and the Norwegian University of Science and Technology. These institutions are the current beneficiaries of the Kavli Foundation, and the list is likely to grow.
In addition to the Kavli Institutes, six Kavli professorships have been established: two at the University of California, Santa Barbara; one at the University of California, Los Angeles; one at the University of California, Irvine; one at Columbia University; and one at the California Institute of Technology.

Kavli Institutes

The Kavli Foundation has established research institutes at leading universities worldwide. Consistent with its business-like approach, the foundation requires each partner university to match its average $7.5 million donation. The institutes are not required to focus on any specific subject but are free to pursue whatever basic research they see fit.
Three researchers associated with the Kavli institutes have been awarded Nobel prizes: David Gross, Frank Wilczek, and Richard Axel.
As of March 2008, there are 15 institutes in the United States, 2 in China, 1 in the Netherlands, 1 in Norway, and 1 in the United Kingdom.[14] According to the Foundation, there might eventually be as many as 20 centres. The Institute for Physics and Mathematics of the Universe in Tokyo has also received an endowment to set up a Kavli institute from April 1, 2012.[15]
The fifteen Kavli Institutes are:

Astrophysics

  • Kavli Institute for Particle Astrophysics and Cosmology at Stanford University
  • Kavli Institute for Cosmological Physics at the University of Chicago
  • Kavli Institute for Astrophysics and Space Research at the Massachusetts Institute of Technology
  • Kavli Institute for Cosmology at the University of Cambridge
  • Kavli Institute for Astronomy and Astrophysics at Peking University in China

Nanoscience

Neuroscience

Theoretical physics

  • Kavli Institute for Theoretical Physics at the University of California, Santa Barbara
  • Kavli Institute for Theoretical Physics China at the Chinese Academy of Sciences
  • Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU) [16]

Quotes

The curiosity of the human being is what has brought us where we are today, and I have complete confidence that it will take us where we need to be in the future.

References


Monday, March 5, 2012

The History of Entropy


The concept of entropy developed in response to the observation that a certain amount of functional energy released from combustion reactions is always lost to dissipation or friction and is thus not transformed into useful work. Early heat-powered engines such as Thomas Savery's (1698), the Newcomen engine (1712), and the Cugnot steam tricycle (1769) were inefficient, converting less than two percent of the input energy into useful work output; a great deal of useful energy was dissipated or lost into what seemed like a state of immeasurable randomness. Over the next two centuries, physicists investigated this puzzle of lost energy; the result was the concept of entropy.
In the early 1850s, Rudolf Clausius set forth the concept of the thermodynamic system and posited the argument that in any irreversible process a small amount of heat energy δQ is incrementally dissipated across the system boundary. Clausius continued to develop his ideas of lost energy, and coined the term entropy.
Since the mid-20th century the concept of entropy has found application in the analogous field of data loss in information transmission systems.


Classical thermodynamic views

In 1803, mathematician Lazare Carnot published a work entitled Fundamental Principles of Equilibrium and Movement. This work includes a discussion on the efficiency of fundamental machines, i.e. pulleys and inclined planes. Lazare Carnot saw through all the details of the mechanisms to develop a general discussion on the conservation of mechanical energy. Over the next three decades, Lazare Carnot’s theorem was taken as a statement that in any machine the accelerations and shocks of the moving parts all represent losses of moment of activity, i.e. the useful work done. From this Lazare drew the inference that perpetual motion was impossible. This loss of moment of activity was the first-ever rudimentary statement of the second law of thermodynamics and the concept of 'transformation-energy' or entropy, i.e. energy lost to dissipation and friction.[1]
Lazare Carnot died in exile in 1823. During the following year Lazare’s son Sadi Carnot, having graduated from the École Polytechnique training school for engineers, but now living on half-pay with his brother Hippolyte in a small apartment in Paris, wrote the Reflections on the Motive Power of Fire. In this paper, Sadi visualized an ideal engine in which heat (i.e., caloric) converted into work could be reinstated by reversing the motion of the cycle, a concept subsequently known as thermodynamic reversibility. Building on his father's work, Sadi postulated the concept that “some caloric is always lost”, not being converted to mechanical work. Hence no real heat engine could realize the Carnot cycle's reversibility, and all were condemned to be less efficient. This lost caloric was a precursory form of entropy loss as we now know it. Though formulated in terms of caloric rather than entropy, this was an early insight into the second law of thermodynamics.

1854 definition

Rudolf Clausius - originator of the concept of "entropy"
In his 1854 memoir, Clausius first develops the concepts of interior work, i.e. that "which the atoms of the body exert upon each other", and exterior work, i.e. that "which arise from foreign influences [to] which the body may be exposed", which may act on a working body of fluid or gas, typically functioning to work a piston. He then discusses the three categories into which heat Q may be divided:
  1. Heat employed in increasing the heat actually existing in the body.
  2. Heat employed in producing the interior work.
  3. Heat employed in producing the exterior work.
Building on this logic, and following a mathematical presentation of the first fundamental theorem, Clausius then presented the first-ever mathematical formulation of entropy, although at this point in the development of his theories he called it "equivalence-value", perhaps referring to the concept of the mechanical equivalent of heat which was developing at the time rather than entropy, a term which was to come into use later.[2] He stated:[3]
the second fundamental theorem in the mechanical theory of heat may thus be enunciated: If two transformations which, without necessitating any other permanent change, can mutually replace one another, be called equivalent, then the generation of the quantity of heat Q from work at the temperature T has the equivalence-value:
 \frac {Q}{T}
and the passage of the quantity of heat Q from the temperature T1 to the temperature T2, has the equivalence-value:
 Q \left( \frac {1}{T_2} - \frac {1}{T_1}\right)
wherein T is a function of the temperature, independent of the nature of the process by which the transformation is effected.
In modern terminology, we think of this equivalence-value as "entropy", symbolized by S. Thus, using the above description, we can calculate the entropy change ΔS for the passage of the quantity of heat Q from the temperature T1, through the "working body" of fluid (see heat engine), which was typically a body of steam, to the temperature T2 as shown below:
Diagram of Sadi Carnot's heat engine, 1824
If we make the assignment:
 S= \frac {Q}{T}
Then, the entropy change or "equivalence-value" for this transformation is:
 \Delta S = S_{\rm final} - S_{\rm initial} \,
which equals:
 \Delta S = \left(\frac {Q}{T_2} - \frac {Q}{T_1}\right)
and by factoring out Q, we have the following form, as was derived by Clausius:
 \Delta S = Q\left(\frac {1}{T_2} - \frac {1}{T_1}\right)
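Clausius's factored form lends itself to a quick numerical check. The sketch below is a minimal illustration (the function name and the values in joules and kelvin are hypothetical, chosen just to exercise the formula):

```python
# Entropy change ("equivalence-value") for heat Q passing from
# temperature T1 to temperature T2, in Clausius's factored form:
#   delta_S = Q * (1/T2 - 1/T1)

def entropy_change(q, t1, t2):
    """Entropy change for heat q (in J) passing from t1 to t2 (in K)."""
    return q * (1.0 / t2 - 1.0 / t1)

# Example: 1000 J flowing from a body at 500 K to one at 300 K.
delta_s = entropy_change(1000.0, 500.0, 300.0)
print(delta_s > 0)  # True: heat flowing from hot to cold increases entropy
```

For T1 > T2 the result is always positive, which is exactly the one-way character of heat flow that the second law expresses.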

1856 definition

In 1856, Clausius stated what he called the "second fundamental theorem in the mechanical theory of heat" in the following form:
\int \frac{\delta Q}{T} = -N
where N is the "equivalence-value" of all uncompensated transformations involved in a cyclical process. This equivalence-value was a precursory formulation of entropy.[4]

1862 definition

In 1862, Clausius stated what he calls the “theorem respecting the equivalence-values of the transformations” or what is now known as the second law of thermodynamics, as such:
The algebraic sum of all the transformations occurring in a cyclical process can only be positive, or, as an extreme case, equal to nothing.
Quantitatively, Clausius states the mathematical expression for this theorem is as follows. Let δQ be an element of the heat given up by the body to any reservoir of heat during its own changes, heat which it may absorb from a reservoir being here reckoned as negative, and T the absolute temperature of the body at the moment of giving up this heat, then the equation:
\int \frac{\delta Q}{T} = 0
must be true for every reversible cyclical process, and the relation:
\int \frac{\delta Q}{T} \ge 0
must hold good for every cyclical process which is in any way possible. This was an early formulation of the second law and one of the original forms of the concept of entropy.
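Clausius's 1862 theorem can be illustrated with a discrete sketch (hypothetical values; heat given up by the body is reckoned positive and heat absorbed negative, matching the sign convention in the text above):

```python
# Discrete version of Clausius's sum over a cyclical process.
def clausius_sum(transfers):
    """Sum of dq / T over a list of (heat, temperature) transfers."""
    return sum(dq / t for dq, t in transfers)

t_hot, t_cold = 500.0, 300.0
q_hot = 1000.0                    # heat absorbed at t_hot (negative by convention)
q_cold = q_hot * t_cold / t_hot   # heat rejected at t_cold in a Carnot cycle

# Reversible Carnot cycle: the sum vanishes.
print(abs(clausius_sum([(-q_hot, t_hot), (q_cold, t_cold)])) < 1e-12)  # True

# An irreversible cycle rejects extra heat, so the sum is strictly positive.
print(clausius_sum([(-q_hot, t_hot), (q_cold + 100.0, t_cold)]) > 0)   # True
```

The reversible case reproduces the equality and the irreversible case the inequality, in miniature.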

1865 definition

In 1865, Clausius gave irreversible heat loss, or what he had previously been calling "equivalence-value", a name:[5][6]
I propose to name the quantity S the entropy of the system, after the Greek word [τροπη trope], the transformation. I have deliberately chosen the word entropy to be as similar as possible to the word energy: the two quantities to be named by these words are so closely related in physical significance that a certain similarity in their names appears to be appropriate.

Although Clausius did not specify why he chose the symbol "S" to represent entropy, it is arguable that Clausius chose "S" in honor of Sadi Carnot, to whose 1824 article Clausius devoted over 15 years of work and research. On the first page of his original 1850 article "On the Motive Power of Heat, and on the Laws which can be Deduced from it for the Theory of Heat", Clausius calls Carnot the most important of the researchers in the theory of heat.[7]

Later developments

In 1876, physicist J. Willard Gibbs, building on the work of Clausius, Hermann von Helmholtz, and others, proposed that the measurement of "available energy" ΔG in a thermodynamic system could be mathematically accounted for by subtracting the "energy loss" TΔS from the total energy change of the system, ΔH. These concepts were further developed by James Clerk Maxwell (1871) and Max Planck (1903).

Statistical thermodynamic views

In 1877, Ludwig Boltzmann formulated the alternative definition of entropy S defined as:
 S = k_{\rm B} \ln \Omega
where
kB is Boltzmann's constant and
Ω is the number of microstates consistent with the given macrostate.
Boltzmann saw entropy as a measure of statistical "mixedupness" or disorder. This concept was soon refined by J. Willard Gibbs, and is now regarded as one of the cornerstones of the theory of statistical mechanics.
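Boltzmann's formula is simple enough to evaluate directly. A minimal sketch, using the modern SI value of the constant:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K

def boltzmann_entropy(omega):
    """S = k_B * ln(Omega) for a macrostate with omega microstates."""
    return K_B * math.log(omega)

# A macrostate realizable in only one way has zero entropy;
# entropy grows logarithmically with the number of microstates.
print(boltzmann_entropy(1))                            # 0.0
print(boltzmann_entropy(100) > boltzmann_entropy(10))  # True
```

The logarithm is what makes entropy additive: combining two independent systems multiplies their microstate counts but adds their entropies.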

Information theory

An analog to thermodynamic entropy is information entropy. In 1948, while working at Bell Telephone Laboratories, electrical engineer Claude Shannon set out to mathematically quantify the statistical nature of “lost information” in phone-line signals. To do this, Shannon developed the very general concept of information entropy, a fundamental cornerstone of information theory. Accounts vary, but it seems that Shannon was initially not particularly aware of the close similarity between his new quantity and earlier work in thermodynamics. In 1949, however, when Shannon had been working on his equations for some time, he happened to visit the mathematician John von Neumann. During their discussions regarding what Shannon should call the “measure of uncertainty” or attenuation in phone-line signals with reference to his new information theory, according to one source:[8]
My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’

According to another source, when von Neumann asked him how he was getting on with his information theory, Shannon replied:[9]
The theory was in excellent shape, except that he needed a good name for “missing information”. “Why don’t you call it entropy?” von Neumann suggested. “In the first place, a mathematical development very much like yours already exists in Boltzmann’s statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage.”

In 1948 Shannon published his famous paper A Mathematical Theory of Communication, in which he devoted a section to what he calls Choice, Uncertainty, and Entropy.[10] In this section, Shannon introduces an H function of the following form:
H = -K\sum_{i=1}^k p(i) \log p(i),
where K is a positive constant. Shannon then states that “any quantity of this form, where K merely amounts to a choice of a unit of measurement, plays a central role in information theory as measures of information, choice, and uncertainty.” Then, as an example of how this expression applies in a number of different fields, he references R.C. Tolman’s 1938 Principles of Statistical Mechanics, stating that “the form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics where p_i is the probability of a system being in cell i of its phase space… H is then, for example, the H in Boltzmann’s famous H theorem.” In the more than fifty years since this statement was made, people have conflated the two concepts, or even stated that they are exactly the same.
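Shannon's H is easy to compute for any finite distribution. A minimal sketch, with K chosen as 1/ln 2 so that H comes out in bits (the function name is a convenience for this illustration):

```python
import math

def shannon_entropy(probs, k=1.0 / math.log(2)):
    """H = -K * sum(p_i * log p_i); the default K gives H in bits."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty.
print(abs(shannon_entropy([0.5, 0.5]) - 1.0) < 1e-12)  # True

# A certain outcome carries none, and a skewed distribution carries
# less than a uniform one over the same outcomes.
print(shannon_entropy([1.0]) == 0.0)                              # True
print(shannon_entropy([0.9, 0.1]) < shannon_entropy([0.5, 0.5]))  # True
```

The `if p > 0` guard implements the usual convention that terms with zero probability contribute nothing, since p log p → 0 as p → 0.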
Shannon's information entropy is a much more general concept than statistical thermodynamic entropy. Information entropy is present whenever there are unknown quantities that can be described only by a probability distribution. In a series of papers by E. T. Jaynes starting in 1957,[11][12] the statistical thermodynamic entropy can be seen as just a particular application of Shannon's information entropy to the probabilities of particular microstates of a system occurring in order to produce a particular macrostate.

Popular use

The term entropy is often used in popular language to denote a variety of unrelated phenomena. One example is the concept of corporate entropy as put forward somewhat humorously by authors Tom DeMarco and Timothy Lister in their 1987 classic publication Peopleware, a book on growing and managing productive teams and successful software projects. Here, they view energy waste as red tape and business team inefficiency as a form of entropy, i.e. energy lost to waste. This concept has caught on and is now common jargon in business schools.

Terminology overlap

When necessary, to disambiguate between the statistical thermodynamic concept of entropy, and entropy-like formulae put forward by different researchers, the statistical thermodynamic entropy is most properly referred to as the Gibbs entropy. The terms Boltzmann-Gibbs entropy or BG entropy, and Boltzmann-Gibbs-Shannon entropy or BGS entropy are also seen in the literature.


References

  1. ^ Mendoza, E. (1988). Reflections on the Motive Power of Fire – and other Papers on the Second Law of Thermodynamics by E. Clapeyron and R. Clausius. New York: Dover Publications. ISBN 0-486-44641-7.
  2. ^ Mechanical Theory of Heat, by Rudolf Clausius, 1850-1865
  3. ^ Published in Poggendoff’s Annalen, December 1854, vol. xciii. p. 481; translated in the Journal de Mathematiques, vol. xx. Paris, 1855, and in the Philosophical Magazine, August 1856, s. 4. vol. xii, p. 81
  4. ^ Clausius, Rudolf. (1856). "On the Application of the Mechanical theory of Heat to the Steam-Engine." as found in: Clausius, R. (1865). The Mechanical Theory of Heat – with its Applications to the Steam Engine and to Physical Properties of Bodies. London: John van Voorst, 1 Paternoster Row. MDCCCLXVII.
  5. ^ Laidler, Keith J. (1995). The Physical World of Chemistry. Oxford University Press. pp. 104–105. ISBN 0-19-855919-4.
  6. ^ OED, Second Edition, 1989, "Clausius (Pogg. Ann. CXXV. 390), assuming (unhistorically) the etymological sense of energy to be ‘work-contents’ (werk-inhalt), devised the term entropy as a corresponding designation for the ‘transformation-contents’ (verwandlungsinhalt) of a system"
  7. ^ Clausius, Rudolf (1850). On the Motive Power of Heat, and on the Laws which can be deduced from it for the Theory of Heat. Poggendorff's Annalen der Physik, LXXIX (Dover Reprint). ISBN 0-486-59065-8.
  8. ^ M. Tribus, E.C. McIrvine, “Energy and information”, Scientific American, 224 (September 1971).
  9. ^ Avery, John (2003). Information Theory and Evolution. World Scientific. ISBN 981-238-400-6.
  10. ^ C.E. Shannon, "A Mathematical Theory of Communication", Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, July, October 1948.
  11. ^ E. T. Jaynes (1957), "Information Theory and Statistical Mechanics", Physical Review 106:620.
  12. ^ E. T. Jaynes (1957), "Information Theory and Statistical Mechanics II", Physical Review 108:171.
