by Eberhard Zeidler
For the earth, physicists use the image of the photon mill. Photons coming from the sun (light radiation) are radiated back into cold space; in this process, high-value thermal energy at a high temperature on the earth is transformed into low-value thermal energy at a low temperature. The formulas of thermodynamics maintain that the earth radiates entropy into space in this process. The earth gains a corresponding amount of information, coded in living matter. One of the remarkable achievements of 19th-century physics is that it developed not only Maxwell's theory of electromagnetism, the prototype of the standard model of elementary particles, but also the two concepts of energy and entropy that dominate the unbelievable diversity of thermodynamic processes in nature and technology.
Statistical physics, created by Ludwig Boltzmann (1844-1906), describes large systems of molecules using the mathematics of chance and gives us the interconnection between entropy S and information. Here we notice the amazing fact that all of the properties of such a system follow from one single function, known as the partition function:

$$Z = \sum_n e^{-(E_n - \mu N_n)/kT}.$$
The prerequisite of this standard model of statistical physics is that the system can be in n states with the corresponding energies $E_n$ and particle numbers $N_n$. Furthermore, $T$ is the temperature, $\mu$ the chemical potential and $k$ Boltzmann's constant. With his ingenious physical intuition, the American physicist Richard Feynman (1918-1988) discovered a completely new approach to quantum mechanics, which he described in his dissertation at Princeton in 1942. He considered all conceivable classical paths of an electron, averaged them off against one another depending upon their action and applied a statistical method. That brought him to what is known as the Feynman integral

$$Z = \int e^{iS(\psi)/h}\, D\psi$$
that we may look upon as a continuous partition function for quantum fields and elementary particles. Here, $S(\psi)$ means the action of the quantum field $\psi$, and $h$ designates Planck's quantum of action. From the Feynman integral Z we can gain all of the essential information on quantum fields and elementary particles, in a fashion analogous to the partition function. Freeman Dyson (born in 1923), who works at the Institute for Advanced Study in Princeton, reports in his book "Disturbing the Universe" that he made calculations for his teacher Hans Bethe (born in 1906) for months at a time. Feynman arrived at the same result in half an hour at the blackboard. Dyson could not understand what Feynman was doing. One day, in the middle of a vacation, the key idea struck Dyson like a lightning bolt: suddenly he knew how three very different ways of accessing quantum electrodynamics were connected with one another. Feynman, Schwinger and Tomonaga jointly received the Nobel Prize in Physics in 1965. Today, physicists all over the world use Feynman's methods of calculation to arrive at results in quantum electrodynamics that agree with experiment to unbelievable precision. The point is that, from a mathematical point of view, Feynman's method is non-rigorous, and to the present day there is no strict mathematical basis for it. If physicists had started by looking for a sound mathematical basis, they would not have got anywhere. Historical experience shows that the computation has a habit of being smarter than the computer. Non-rigorous but successful methods of calculation have always been confirmed later by rigorous methods, though sometimes at the cost of a great delay and of a more unwieldy and abstract mathematical theory. Some day that will also be the case for Feynman's method. As Max Planck said:
If one does not sometimes think the illogical, one will never discover new ideas in science.
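The discrete partition function can be explored numerically. The following sketch uses a deliberately tiny toy system, a single lattice site that is either empty or occupied; all numbers, state definitions and function names are illustrative assumptions of mine, not data from the text (units are chosen so that Boltzmann's constant k = 1).

```python
import math

# Toy grand-canonical system: one site holding N = 0 or 1 particles.
# The energies below are invented for illustration only.
states = [
    {"E": 0.0, "N": 0},   # empty site
    {"E": 1.5, "N": 1},   # occupied site
]

def partition_function(T, mu):
    """Z = sum_n exp(-(E_n - mu*N_n) / (k*T)), with k = 1."""
    return sum(math.exp(-(s["E"] - mu * s["N"]) / T) for s in states)

def mean_occupation(T, mu):
    """Average particle number <N>: each state weighted by its
    Boltzmann factor divided by the partition function Z."""
    Z = partition_function(T, mu)
    return sum(s["N"] * math.exp(-(s["E"] - mu * s["N"]) / T)
               for s in states) / Z

# At low temperature the site empties or fills depending on whether
# the chemical potential mu lies below or above the level energy E.
print(mean_occupation(T=0.1, mu=0.0))  # close to 0
print(mean_occupation(T=0.1, mu=3.0))  # close to 1
```

This is the sense in which "all properties follow from Z": every observable average is a weighted sum over states with weights read off from the partition function.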
Mathematicians had used imaginary numbers for more than 300 years before Gauss found a rigorous justification for them. Although the partition function encodes the properties of many-particle systems, mathematicians came upon the idea of a partition function in number theory in a completely different sense. The most productive mathematician of all time, Leonhard Euler (1707-1783), encoded the properties of the system of all prime numbers 2, 3, 5, 7, 11, ... in one single function,

$$\zeta(s) = \sum_{n=1}^{\infty} \frac{1}{n^s} = \prod_{p \text{ prime}} \frac{1}{1 - p^{-s}},$$

that today is called Riemann's zeta function and that plays an important role in calculating Feynman integrals. The solution of a famous unsolved problem in mathematics, Riemann's conjecture for the zeta function, would enable us to make very precise statements on the asymptotic distribution of prime numbers. Gauss found by empirical studies that the number of prime numbers less than n is approximately equal to
$$\int_2^n \frac{dt}{\ln t}$$

for large n. For example, Gauss counted exactly 216,745 prime numbers below n = 3,000,000, while the approximation formula that he guessed supplies the number 216,970. In 1896, more than forty years after Gauss' death, the French mathematician Hadamard (1865-1963) and the Belgian mathematician de la Vallée Poussin (1866-1962), independently of one another, supplied a rigorous and sophisticated mathematical basis for this prime number law. There are chaotic oscillations in the higher approximations that mathematicians hope to master by solving Riemann's conjecture. It is remarkable that mathematicians and physicists hit upon the idea of encoding many-particle systems (of prime numbers, molecules or elementary particles) in a single partition function (such as a Feynman integral) in such different fashions. Interestingly enough, there are some hard nuts to crack both in mathematics and in physics when decoding this information.
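Gauss' empirical comparison can be repeated today in a few lines. The sketch below counts primes with the sieve of Eratosthenes and evaluates the logarithmic integral by a simple midpoint rule; the function names and the numerical method are my own choices, not part of the historical account.

```python
import math

def prime_pi(n):
    """Count the primes <= n with the sieve of Eratosthenes."""
    if n < 2:
        return 0
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"            # 0 and 1 are not prime
    for p in range(2, math.isqrt(n) + 1):
        if sieve[p]:
            # strike out all multiples of p starting at p*p
            sieve[p * p :: p] = bytearray(len(range(p * p, n + 1, p)))
    return sum(sieve)

def li(n, steps=200_000):
    """Gauss' guess: the integral of dt/ln(t) from 2 to n (midpoint rule)."""
    h = (n - 2) / steps
    return sum(h / math.log(2 + (k + 0.5) * h) for k in range(steps))

n = 3_000_000
# The exact count and the smooth approximation agree to a small
# fraction of a percent, just as Gauss observed empirically.
print(prime_pi(n), round(li(n)))
```

Running this shows the two numbers differing only in the last few digits; Riemann's conjecture would pin down exactly how small such deviations must remain for all n.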
In school, we learned that the commutative law ab = ba applies to real numbers. The special thing about a variety of surprising quantum phenomena is the fact that non-commutative quantities play a crucial role in the quantum world. Heisenberg (1901-1976) created quantum mechanics in 1925, and he received the Nobel Prize in Physics for it in 1932. Quantum mechanics states that the position and momentum of a quantum particle are described by quantities q and p which satisfy the commutation relation

$$pq - qp = \frac{h}{2\pi i}.$$
For instance, we may conclude from this, purely mathematically, that we cannot precisely measure the position and the speed of a quantum particle simultaneously. The young John von Neumann (1903-1957) experienced the explosive development of quantum mechanics in Göttingen with his own eyes, and in 1928 he created the mathematical basis of quantum theory in the form of a non-commutative operator theory. Today, physicists try to use the surprising properties of quantum states to build quantum computers; unfortunately, theory is farther along than experiment. If quantum states are used as words, we can encode significantly greater amounts of information than is possible with the words used in present computers, which would trigger a revolution in computer technology. At the last world congress of mathematicians in Berlin in 1998, the American computer scientist and mathematician Peter Shor received the Nevanlinna Prize for his trail-blazing theoretical studies on quantum computers. The idea of the quantum computer goes back to Feynman. Non-commutative structures are also used in the search for a unification of gravitation and quantum theory in the framework of quantum gravitation. In such a theory, physicists expect that space loses its continuous structure below the Planck length of $10^{-33}$ cm and becomes grainy.
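Non-commutativity can already be seen in the smallest quantum system, a spin-1/2 particle, whose measurements are described by the 2x2 Pauli matrices. The sketch below multiplies them as plain nested lists of complex numbers; the helper functions are illustrative, not a standard library API.

```python
# The Pauli spin matrices do not commute: sigma_x * sigma_y differs
# from sigma_y * sigma_x, so the order of operations matters.

def matmul(a, b):
    """Multiply two 2x2 complex matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def commutator(a, b):
    """[a, b] = a*b - b*a; it vanishes exactly when a and b commute."""
    ab, ba = matmul(a, b), matmul(b, a)
    return [[ab[i][j] - ba[i][j] for j in range(2)] for i in range(2)]

sigma_x = [[0, 1], [1, 0]]
sigma_y = [[0, -1j], [1j, 0]]
sigma_z = [[1, 0], [0, -1]]

# [sigma_x, sigma_y] = 2i * sigma_z: a nonzero matrix, not zero as
# the commutative law ab = ba for real numbers would suggest.
print(commutator(sigma_x, sigma_y))
```

The same algebraic phenomenon, scaled up from 2x2 matrices to operators on infinite-dimensional spaces, is what von Neumann's non-commutative operator theory makes precise.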
In the past, the dialogue between mathematicians and physicists was of great importance for the development of mathematics and physics. Mathematics has traditionally been classified into algebra, analysis, geometry, logic, numerics and stochastics. In order to make it easier for mathematicians on the one hand and biologists, chemists, scholars from the humanities, computer scientists, engineers and physicists on the other hand to maintain a dialogue these days, it makes sense not to emphasise the traditional structure of mathematics too much, but to observe the phenomena under consideration from all sides. For instance, mathematics has very profound methods for dealing with the following phenomena and it is constantly working at perfecting them:
- the time evolution of systems (dynamic systems),
- the optimal design of processes,
- the loss of stability, breaking of symmetries and jumps in evolution,
- phase transitions,
- scale transitions and microstructures,
- hazardous resonances, turbulence and chaos,
- the qualitative behaviour of systems (topology),
- causal processes,
- random processes,
- information, entropy and complexity,
- collective phenomena and irreversibility,
- quantum phenomena,
- logical processes,
- forbidden processes and selection rules,
- methods of forming models,
- simulating processes on computers.
In this sense, we may speak of a mathematics of time, of the optimum, of symmetry, of phase transitions, of microstructures, of the qualitative behaviour of systems, of randomness, of information, of complexity, of quantization, of forming models or of simulation. The two-volume Teubner 'Handbook of Mathematics' gives us a systematic outline of this aspect of mathematics that permeates all fields of knowledge. Here are a couple of ideas in fragmented form.
About the Author
Prof. Dr. Eberhard Zeidler was born in Leipzig in 1940, where he studied mathematics and physics. In 1974 he was appointed full professor of analysis at the University of Leipzig. Together with Prof. Dr. Jürgen Jost and Prof. Dr. Stefan Müller, he founded the Max Planck Institute for Mathematics in the Sciences in Leipzig in 1996 and was its managing director from 1996 to 2003. He is a member of the German Academy of Natural Scientists Leopoldina. For his life's work he received the 2006 Alfried Krupp Science Prize of the Alfried Krupp von Bohlen und Halbach Foundation.
Prof. Zeidler died in November 2016.
Figure 1: Einstein's Cross