Day Three
Contents
1.量子“論”と量子“力学” Quantum "Theory" and Quantum "Mechanics"
2.波動ということ――エーテルから場へ Waves - From Ether to Field
3.不確定性関係を導く二つの方式 Two Methods of Deriving the Uncertainty Relation
4.物理学における認識 Recognition in Physics
5.電子の拡散 Diffusion of Electrons
6.古典的因果律からの転換 The Shift from Classical Causality
7.シュレーディンガーの猫 Schrödinger's Cat
8.量子力学の完成――場の量子論 The Completion of Quantum Mechanics - Quantum Field Theory
9.量子力学と特殊相対論 Quantum Mechanics and Special Relativity
10.孤高の理論・一般相対論――一般共変性をめぐって The Solitary Theory, General Relativity - On General Covariance
11.物理量と幾何学的量とのアイデンティフィケーション Identification of Physical Quantities with Geometric Quantities
12.入れ物(時空)と中身(物質) Container (Space-Time) and Contents (Matter)
13.一般相対論はミクロの世界と無関係か? Is General Relativity Unrelated to the Micro World?
14.素粒子論――局所場と非局所場 Theory of Elementary Particles - Local Fields and Non-local Fields
15.差分的な考え方による可能性 Possibilities of the Finite-Difference Approach
16.余話――外界認識の連続性と不連続性 Aside - Continuity and Discontinuity in the Recognition of the External World
11.物理量と幾何学的量とのアイデンティフィケーション Identification of Physical Quantities with Geometric Quantities
I think that creative activity consists in reaching an identification of a very high degree, through a remarkably ascending process of successive identifications. This is a very great act of creation. For instance, although there are many other examples, consider Boltzmann:
Boltzmann's most important scientific contributions were in kinetic theory, including the Maxwell–Boltzmann distribution for molecular speeds in a gas. In addition, Maxwell–Boltzmann statistics and the Boltzmann distribution over energies remain the foundations of classical statistical mechanics. They are applicable to the many phenomena that do not require quantum statistics and provide a remarkable insight into the meaning of temperature.
Much of the physics establishment did not share his belief in the reality of atoms and molecules — a belief shared, however, by Maxwell in Scotland and Gibbs in the United States; and by most chemists since the discoveries of John Dalton in 1808. He had a long-running dispute with the editor of the preeminent German physics journal of his day, who refused to let Boltzmann refer to atoms and molecules as anything other than convenient theoretical constructs. Only a couple of years after Boltzmann's death, Perrin's studies of colloidal suspensions (1908–1909), based on Einstein's theoretical studies of 1905, confirmed the values of Avogadro's number and Boltzmann's constant, and convinced the world that the tiny particles really exist.
To quote Planck, "The logarithmic connection between entropy and probability was first stated by L. Boltzmann in his kinetic theory of gases".[7] This famous formula for entropy S is[8][9]

S = k log W,

where k is the Boltzmann constant and W is the number of microstates corresponding to the given macrostate.
Boltzmann was also one of the founders of quantum mechanics due to his suggestion in 1877 that the energy levels of a physical system could be discrete.
The equation for S is engraved on Boltzmann's tombstone at the Vienna Zentralfriedhof — his second grave.
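To make the tombstone formula concrete, here is a minimal sketch in Python (the toy spin system and the numbers are my own illustration, not from the text above). It counts the microstates W of N two-state spins with a fixed number of "up" spins, evaluates S = k log W, and checks the property that motivates the logarithm: microstate counts of independent systems multiply, while their entropies add.

    from math import comb, log

    k_B = 1.380649e-23  # Boltzmann constant in J/K (exact by the 2019 SI definition)

    def boltzmann_entropy(W):
        # S = k ln W for a macrostate realized by W equally probable microstates
        return k_B * log(W)

    # Toy macrostate: N spins, n of them "up"; it is realized by W = C(N, n) microstates.
    N, n = 100, 50
    W = comb(N, n)
    print("W =", W)
    print("S = %.3e J/K" % boltzmann_entropy(W))

    # Combining two independent systems: counts multiply, entropies add.
    W1 = W2 = comb(50, 25)
    S_combined = boltzmann_entropy(W1 * W2)
    S_sum = boltzmann_entropy(W1) + boltzmann_entropy(W2)
    assert abs(S_combined - S_sum) < 1e-30  # equal up to floating-point rounding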
----
Information theory
In information theory, entropy is the measure of the amount of information that is missing before reception, and is sometimes referred to as Shannon entropy.[65] Shannon entropy is a broad and general concept which finds applications in information theory as well as thermodynamics. It was originally devised by Claude Shannon in 1948 to study the amount of information in a transmitted message. The definition of information entropy is, however, quite general, and is expressed in terms of a discrete set of probabilities p_i:

H = -Σ_i p_i log p_i.

The question of the link between information entropy and thermodynamic entropy is a debated topic. While most authors argue that there is a link between the two,[66][67][68] a few argue that they have nothing to do with each other.[22][69]
The expressions for the two entropies are similar. For n outcomes of equal probability p_i = 1/n, the information entropy is

H = log n,

which, apart from the constant factor k, is Boltzmann's S = k log W with W = n.
There are many ways of demonstrating the equivalence of "information entropy" and "physics entropy", that is, the equivalence of "Shannon entropy" and "Boltzmann entropy". Nevertheless, some authors argue for dropping the word entropy for the H function of information theory and using Shannon's other term "uncertainty" instead.[71]
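As a small numerical check of that claimed equivalence (a sketch in Python; the example distributions are arbitrary choices of mine), the Shannon entropy computed with the natural logarithm reduces, for a uniform distribution over n outcomes, to ln n, which is Boltzmann's S/k = ln W with W = n equally probable microstates:

    from math import log

    def shannon_entropy(probs):
        # H = -sum_i p_i ln p_i, in nats (natural-log units); terms with p = 0 contribute 0
        return -sum(p * log(p) for p in probs if p > 0)

    n = 8
    uniform = [1.0 / n] * n
    print(shannon_entropy(uniform), log(n))  # both are ln 8, about 2.079
    assert abs(shannon_entropy(uniform) - log(n)) < 1e-12

    # Any non-uniform distribution has strictly lower entropy than the uniform one.
    skewed = [0.7, 0.1, 0.1, 0.05, 0.05]
    print(shannon_entropy(skewed))  # about 1.01, less than ln 5 (about 1.609)

Which constant multiplies the logarithm (Boltzmann's k, 1 for nats, or 1/log 2 for bits) only fixes the units; the functional form of the two entropies is the same.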