Friday, October 5, 2012

Day Three

Lecture on Physics by Dr Yukawa



Contents

  1.量子“論”と量子“力学” Quantum "Theory" and Quantum "Mechanics"
  2.波動ということ――エーテルから場へ Wave - From Ether to Field
  3.不確定性関係を導く二つの方式 Two methods leading to Uncertainty relation
  4.物理学における認識 Recognition in Physics
  5.電子の拡散 Diffusion of electrons
  6.古典的因果律からの転換 Shift from Classical Causality
  7.シュレーディンガーの猫 Schrödinger's Cat
  8.量子力学の完成――場の量子論 Completion of Quantum Mechanics - Quantum Field Theory
  9.量子力学と特殊相対論 Quantum Mechanics and Theory of Special Relativity
  10.孤高の理論・一般相対論――一般共変性をめぐって Lone Theory - Theory of General Relativity - About General Covariance
  11.物理量と幾何学的量とのアイデンティフィケーション Identification of Physical Quantities and Geometric Quantities
  12.入れ物(時空)と中身(物質) Container (Space-Time) and Contents (Matters)
  13.一般相対論はミクロの世界と無関係か? Does Theory of General Relativity have no relation with Micro World?

  14.素粒子論――局所場と非局所場 Theory of Elementary Particles - Local Field and Non-local Field
  15.差分的な考え方による可能性 Possibility of Finite Difference method
  16.余話――外界認識の連続性と不連続性 Appendix - Continuity and discontinuity of recognition of the world


 




11.物理量と幾何学的量とのアイデンティフィケーション Identification of Physical Quantities and Geometric Quantities

I think that creative activity is to reach identification in a very highly developed form, through a remarkably increasing upward process of identification. This is a very big creation. For instance, although there are many other examples, Boltzmann wrote
 S = k \log_e W
using the Boltzmann constant k, where S is the entropy and W is the number of microstates belonging to the macrostate, that is, the probability. He found this relationship. Thermodynamics defines entropy through heat exchange: when heat flows in or out because energy is applied, the entropy change is that heat divided by the absolute temperature (T). This is a macroscopic quantity. It is a very outstanding idea to connect this macroscopic quantity with the microscopic quantity, which leads to the concept of probability and then to information theory. This equation does not put the same thing on each side of the equals sign (=). People tend to think that both sides are the same, but it is only after Boltzmann wrote them this way that people came to think so. Boltzmann put different things on the two sides of the equals sign (=).
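As a small numerical illustration (not part of the lecture), the following Python sketch evaluates Boltzmann's relation for a toy system. The two-state-particle example and the helper name boltzmann_entropy are assumptions introduced here only for demonstration.

    import math

    k_B = 1.380649e-23  # Boltzmann constant in J/K (current SI value)

    def boltzmann_entropy(W):
        """Entropy S = k_B * ln(W) of a macrostate realized by W microstates."""
        return k_B * math.log(W)

    # Toy macrostate (assumed example): 100 two-state particles, 50 "up" and 50 "down".
    # The number of microstates is the binomial coefficient C(100, 50).
    W = math.comb(100, 50)
    print(f"W = {W:.3e} microstates")
    print(f"S = k_B ln W = {boltzmann_entropy(W):.3e} J/K")

The point of the example is the identification itself: the left-hand side is a thermodynamic, macroscopic quantity in J/K, while the right-hand side is obtained purely by counting microscopic configurations.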







Boltzmann's most important scientific contributions were in kinetic theory, including the Maxwell–Boltzmann distribution for molecular speeds in a gas. In addition, Maxwell–Boltzmann statistics and the Boltzmann distribution over energies remain the foundations of classical statistical mechanics. They are applicable to the many phenomena that do not require quantum statistics and provide a remarkable insight into the meaning of temperature.

[Figure: Boltzmann's 1898 I2 molecule diagram showing atomic "sensitive region" (α, β) overlap.]
Much of the physics establishment did not share his belief in the reality of atoms and molecules — a belief shared, however, by Maxwell in Scotland and Gibbs in the United States; and by most chemists since the discoveries of John Dalton in 1808. He had a long-running dispute with the editor of the preeminent German physics journal of his day, who refused to let Boltzmann refer to atoms and molecules as anything other than convenient theoretical constructs. Only a couple of years after Boltzmann's death, Perrin's studies of colloidal suspensions (1908–1909), based on Einstein's theoretical studies of 1905, confirmed the values of Avogadro's number and Boltzmann's constant, and convinced the world that the tiny particles really exist.
To quote Planck, "The logarithmic connection between entropy and probability was first stated by L. Boltzmann in his kinetic theory of gases".[7] This famous formula for entropy S is[8][9]
 S = k \log_e W
where k = 1.3806505(24) × 10^−23 J K^−1 is Boltzmann's constant, and the logarithm is taken to the natural base e. W is the Wahrscheinlichkeit, the frequency of occurrence of a macrostate[10] or, more precisely, the number of possible microstates corresponding to the macroscopic state of a system — the number of (unobservable) "ways" in which the (observable) thermodynamic state of a system can be realized by assigning different positions and momenta to the various molecules. Boltzmann's paradigm was an ideal gas of N identical particles, of which N_i are in the i-th microscopic condition (range) of position and momentum. W can be counted using the formula for permutations
 W = \frac{N!}{\prod_i N_i!}
where i ranges over all possible molecular conditions. (! denotes factorial.) The "correction" in the denominator is because identical particles in the same condition are indistinguishable.
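To make the counting concrete, here is a short Python sketch (added here as an illustration, not taken from the excerpt) that evaluates W with the permutation formula above and converts it to an entropy. The occupation numbers (3, 2, 1) and the helper name multiplicity are hypothetical choices for the example.

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def multiplicity(occupations):
        """W = N! / prod_i(N_i!) for occupation numbers N_i of the molecular conditions."""
        N = sum(occupations)
        W = math.factorial(N)
        for n_i in occupations:
            W //= math.factorial(n_i)  # identical particles in the same condition are indistinguishable
        return W

    # Assumed example: N = 6 particles distributed over 3 conditions as (3, 2, 1).
    occ = (3, 2, 1)
    W = multiplicity(occ)            # 6! / (3! * 2! * 1!) = 60
    S = k_B * math.log(W)
    print(f"W = {W}, S = {S:.3e} J/K")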
Boltzmann was also one of the founders of quantum mechanics due to his suggestion in 1877 that the energy levels of a physical system could be discrete.
The equation for S is engraved on Boltzmann's tombstone at the Vienna Zentralfriedhof — his second grave.

 ----

Information theory

In information theory, entropy is the measure of the amount of information that is missing before reception and is sometimes referred to as Shannon entropy.[65] Shannon entropy is a broad and general concept which finds applications in information theory as well as thermodynamics. It was originally devised by Claude Shannon in 1948 to study the amount of information in a transmitted message. The definition of the information entropy is, however, quite general, and is expressed in terms of a discrete set of probabilities p_i:
H(X) = -\sum_{i=1}^n {p(x_i) \log p(x_i)}.
In the case of transmitted messages, these probabilities were the probabilities that a particular message was actually transmitted, and the entropy of the message system was a measure of the average amount of information in a message. For the case of equal probabilities (i.e. each message is equally probable), the Shannon entropy (in bits) is just the number of yes/no questions needed to determine the content of the message.[22]
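To make the yes/no-question reading concrete, here is a minimal Python sketch (an illustration added here, not part of the excerpt); the example distributions are assumed for demonstration.

    import math

    def shannon_entropy_bits(probs):
        """H(X) = -sum_i p_i * log2(p_i), in bits; terms with p_i = 0 contribute nothing."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Eight equally probable messages: H = log2(8) = 3 bits,
    # i.e. three yes/no questions suffice to identify the message.
    print(shannon_entropy_bits([1/8] * 8))                  # 3.0

    # A biased source carries less information per message on average.
    print(shannon_entropy_bits([0.5, 0.25, 0.125, 0.125]))  # 1.75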
The question of the link between information entropy and thermodynamic entropy is a debated topic. While most authors argue that there is a link between the two,[66][67][68] a few argue that they have nothing to do with each other.[22][69]
The expressions for the two entropies are similar. The information entropy H for equal probabilities p_i = p = 1/n is
H = k\, \log(1/p),
where k is a constant which determines the units of entropy. For example, if the units are bits, then k = 1/ln(2). The thermodynamic entropy S, from a statistical mechanical point of view, was first expressed by Boltzmann:
S = k_\mathrm{B} \log(1/p),
where p is the probability of a system's being in a particular microstate, given that it is in a particular macrostate, and k_\mathrm{B} is Boltzmann's constant. It can be seen that one may think of the thermodynamic entropy as Boltzmann's constant, divided by log(2), times the number of yes/no questions that must be asked in order to determine the microstate of the system, given that we know the macrostate. The link between thermodynamic and information entropy was developed in a series of papers by Edwin Jaynes beginning in 1957.[70]
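The quantitative side of that link can be sketched as follows (a minimal Python illustration assumed here, not taken from the cited papers): thermodynamic entropy in J/K converts to a number of yes/no questions by dividing by k_B ln 2, and back by multiplying.

    import math

    k_B = 1.380649e-23   # Boltzmann's constant, J/K
    LN2 = math.log(2)

    def bits_from_entropy(S):
        """Number of yes/no questions (bits) corresponding to thermodynamic entropy S in J/K."""
        return S / (k_B * LN2)

    def entropy_from_bits(n_bits):
        """Thermodynamic entropy in J/K corresponding to n_bits of missing microstate information."""
        return n_bits * k_B * LN2

    print(f"{entropy_from_bits(1):.3e} J/K per bit")       # about 9.57e-24 J/K
    print(f"{bits_from_entropy(1.0):.3e} bits per J/K")    # about 1.04e+23 bits

The tiny value per bit shows why the informational character of thermodynamic entropy is invisible at everyday scales: one joule per kelvin of entropy corresponds to an enormous number of unanswered yes/no questions about the microstate.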
There are many ways of demonstrating the equivalence of "information entropy" and "physics entropy", that is, the equivalence of "Shannon entropy" and "Boltzmann entropy". Nevertheless, some authors argue for dropping the word entropy for the H function of information theory and using Shannon's other term "uncertainty" instead.[71]

