Lecture 28: Boltzmann Hypothesis

Below is a short summary and detailed review of this video written by FutureFactual:

Boltzmann's Entropy and the Emergence of the Boltzmann Distribution in Statistical Thermodynamics | MIT OCW Lecture 28

In this MIT OCW lecture, the instructor introduces Boltzmann's hypothesis that entropy is a function of the number of microstates and demonstrates that the most stable macrostate corresponds to the maximum number of microstates. The talk develops Boltzmann entropy as S = k_B log Ω, relates Boltzmann's constant k_B to the gas constant R via Avogadro's number, and shows how configurational entropy arises from counting particle configurations. It then derives the Boltzmann distribution by maximizing entropy at fixed total energy and particle number, and introduces the partition function as the normalization factor for this distribution.

  • The entropy grows with the number of microstates, guiding stability
  • Boltzmann entropy connects microscopic counting to macroscopic thermodynamics
  • Occupation numbers follow an exponential dependence on energy
  • The partition function normalizes the Boltzmann distribution

Introduction to Boltzmann's Entropy in Statistical Thermodynamics

The lecture begins by framing entropy as a property tied to the number of microstates Ω corresponding to a macrostate. Boltzmann’s key insight is that the entropy should be a monotonically increasing function of Ω, so the state with the largest number of microstates appears the most stable and most probable. This establishes a bridge between microscopic counting and macroscopic stability, a cornerstone of statistical thermodynamics. The instructor highlights Boltzmann's preeminent role in translating thermodynamic stability into an information-theoretic quantity, setting the stage for the entropy formula S ∝ log Ω.

"Entropy is a monotonically increasing function of Omega, so max entropy means max omega" - MIT OCW Instructor

Configurational Entropy and Microstate Counting

The discussion then turns to configurational entropy, defined via the number of ways to arrange N molecules into R tiny boxes, with at most one molecule per box. In the limit where R is much larger than N, the log of the binomial coefficient, log(R choose N), simplifies at leading order to N log R, reflecting that most space is empty in a gas-like system. This combinatorial step connects spatial configurations to a statistical measure of entropy, foreshadowing the link between microstate counting and macroscopic thermodynamics. The lecture uses this to motivate the form of entropy as a function of microstates and to illustrate how simple counting can reproduce familiar thermodynamic results.
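The leading-order approximation above is easy to check numerically. The sketch below (a stdlib-only illustration; the values of R and N are chosen for demonstration and do not come from the lecture) compares the exact log binomial coefficient, computed stably with log-gamma, against N log R:

```python
import math

def log_binomial(R, N):
    """Exact ln(R choose N) via log-gamma, stable even for astronomically large R."""
    return math.lgamma(R + 1) - math.lgamma(N + 1) - math.lgamma(R - N + 1)

R, N = 10**12, 10          # R >> N: a very dilute "gas" of N molecules in R boxes
exact = log_binomial(R, N)
approx = N * math.log(R)   # the lecture's leading-order result, N log R

print(exact, approx)       # the two agree to leading order when R >> N
```

The exact value is always slightly below N log R (the subleading correction is roughly -log N!), but the relative difference shrinks as R/N grows, which is the dilute-gas regime the lecture assumes.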

"The log of R choose N is approximately N log R for R very large compared to N" - MIT OCW Instructor

Boltzmann Constant and Isothermal Expansion

By examining an isothermal expansion of an ideal gas, the instructor shows how Boltzmann’s counting perspective yields the familiar entropy change ΔS = k_B N log(V_final/V_initial) and connects k_B to the macroscopic gas constant R through Avogadro's number: k_B = R/N_A. This step links the microscopic counting constant to a fundamental macroscopic quantity, anchoring Boltzmann’s entropy in familiar thermodynamic terms. The isothermal expansion serves as a concrete bridge between statistical reasoning and classical thermodynamics, reinforcing the unity of the two perspectives.
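A quick numerical sketch of this bridge, using standard approximate values of R and N_A (the one-mole, volume-doubling scenario is illustrative, not taken from the lecture): the microscopic formula k_B N ln(V_f/V_i) and the classical formula n R ln(V_f/V_i) give the same entropy change.

```python
import math

R_gas = 8.314          # J/(mol K), molar gas constant (approximate)
N_A = 6.022e23         # 1/mol, Avogadro's number (approximate)
k_B = R_gas / N_A      # Boltzmann's constant from the lecture's relation k_B = R/N_A

n = 1.0                # one mole of ideal gas
N = n * N_A            # number of molecules
ratio = 2.0            # isothermal doubling of the volume

dS_micro = k_B * N * math.log(ratio)    # Boltzmann counting: dS = k_B N ln(V_f/V_i)
dS_macro = n * R_gas * math.log(ratio)  # classical result:    dS = n R ln(V_f/V_i)

print(dS_micro, dS_macro)  # both are about 5.76 J/K
```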

"Boltzmann's constant is R divided by Avogadro's number" - MIT OCW Instructor

Maximum Entropy and the Boltzmann Distribution

The core mathematical move is to treat the occupation numbers N_i as unconstrained internal variables that fluctuate between states. Using constrained optimization via Lagrange multipliers, the entropy functional is varied subject to fixed total energy and fixed total particle number. The result is a distribution for the fractional occupancy of each state that is exponential in the state energy: N_i/N_total = exp(α/k_B) exp(-β E_i/k_B), where the multiplier α is later fixed by normalization. This is the pivotal point where a general principle, maximizing entropy under given constraints, produces the Boltzmann distribution, a result that recurs across many areas of physics and chemistry.
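The maximization itself can be sanity-checked numerically rather than rederived. The sketch below (a toy three-level system with illustrative, equally spaced energies in units of k_B T; it is not the lecture's derivation) perturbs the Boltzmann occupancies along the one direction that preserves both normalization and mean energy, and confirms the entropy only decreases:

```python
import math

# Toy three-level system; energies in units of k_B*T (illustrative values only).
E = [0.0, 1.0, 2.0]
w = [math.exp(-e) for e in E]       # Boltzmann weights exp(-E_i / k_B T)
Q = sum(w)
p = [wi / Q for wi in w]            # Boltzmann occupancies

def entropy(dist):
    """Dimensionless entropy S/k_B = -sum_i p_i ln p_i."""
    return -sum(q * math.log(q) for q in dist)

# The direction v = (1, -2, 1) satisfies sum(v_i) = 0 and sum(v_i * E_i) = 0
# for equally spaced levels, so moving along it keeps both constraints fixed.
for t in (0.01, -0.01, 0.05):
    q = [p[0] + t, p[1] - 2 * t, p[2] + t]
    assert entropy(q) < entropy(p)  # any constraint-preserving move lowers S
```

Because the entropy is strictly concave in the occupancies, the Boltzmann distribution is the unique maximum on the constraint surface, which is exactly what these perturbations illustrate.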

"The distribution of occupation numbers is an exponential in E_i" - MIT OCW Instructor

Partition Function and Normalization

To ensure the occupancies sum to unity, the multiplier α is fixed by the normalization condition, which introduces the partition function Q, defined as the sum over all states of exp(-β E_i/k_B). The distribution of occupancy becomes N_i/N_total = exp(-β E_i/k_B) / Q, which succinctly encapsulates how the whole system’s statistical behavior is determined by a single function that accounts for all accessible energy states. The partition function therefore serves as a compact descriptor of all the ways energy can be partitioned among states, linking microscopic energies to macroscopic thermodynamics and providing a practical computational tool for thermodynamic properties.
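A minimal sketch of Q as a normalizer, writing β = 1/T so that exp(-β E_i/k_B) becomes the familiar exp(-E_i/(k_B T)) (the three-level energies here are hypothetical): the occupancies always sum to one, the ground state dominates at low temperature, and the levels equalize at high temperature.

```python
import math

def occupancies(E, kBT):
    """Boltzmann occupancies N_i/N_total = exp(-E_i/(k_B T)) / Q."""
    w = [math.exp(-e / kBT) for e in E]
    Q = sum(w)                       # partition function: sum over all states
    return [wi / Q for wi in w], Q

E = [0.0, 1.0, 2.0]                  # hypothetical level energies (arbitrary units)

p_cold, _ = occupancies(E, kBT=0.1)  # low T: ground state dominates
p_hot, _ = occupancies(E, kBT=100.0) # high T: levels nearly equally populated

assert abs(sum(p_cold) - 1) < 1e-12  # dividing by Q guarantees normalization
assert p_cold[0] > 0.99              # almost everything in the lowest state
assert max(p_hot) - min(p_hot) < 0.01
```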

"The partition function Q is the sum over all possible states, exp(-beta E_i / k_B)" - MIT OCW Instructor

Concluding Connections and Outlook

In closing, the lecturer emphasizes that the maximum entropy condition under isolation leads directly to a Boltzmann-type distribution, with the Arrhenius rate law and other thermodynamic relations emerging as natural corollaries or special cases. The talk reiterates that while the distribution is powerful and widely applicable, its use rests on explicit assumptions: fixed total energy and particle number, an isolated system, and a finite set of distinct states with well-defined energies. The takeaway is not simply the formula N_i/N_total ∝ exp(-β E_i/k_B) but the deeper principle that entropy maximization under appropriate constraints yields the probabilities that govern the microscopic configurations of many-body systems.
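As one such corollary, the Arrhenius rate law k = A exp(-E_a/(R T)) carries the same Boltzmann-type exponential. The sketch below uses a hypothetical activation energy and pre-exponential factor (neither is from the lecture) to show the familiar rule of thumb that a modest temperature increase can nearly double a reaction rate:

```python
import math

R_gas = 8.314        # J/(mol K), molar gas constant
Ea = 50_000.0        # J/mol, hypothetical activation energy (~50 kJ/mol)
A = 1.0e13           # 1/s, hypothetical pre-exponential factor

def arrhenius(T):
    """Arrhenius rate constant k = A exp(-Ea / (R T)), a Boltzmann-type factor."""
    return A * math.exp(-Ea / (R_gas * T))

k300 = arrhenius(300.0)
k310 = arrhenius(310.0)
print(k310 / k300)   # close to 2: roughly a doubling for a 10 K increase
```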

"This distribution is the occupancy of state I, and it's exponential in E_i" - MIT OCW Instructor

To find out more about the video and MIT OpenCourseWare go to: Lecture 28: Boltzmann Hypothesis.

Related posts

  • Lecture 29: Boltzmann Distribution (MIT OpenCourseWare, 23/10/2023)
  • Lecture 27: Introduction to Statistical Thermodynamics (MIT OpenCourseWare, 23/10/2023)
  • Lecture 5: Second Law and Entropy Maximization (MIT OpenCourseWare, 23/10/2023)
  • Lecture 1: Introduction to Thermodynamics (MIT OpenCourseWare, 13/05/2024)