The idea of this post is to show that the classical relations of thermodynamics are simple consequences of elementary probability theory, and in particular of exponential families.

The mathematical setting is thus the same as for exponential families.

There is a space of microstates $X$, with a support measure $δx$.

A macrostate is now a probability distribution on $X$.

Functions of the macrostate

There is also an energy function $E \colon X \times Y \to \RR$. Note the dependence on another set $Y$, which represents an arbitrary set of macroscopic variables, such as volume or pressure. We write $E_V \colon X \to \RR$ for the energy function at a fixed value of the macroscopic variables, collectively denoted by $V$.

The (mean) energy $\mathsf{U}$ of a given macrostate $p$ is then \[ \mathsf{U}(p,V) := \langle E_V \rangle_p = \int E_V(x) p(x) δx \]

Recall that the entropy of a distribution $p$ is defined by \[ \mathsf{S}(p) := \langle -\log(p) \rangle_p = \int -\log(p(x)) p(x) δx \]
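To make the definitions concrete, here is a minimal numerical sketch with a finite microstate space and the counting measure as $δx$; the energies and probabilities below are arbitrary illustrative values.

```python
import numpy as np

# Finite microstate space X = {0, 1, 2, 3}, counting measure for δx.
E_V = np.array([0.0, 1.0, 1.0, 2.0])   # illustrative energies E_V(x) at a fixed V
p   = np.array([0.4, 0.3, 0.2, 0.1])   # a macrostate: a probability distribution on X

U = np.sum(E_V * p)           # mean energy  U(p, V) = <E_V>_p
S = np.sum(-np.log(p) * p)    # entropy      S(p)    = <-log p>_p
print(U, S)
```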

Heat and Work

By differentiation, we immediately get the first law of thermodynamics:

\[ \dd \mathsf{U} = \underbrace{\frac{∂\mathsf{U}}{∂V} \dd V}_{δW} + \underbrace{\frac{∂\mathsf{U}}{∂p} \dd p}_{δQ} \]

Differentiating the product $E_V(x)\,p(x)$ under the integral, we see that the work $δW$ is \[ δW := \int \frac{∂E_V}{∂V}(x)\, p(x)\, δx \, \dd V \] and the heat $δQ$ is \[ δQ := \int E_V(x)\, \dd p(x)\, δx \]
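As a numerical sanity check, here is a small sketch (the energy $E_V(x) = Vx$ and the path of macrostates are hypothetical choices) verifying that $\dd \mathsf{U} ≈ δW + δQ$ under a small simultaneous change of $V$ and $p$.

```python
import numpy as np

# Finite-difference check of dU = δW + δQ on a toy system.
x = np.array([0.0, 1.0, 2.0, 3.0])      # microstates

def energy(V):                          # illustrative energy E_V(x) = V * x
    return V * x

def macrostate(t):                      # an arbitrary path t -> p(t) of macrostates
    w = np.exp(-t * x)
    return w / w.sum()

V0, V1 = 1.0, 1.001                     # small change of the macroscopic variable
p0, p1 = macrostate(1.0), macrostate(1.002)

dU = np.sum(energy(V1) * p1) - np.sum(energy(V0) * p0)
dW = np.sum(x * p0) * (V1 - V0)         # ∫ (∂E_V/∂V) p δx dV, with ∂E_V/∂V = x here
dQ = np.sum(energy(V0) * (p1 - p0))     # ∫ E_V dp δx
print(dU, dW + dQ)                      # agree up to second-order terms
```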

Equilibrium

The next fundamental principle is that the state $p$ maximises entropy, given the constraint that its mean energy is $\mathsf{U}(p,V) = U$, so, at equilibrium,

\[ p_V(x) = \exp\Big(E_V(x) θ - A_V(θ)\Big) \] where $θ$ is the temperature (the Lagrange multiplier associated with the energy constraint, a function of $U$ and $V$; with this sign convention, $-θ$ is the usual inverse temperature), and $A_V(θ) = A(θ,V)$ is the free energy (the log-partition function) for the temperature $θ$.
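Numerically, finding the equilibrium macrostate with a prescribed mean energy amounts to solving for $θ$. Here is a minimal sketch, where the discrete energies, the target value of $U$, and the use of `scipy.optimize.brentq` as a root-finder are all illustrative choices.

```python
import numpy as np
from scipy.optimize import brentq

E = np.array([0.0, 1.0, 2.0, 3.0])     # illustrative microstate energies
U_target = 1.2                          # prescribed mean energy

def A(theta):                           # log-partition function A(θ) = log Σ exp(θ E(x))
    return np.log(np.sum(np.exp(theta * E)))

def mean_energy(theta):                 # <E> under p(x) = exp(θ E(x) - A(θ))
    p = np.exp(theta * E - A(theta))
    return np.sum(E * p)

theta = brentq(lambda t: mean_energy(t) - U_target, -50.0, 50.0)
p_eq = np.exp(theta * E - A(theta))     # the maximum-entropy macrostate
print(theta, p_eq, np.sum(E * p_eq))    # θ is negative here: U_target is below the uniform mean
```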

From now on, $U$ will denote a coordinate in the space of equilibria, defined by $U := \mathsf{U}(p_V,V)$.

Now $(θ,V)$ and $(U,V)$ are just two ways of parametrising the two-dimensional manifold of equilibria.

Let us denote the equilibrium entropy by $S(U, V) := \mathsf{S}(p_V)$.

From general properties of exponential families, we immediately get: \[ U = \frac{∂A}{∂θ}(θ,V) \] as well as \[ θ = -\frac{∂S}{∂U}(U,V) \] \[ \frac{∂^2A}{∂θ^2} ≥ 0 \] and the equality relating the variables $(U,V)$ and $(θ,V)$: \[ A(θ,V) - S(U,V) = θU \]
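For the record, here is a quick check of these identities, writing $A$ in log-partition form: \[ A(θ,V) = \log \int \exp\big(θ E_V(x)\big)\, δx \qquad \frac{∂A}{∂θ}(θ,V) = \int E_V(x)\, p_V(x)\, δx = U \qquad \frac{∂^2A}{∂θ^2}(θ,V) = \langle E_V^2 \rangle_{p_V} - \langle E_V \rangle_{p_V}^2 ≥ 0 \] and \[ S(U,V) = \langle -\log p_V \rangle_{p_V} = A(θ,V) - θU \qquad \frac{∂S}{∂U} = \frac{∂A}{∂θ}\frac{∂θ}{∂U} - U\frac{∂θ}{∂U} - θ = -θ \] using $U = \frac{∂A}{∂θ}$ in the last step.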

Spontaneous Changes

These have to do with macrostates which are not at equilibrium. The second law states that the entropy $\mathsf{S}$ of an isolated system spontaneously increases: if the system evolves from a macrostate $p_1$ to a macrostate $p_2$, then \[ \mathsf{S}(p_1) ≤ \mathsf{S}(p_2) \]

If we define the free energy of such a state by $\mathsf{A}(p,V) := θ(\mathsf{U}(p,V),V)\, \mathsf{U}(p,V) + \mathsf{S}(p)$, consistently with the equilibrium relation above, then we also obtain the relation \[ \mathsf{A}(p_1) ≤ \mathsf{A}(p_2) \]
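Indeed, for an isolated system the mean energy $\mathsf{U}(p,V) = U$ and the volume $V$ are fixed along the process, so the term $θ(U,V)\,U$ is a constant and the inequality is just the second law restated. The same inequality holds for a system at fixed $V$ exchanging heat with a large bath, taking $θ$ in the definition of $\mathsf{A}$ to be the (constant) bath temperature: applying the second law to the isolated total system, \[ 0 ≤ \dd \mathsf{S} + \dd S_{\mathrm{bath}} = \dd \mathsf{S} - θ\, δQ_{\mathrm{bath}} = \dd \mathsf{S} + θ\, δQ = \dd \mathsf{S} + θ\, \dd\mathsf{U} = \dd \mathsf{A} \] using, for the bath, the relation $\dd S = -θ\, δQ$ derived below, together with $δQ_{\mathrm{bath}} = -δQ$ and, since the system does no work at fixed $V$, $δQ = \dd\mathsf{U}$.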

Relation between Heat and Entropy

From the definition of $\mathsf{S}$, and using that $p$ is a probability distribution (so that $\int \dd p(x)\, δx = \dd \int p(x)\, δx = 0$), we get \[ \dd \mathsf{S} = \int -\log(p(x))\, \dd p(x)\, δx \]

Under the reversibility assumption, meaning that $p$ always stays at maximum entropy, it must be an exponential distribution $p_V$, so \[ \dd S = \int \big(A(θ,V) - θ E_V(x)\big)\, \dd p(x)\, δx = -θ \int E_V(x)\, \dd p(x)\, δx = -θ\, δQ \] and we get the fundamental relation between entropy, heat and temperature during a reversible change:

\[ \dd S = - θ δQ \]

Classical Relations

As $\dd S = -θ\, \dd U + \frac{∂S}{∂V}\dd V$, we get, for a reversible transformation: \[ δW = \frac{1}{θ} \frac{∂S}{∂V} \dd V \qquad δQ = -\frac{\dd S}{θ} \]
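Spelled out: substituting $\dd U = δW + δQ$ and the reversible relation $\dd S = -θ\, δQ$ into this expansion gives \[ \dd S = -θ\, δW - θ\, δQ + \frac{∂S}{∂V}\dd V = -θ\, δW + \dd S + \frac{∂S}{∂V}\dd V \] so $θ\, δW = \frac{∂S}{∂V}\dd V$, which is the expression for $δW$; the expression for $δQ$ is just the reversible relation rearranged.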

Temperature equilibrium

The setting is now that of a product of two microstate spaces $X_1$ and $X_2$, so $X = X_1 \times X_2$, with an additive energy $E(x_1,x_2) = E_1(x_1) + E_2(x_2)$.

We also assume that the distributions are independent, so $p(x_1,x_2) = p_1(x_1) p_2(x_2)$. In this case, one can check that $S = S_1 + S_2$. One also has $U = U_1 + U_2$: with \[ U_1 = \int_{X_1}E_1(x_1) p_1(x_1) δx_1 \qquad U_2 = \int_{X_2}E_2(x_2) p_2(x_2) δx_2 \] we get \[ U = \int_{X_1 \times X_2} (E_1(x_1) + E_2(x_2)) p_1(x_1) p_2(x_2) δx_1 δx_2 = U_1 + U_2 \]
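The entropy identity is a one-line computation from the product form of $p$: \[ \mathsf{S}(p) = \int -\log\big(p_1(x_1)\, p_2(x_2)\big)\, p_1(x_1)\, p_2(x_2)\, δx_1 δx_2 = \int -\log(p_1(x_1))\, p_1(x_1)\, δx_1 + \int -\log(p_2(x_2))\, p_2(x_2)\, δx_2 = S_1 + S_2 \]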

If the total system is isolated, the total energy is fixed, so $\dd U_1 = -\dd U_2$; since $\frac{∂S_i}{∂U_i} = -θ_i$, \[ \dd S = \dd S_1 + \dd S_2 = \frac{∂S_1}{∂U_1} \dd U_1 + \frac{∂S_2}{∂U_2} \dd U_2 = (θ_1 - θ_2)\, \dd U_2 \] At equilibrium the total entropy is maximal, so $\dd S = 0$ for every energy exchange $\dd U_2$, and therefore $θ_1 = θ_2$: the two subsystems end up at the same temperature.
