## From combinatorics to entropy

Let $N = n_1 + \cdots + n_k$ and $p_i = \frac{n_i}{N}$. Then

$\displaystyle \log\left( \frac{N!}{n_1! \cdots n_k!} \right) \approx - N \sum_i p_i \log p_i$

by Stirling’s formula. This is probably well known to people who have studied statistical physics, but some of us, myself included, may not be aware of this kind of relation.
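To see why, one can plug in the crude form of Stirling’s formula, $\log n! \approx n \log n - n$, and use $\sum_i n_i = N$:

$\displaystyle \log \frac{N!}{n_1! \cdots n_k!} \approx (N \log N - N) - \sum_i (n_i \log n_i - n_i) = -\sum_i n_i \log \frac{n_i}{N} = -N \sum_i p_i \log p_i.$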

I wonder if this was the first time ever in human history that such an expression $\sum_i p_i \log p_i$ appeared! Entropy is often too abstract for me. The approximation above is a link between counting combinations and entropy, and it seems to provide the most concrete grasp of it. This is the genius of Boltzmann, Maxwell, and Gibbs, which led to the development of statistical mechanics. Energy, entropy, free energy, enthalpy, the Legendre transform, etc. are still difficult for me to understand, though.
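As a quick sanity check of the approximation, here is a minimal Python sketch (the helper names `log_multinomial` and `entropy_approx` are just illustrative) comparing the exact log of the multinomial coefficient with $-N \sum_i p_i \log p_i$:

```python
from math import lgamma, log

def log_multinomial(counts):
    # Exact log of N! / (n_1! ... n_k!) via the log-gamma function.
    N = sum(counts)
    return lgamma(N + 1) - sum(lgamma(n + 1) for n in counts)

def entropy_approx(counts):
    # Stirling-based approximation: -N * sum_i p_i log p_i.
    N = sum(counts)
    return -sum(n * log(n / N) for n in counts if n > 0)

counts = [300, 500, 200]  # arbitrary example counts
print(log_multinomial(counts))  # exact value
print(entropy_approx(counts))   # entropy approximation (close for large N)
```

The two numbers agree well when the $n_i$ are large, since the error in $\log n! \approx n \log n - n$ grows only logarithmically in $n$.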