From combinatorics to entropy

Let N = n_1 + \cdots + n_k and p_i = \frac{n_i}{N}. Then

\displaystyle \log \frac{N!}{n_1! \cdots n_k!} \approx - N \sum_i p_i \log p_i
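To see why, apply the crude form of Stirling's formula, \log n! \approx n \log n - n (the \frac{1}{2}\log(2\pi n) correction terms are lower order):

\displaystyle \log \frac{N!}{n_1! \cdots n_k!} \approx (N \log N - N) - \sum_i (n_i \log n_i - n_i) = N \log N - \sum_i n_i \log n_i = -\sum_i n_i \log \frac{n_i}{N} = -N \sum_i p_i \log p_i,

where the -N and +\sum_i n_i cancel because \sum_i n_i = N.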

by Stirling’s formula. This relation is probably well known to anyone who has studied statistical physics, but some of us, myself included, may not be aware of it.
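The approximation is easy to check numerically. Here is a small Python sketch (the counts are chosen arbitrarily for illustration) comparing the exact logarithm of the multinomial coefficient with the entropy expression:

```python
import math

def log_multinomial(counts):
    """Exact log of N! / (n_1! ... n_k!), via log-gamma
    to avoid computing huge factorials directly."""
    N = sum(counts)
    return math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in counts)

def entropy_approx(counts):
    """The Stirling approximation: -N * sum_i p_i log p_i."""
    N = sum(counts)
    return -N * sum((n / N) * math.log(n / N) for n in counts if n > 0)

counts = [300, 500, 200]  # arbitrary example with N = 1000
exact = log_multinomial(counts)
approx = entropy_approx(counts)
print(exact, approx, abs(exact - approx) / exact)
```

For these counts the two values agree to within about one percent, and the relative error shrinks as N grows, since the neglected Stirling correction terms are only logarithmic in N.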

I wonder whether this was the first time in human history that an expression like \sum_i p_i \log p_i appeared! Entropy has always felt too abstract to me. The approximation above is a link between counting combinations and entropy, and it gives the most concrete grasp I know of. This is the genius of Boltzmann, Maxwell and Gibbs, which led to the development of statistical mechanics. Energy, entropy, free energy, enthalpy, the Legendre transform, etc. are still difficult for me to understand, though.
