
Entropy and Thermodynamics

This post does not directly deal with deep learning; rather, it's a passion project from last year, when I was obsessed with understanding the physics of entropy better. Entropy is a concept from thermodynamics that snuck into computer science following the work of Shannon.

The second law of thermodynamics has taken on a life of its own as an adage for systems succumbing to chaos and decay. It states that the entropy of an isolated system never decreases. This always struck me as one of those profound moments where mathematics suddenly jumps to a general statement about the world.
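In symbols, with $S$ denoting the entropy of a system that exchanges neither energy nor matter with its surroundings,

$$\Delta S \ge 0,$$

with equality holding only for reversible processes.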

So, I wanted to understand this claim better. What is entropy? Where does it come from? It really, really doesn't help that standard introductions to entropy assume countably many microstates, an assumption that is extremely dubious and provides no foundation for understanding how entropy generalizes.
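To make the complaint concrete: the standard introduction I have in mind defines entropy by counting microstates, as in Boltzmann's formula

$$S = k_B \ln \Omega,$$

where $\Omega$ is the number of equally likely microstates and $k_B$ is Boltzmann's constant. A definition built on counting offers no obvious route to systems with continuous state spaces.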

I am also fascinated by how different laws depend on each other, and I tried to see how far I could get without assuming, say, Newton's laws. If Newton's laws turn out to be an unnecessary assumption, the results would be just as relevant in relativistic settings, where those laws break down, as in non-relativistic ones.

Here's a link to my blog post from a year back. If anyone is curious about thermodynamics, do check it out. I'm very happy with the writing and with how accessible it is, as much as a dense mathematical exploration can be, anyway. This does mean, however, that it is not extremely formal, and lots of tricks and theorems have been deliberately avoided.

Of course, thermodynamics is a highly developed field and my insights and derivations have doubtless been discovered by others numerous times. If anyone comes across similar approaches, let me know!