Entropy is not the easiest thing to understand.
It is rumored to quantify information and disorder, but it is far from obvious why.
What do logarithms and sums have to do with the concept of information?
Let me explain!
🐦🔗: https://twitter.com/TivadarDanka/status/1475456688547250176