Entropy formula

9/1/2023

How surprising is an event? Informally, the lower the probability you would have assigned to an event, the more surprising it is, so surprise seems to be some kind of decreasing function of probability. It's reasonable to ask that it be continuous in the probability. And if event $A$ has a certain amount of surprise, and event $B$ has a certain amount of surprise, and you observe them together, and they're independent, it's reasonable that the amount of surprise adds. From here it follows that the surprise you feel at event $A$ happening must be a positive constant multiple of $-\log \mathbb{P}(A)$.

Entropy is a measure of uncertainty and was introduced in the field of information theory by Claude E. Shannon. It is in some sense a measure of disorder: in thermodynamics, entropy is a measure of the molecular disorder or randomness of a system, and the second law states that entropy can be created but it cannot be destroyed. A container of ideal gas has an entropy value, just as it has a pressure, a volume, and a temperature. The symbol for entropy is $S$, and the units are J/K. Unlike $P$, $V$, and $T$, which are quite easy to measure, the entropy of a system is difficult to calculate.

For a letter drawn uniformly from an alphabet $A$ of size $n$, every outcome has probability $1/n$, so in this case the entropy, measured in characters of an alphabet $B$ of size $m$, depends only on the sizes of $A$ and $B$:

$$\log_m(n)$$

To prove this is the correct function for the entropy, we consider an encoding $E: A^r \rightarrow B^s$ that encodes blocks of $r$ letters in $A$ as $s$ characters in $B$:

$$L(m_A) = r, \quad L(m_B) = s, \quad m_B = E(m_A)$$

The size of the domain is $n^r$ and the size of the range is $m^s$. The size of the range of $E$ must be greater than or equal to the size of the domain, since otherwise two different messages in the domain would have to map to the same encoding in the range. We choose $s$ to satisfy the following inequalities, making it the smallest block length that can still encode every message:

$$m^{s-1} < n^r \leq m^s$$

Taking logarithms base $m$ gives $s - 1 < r \log_m(n) \leq s$, so $s/r \rightarrow \log_m(n)$ as $r \rightarrow \infty$: on average, each letter of $A$ costs $\log_m(n)$ characters of $B$, which is exactly the formula above.
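As a quick numerical check of the block-coding argument, here is a minimal Python sketch (not from the original post; the function name `block_length` and the example alphabet sizes are my own choices) that computes the smallest $s$ with $n^r \leq m^s$ and shows $s/r$ approaching $\log_m(n)$ as $r$ grows:

```python
import math

def block_length(n: int, m: int, r: int) -> int:
    """Smallest s with n**r <= m**s: the number of base-m characters
    needed to encode a block of r letters from an alphabet of size n."""
    block = n ** r  # size of the domain A^r
    size = 1        # size of the range B^s as s grows
    s = 0
    while size < block:
        size *= m
        s += 1
    return s

# Example: encoding a 26-letter alphabet in bits (n = 26, m = 2).
n, m = 26, 2
for r in (1, 10, 100, 1000):
    s = block_length(n, m, r)
    print(f"r={r:5d}  s={s:6d}  s/r={s / r:.6f}")
print(f"log_m(n) = {math.log(n, m):.6f}")
```

With these sizes, $s/r$ falls from $5$ at $r = 1$ toward $\log_2(26) \approx 4.7004$, matching the entropy formula.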