shannon (unit)

The shannon (symbol: Sh), more commonly known as the bit, is a unit of information and of entropy defined by IEC 80000-13. One shannon is the information content of an event occurring when its probability is 1/2.[1] It is also the entropy of a system with two equally probable states. If a message is made of a sequence of a given number of bits, with all possible bit strings being equally likely, the message's information content expressed in shannons is equal to the number of bits in the sequence.[2] For this and historical reasons, the unit is more commonly known as the bit. The introduction of the term shannon provides an explicit distinction between the amount of information that is expressed and the quantity of data that may be used to represent that information. IEEE Std 260.1-2004 still defines the unit for this meaning as the bit, with no mention of the shannon.
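
As a short worked illustration of these statements (a sketch using the usual self-information formula, not the standard's exact wording), the information content of an event of probability p, measured in shannons, is the base-2 logarithm of 1/p:

    I(p) = \log_2\!\frac{1}{p}\ \text{Sh},
    \qquad I(1/2) = \log_2 2 = 1\ \text{Sh},
    \qquad I(2^{-n}) = \log_2 2^{n} = n\ \text{Sh}.

The last equality covers the n-bit message case, since each of the 2^n equally likely bit strings has probability 2^{-n}.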

The shannon can be converted to other information units according to

1 Sh = 1 bit ≈ 0.693 nat ≈ 0.301 Hart.
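
These factors follow from a change of logarithm base: one shannon corresponds to ln 2 nats and log10 2 hartleys. As a quick numerical check (a minimal sketch, not drawn from the standard itself):

    import math

    # Change of logarithm base: an information content of log2(x) shannons
    # equals ln(x) nats and log10(x) hartleys, so the factors for 1 Sh are:
    sh_to_nat = math.log(2)     # natural log of 2  ≈ 0.6931 nat
    sh_to_hart = math.log10(2)  # base-10 log of 2  ≈ 0.3010 Hart

    print(f"1 Sh = {sh_to_nat:.4f} nat = {sh_to_hart:.4f} Hart")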

The shannon is named after Claude Shannon, the founder of information theory.

See also

  • ban

References
