entropy

English

Etymology

First attested in 1867 as a translation of German Entropie, coined in 1865 by Rudolf Clausius in analogy to energy (Energie), replacing the root of Ancient Greek ἔργον (érgon, work) with Ancient Greek τροπή (tropḗ, transformation).

We might call S the transformational content of the body, just as we termed U its thermal and ergonal content. But as I hold it to be better to borrow terms for important magnitudes from ancient languages, so that they may be adopted unchanged in all modern languages, I propose to call the magnitude S the entropy of the body, from the Greek word τροπὴ, transformation. I have intentionally formed the word entropy so as to be as similar as possible to the word energy; for the two magnitudes to be denoted by these words are so nearly allied in their physical meanings, that a certain similarity in design appears to be desirable.

Pronunciation

  • IPA(key): /ˈɛntɹəpi/

Noun

entropy (countable and uncountable, plural entropies)

  1. (thermodynamics, countable)
    1. Strictly, thermodynamic entropy: a measure of the amount of energy in a physical system that cannot be used to do work.
      The thermodynamic free energy is the amount of work that a thermodynamic system can perform; it is the internal energy of the system minus the energy that cannot be used to perform work. That unusable energy is the entropy of the system multiplied by its temperature. (Note that, for both the Gibbs and Helmholtz free energies, the temperature is assumed fixed, so entropy is effectively directly proportional to the unusable energy; the standard relations are sketched after these definitions.)
    2. A measure of the disorder present in a system.
      Ludwig Boltzmann defined entropy as directly proportional to the natural logarithm of the number of microstates yielding an equivalent thermodynamic macrostate (with the eponymous constant of proportionality). Assuming, by the fundamental postulate of statistical mechanics, that all microstates are equally probable, this means, on the one hand, that macrostates with higher entropy are more probable and, on the other hand, that for such macrostates the quantity of information required to describe a particular one of their microstates is higher. That is, the Shannon entropy of a macrostate is directly proportional to the logarithm of the number of equivalent microstates making it up (the corresponding formulas are sketched after these definitions). In other words, thermodynamic and informational entropies are closely related, which should not be surprising, since Claude Shannon derived the notation 'H' for information entropy from Boltzmann's H-theorem.
    3. The capacity factor for thermal energy that is hidden with respect to temperature.
    4. The dispersal of energy; how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature.
  2. (statistics, information theory, countable) A measure of the amount of information and noise present in a signal (a computational sketch follows these definitions).
  3. (uncountable) The tendency of a system that is left to itself to descend into chaos.
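
The free-energy relation paraphrased under sense 1.1 is conventionally written as follows; these are the standard textbook formulas rather than anything stated in this entry, with U the internal energy, H the enthalpy, T the absolute temperature, and S the entropy:

  \[ F = U - TS \qquad \text{(Helmholtz free energy)} \]
  \[ G = H - TS \qquad \text{(Gibbs free energy)} \]

At fixed T, the portion of the energy unavailable for work is the term TS, which is why sense 1.1 describes the unusable energy as entropy multiplied by temperature.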
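
The statistical definition in sense 1.2 is likewise usually stated compactly; the following are the standard Boltzmann and Shannon formulas (given here for illustration, not taken from this entry), with k_B the Boltzmann constant, W the number of equally probable microstates, and p_i the probability of microstate i:

  \[ S = k_B \ln W \]
  \[ H = -\sum_i p_i \log p_i \]

When all W microstates are equally probable (p_i = 1/W), the Shannon entropy reduces to log W, so the two quantities differ only by the constant k_B and the base of the logarithm, which is the compatibility the note under sense 1.2 describes.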
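
For sense 2, a minimal sketch of how the information-theoretic entropy of a signal might be estimated from its symbol frequencies; the function name and example signals are illustrative and are not part of this entry:

  import math
  from collections import Counter

  def shannon_entropy(symbols):
      """Shannon entropy, in bits, of the empirical symbol distribution of `symbols`."""
      counts = Counter(symbols)
      total = len(symbols)
      # Sum of -p * log2(p) over the observed symbol probabilities.
      return sum(-(c / total) * math.log2(c / total) for c in counts.values())

  # A uniform four-symbol signal carries 2 bits per symbol; a constant signal carries 0.
  print(shannon_entropy("ABCD" * 100))  # 2.0
  print(shannon_entropy("AAAA" * 100))  # 0.0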

Further reading

  • entropy in Webster’s Revised Unabridged Dictionary, G. & C. Merriam, 1913.
  • entropy in The Century Dictionary, New York, N.Y.: The Century Co., 1911.
  • entropy at OneLook Dictionary Search
