entropy

Physics
  • Symbol S. A measure of the unavailability of a system’s energy to do work; in a closed system an increase in entropy is accompanied by a decrease in energy availability. When a system undergoes a reversible change the entropy (S) changes by an amount equal to the energy (Q) transferred to the system by heat divided by the thermodynamic temperature (T) at which this occurs, i.e. ΔS = ΔQ/T. However, all real processes are to a certain extent irreversible changes and in any closed system an irreversible change is always accompanied by an increase in entropy.

    In a wider sense entropy can be interpreted as a measure of disorder; the higher the entropy the greater the disorder. As any real change to a closed system tends towards higher entropy, and therefore higher disorder, it follows that the entropy of the universe (if it can be considered a closed system) is increasing and its available energy is decreasing (see heat death of the universe). This increase in the entropy of the universe is one way of stating the second law of thermodynamics. See also Sackur–Tetrode equation.

    http://web.lemoyne.edu/~GIUNTA/clausius.html Translations of papers by Clausius on entropy (1850) and the second law (1865), published in Annalen der Physik und Chemie
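
    As an illustrative numeric sketch (the values below are assumed for the example, not taken from the source), ΔS = ΔQ/T can be evaluated directly for a reversible isothermal change such as the melting of ice at 0 °C:

    ```python
    # Entropy change for a reversible isothermal process, using dS = dQ/T.
    # Assumed values: melting 1 kg of ice at 0 °C absorbs about 334 kJ.
    Q = 334_000.0    # heat transferred to the system, in joules
    T = 273.15       # thermodynamic temperature, in kelvin

    delta_S = Q / T  # entropy change of the system, in J/K
    print(f"dS = {delta_S:.1f} J/K")  # about 1222.8 J/K
    ```

    Because heat enters the system (Q > 0), its entropy rises; in this reversible case the surroundings lose the same heat at the same temperature, so the total entropy is unchanged.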


Astronomy
  • A measure of disorder in a system; the higher the entropy, the greater the disorder. In a closed system an increase in entropy is accompanied by a decrease in energy availability. The Universe itself can be regarded as a closed system; therefore its entropy is increasing and its available energy is decreasing. See also Heat Death of the Universe.


Statistics
  • See diversity index.


Chemistry
  • A measure of the unavailability of a system’s energy to do work; in a closed system, an increase in entropy is accompanied by a decrease in energy availability. When a system undergoes a reversible change the entropy (S) changes by an amount equal to the energy (Q) transferred to the system by heat divided by the thermodynamic temperature (T) at which this occurs, i.e. ΔS = ΔQ/T. However, all real processes are to a certain extent irreversible changes and in any closed system an irreversible change is always accompanied by an increase in entropy.

    In a wider sense entropy can be interpreted as a measure of disorder; the higher the entropy the greater the disorder (see Boltzmann formula). As any real change to a closed system tends towards higher entropy, and therefore higher disorder, it follows that the entropy of the universe (if it can be considered a closed system) is increasing and its available energy is decreasing. This increase in the entropy of the universe is one way of stating the second law of thermodynamics.
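
    The Boltzmann formula referred to above, S = k ln W, links entropy to the number W of microstates consistent with a macroscopic state. A minimal sketch (the microstate counts below are assumed, purely for illustration):

    ```python
    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K (exact by SI definition)

    def boltzmann_entropy(W: float) -> float:
        """Entropy S = k_B * ln(W) for W equally probable microstates."""
        return k_B * math.log(W)

    # Doubling the number of accessible microstates raises S by k_B * ln 2,
    # regardless of the starting value of W:
    delta = boltzmann_entropy(2e24) - boltzmann_entropy(1e24)
    print(delta)  # k_B * ln 2, about 9.57e-24 J/K
    ```

    The additive constant k_B ln 2 per doubling is why entropy, like the disorder it measures, grows with the number of ways a state can be realized.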


Chemical Engineering
  • The extent to which energy in a closed system is unavailable to do useful work. An increase in entropy occurs when the free energy decreases or when the disorder of molecules increases. For a reversible process, such as a friction-free adiabatic expansion or compression, entropy remains constant. The change in entropy is defined as:

    dS = dQ/T

    where Q is the heat transferred to or from a system, and T is the absolute temperature. However, all real processes are irreversible, which means that in a closed system there is a small increase in entropy.

    Entropy can be regarded as a measure of disorder: the higher the value, the higher the level of disorder. A closed system tends towards higher entropy and therefore higher disorder. For example, if a box containing two layers of coloured balls is shaken, the balls become randomly mixed and will not return to their original layered arrangement without additional work being done to separate them. See first law of thermodynamics.


Computer
  • A measure of the amount of information that is output by a source, or throughput by a channel, or received by an observer (per symbol or per second). Following Shannon (1948) and later writers, the entropy of a discrete memoryless source with alphabet A = {ai} of size n, and output X at time t is

    H(X) = Σ_{i=0}^{n−1} p(x_i) log_b(1/p(x_i))

    where

    p(x_i) = Prob{X_t = a_i}

    The logarithmic base b is chosen to give a convenient scale factor. Usually b = 2, b = e (= 2.71828…), or b = 10; entropy is then measured in bits, in natural units (nats), or in Hartleys, respectively. When the source has memory, account has to be taken of the dependence between successive symbols output by the source.

    The term arises by analogy with entropy in thermodynamics, where the defining expression has the same form but with a physical scale factor k (Boltzmann constant) and with the sign changed. The word negentropy is therefore sometimes used for the measure of information, as is uncertainty or simply ‘information’.
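
    A minimal sketch of this definition in code (the probability values below are assumed for illustration); with b = 2 the result is measured in bits:

    ```python
    import math

    def entropy(probs, b=2):
        """Shannon entropy H(X) = sum over i of p(x_i) * log_b(1 / p(x_i))."""
        return sum(p * math.log(1.0 / p, b) for p in probs if p > 0)

    # A fair coin (two equiprobable symbols) carries 1 bit per symbol:
    print(entropy([0.5, 0.5]))  # 1.0
    # A biased source carries less information per symbol:
    print(entropy([0.9, 0.1]))  # about 0.469 bits
    ```

    The `if p > 0` guard follows the usual convention that symbols of zero probability contribute nothing, since p log(1/p) tends to 0 as p tends to 0.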


Biology
  • Symbol S. A measure of the unavailability of a system’s energy to do work; an increase in entropy is accompanied by a decrease in energy availability. When a system undergoes a reversible change the entropy (S) changes by an amount equal to the energy (Q) absorbed by the system divided by the thermodynamic temperature (T) at which the energy is absorbed, i.e. ΔS = ΔQ/T. However, all real processes are to a certain extent irreversible changes and in any closed system an irreversible change is always accompanied by an increase in entropy.

    In a wider sense entropy can be interpreted as a measure of a system’s disorder; the higher the entropy the greater the disorder. As any real change to a closed system tends towards higher entropy, and therefore higher disorder, it follows that the entropy of the universe (if it can be considered a closed system) is increasing and its available energy is decreasing. This increase in the entropy of the universe is one way of stating the second law of thermodynamics.


Geology and Earth Sciences
  • 1. Measure of disorder or unavailable energy in a thermodynamic system; the measure of increasing disorganization of the universe.

    2. See least-work principle; least-work profile.


Geography
  • A measure of disorder in a system; the higher the entropy, the greater the disorder. ‘Entropy maximization is a method used to derive the most probable state of a spatial interaction system, given information on total inflows and total outflows and capacity constraints’ (Sheppard and Plummer in N. Thrift and R. Kitchin, eds 2009). For an exceptionally clear explanation of the role of entropy in the atmosphere, see Lockwood in J. Holden (2012), p. 84. Wilson (2010) Geogr. Analysis 42, 4, 364 provides a useful summary of entropy in human geography.


Philosophy
  • A property of a closed thermodynamic system (i.e. one considered in terms of interchanges of heat and other forms of energy) corresponding to the degree to which the particles of the system are randomly arranged. Entropy is a measure of the disorder in the system. The second law of thermodynamics states that the entropy of a closed system never decreases.

