
"entropy" 怎么发音


ˈɛntɹəpi
EN-truh-pee

Definition

Entropy is a concept from physics and information theory that measures the amount of disorder or randomness in a system. In everyday language, it can also refer to things becoming more chaotic or disorganized over time.
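The information-theoretic sense mentioned above has a standard formula, Shannon entropy: H = -Σ p·log₂(p) over the probabilities of each symbol. As a minimal sketch (the function name `shannon_entropy` is illustrative, not from this page):

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Four equally likely symbols carry 2 bits of uncertainty per symbol.
print(shannon_entropy("abcd"))  # 2.0
# A single repeated symbol carries no uncertainty at all.
print(shannon_entropy("aaaa"))  # 0.0
```

Higher entropy means more randomness in the data, matching the "disorder" intuition in the definition above.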

IPA Transcription

American English

ˈɛntɹəpi

British English

ˈɛntɹəpi

Simplified Pronunciation

American

EN-truh-pee

British

EN-truh-pee

Hear It in Context

In physics, entropy measures how disordered a system is.

The entropy of the universe always increases over time.

Information theory uses entropy to measure uncertainty in data.

After the party, the kitchen was pure entropy—plates everywhere and spilled drinks.