How to Pronounce "entropy"

Definition

Entropy is a concept from physics and information theory. In thermodynamics it measures the disorder or randomness of a system; in information theory it measures the uncertainty in a source of data. In everyday language, it can also refer to things becoming more chaotic or disorganized over time.
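
The information-theoretic sense has a precise formula: Shannon entropy, H = −Σ p(x) log₂ p(x), the average number of bits needed to encode an outcome. As a rough illustration, here is a minimal Python sketch that computes it over a string's character frequencies; the function name shannon_entropy and the choice of characters as the symbols are illustrative assumptions, not from any standard library.

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Shannon entropy of a string's character distribution, in bits.

    Illustrative sketch: count how often each character appears,
    convert counts to probabilities, and sum -p * log2(p).
    """
    counts = Counter(text)
    total = len(text)
    return -sum(
        (n / total) * math.log2(n / total)
        for n in counts.values()
    )

# A repetitive string is predictable (low entropy); varied text is less so.
print(shannon_entropy("aaaa"))     # 0.0 bits: no uncertainty
print(shannon_entropy("abab"))     # 1.0 bit: two equally likely symbols
print(shannon_entropy("entropy"))  # ~2.81 bits: seven distinct characters
```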

IPA Transcription

American English

/ˈɛntɹəpi/

British English

/ˈɛntɹəpi/

Simplified Pronunciation

US

EN-truh-pee

UK

EN-truh-pee

Listen in Context

In physics, entropy measures how disordered a system is.

The entropy of the universe always increases over time.

Information theory uses entropy to measure uncertainty in data.

After the party, the kitchen was pure entropy—plates everywhere and spilled drinks.