In the fields of coding theory and information theory, a memoryless source emits a sequence of independent, identically distributed source words. The (Shannon) entropy of the source is then a lower bound on the average number of bits per source word needed to encode the source losslessly. See Shannon's source coding theorem.
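As a small illustrative sketch (the three-symbol source and the prefix code below are chosen for the example, not taken from any particular reference), one can compute the entropy of a memoryless source and check it against the average length of a concrete lossless code:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a memoryless source with the given symbol probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical example source: three symbols with probabilities 1/2, 1/4, 1/4.
probs = [0.5, 0.25, 0.25]
H = shannon_entropy(probs)

# A prefix code for these symbols (e.g. 0, 10, 11) has code lengths 1, 2, 2.
code_lengths = [1, 2, 2]
avg_len = sum(p * length for p, length in zip(probs, code_lengths))

print(H)        # 1.5 bits per symbol
print(avg_len)  # 1.5 bits per symbol: this code meets the entropy lower bound
```

Here the average code length equals the entropy exactly because every probability is a negative power of two; for general distributions the source coding theorem guarantees only that the average length can approach, but never fall below, the entropy.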