Entropy is universal rule of language

May 19, 2011 | Source: Wired

The amount of information carried in the arrangement of words is the same across all languages, even languages that aren’t related to each other. This consistency could hint at a single common ancestral language, or at universal features of how human brains process speech, according to systems biologist Marcelo Montemurro of the University of Manchester, lead author of a study published May 13 in PLoS ONE.
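The quantity at issue can be illustrated with a toy experiment: compare how compressible a text is before and after its words are randomly shuffled. Shuffling preserves word frequencies but destroys ordering, so the gap in compressibility reflects information contributed by word arrangement alone. The sketch below uses zlib compression as a crude entropy proxy; this is an assumption for illustration, not the entropy estimators used in the actual study.

```python
import random
import zlib

def compressed_bits_per_word(words):
    """Approximate the entropy of a word sequence in bits per word by
    the length of its zlib-compressed form (a rough entropy proxy)."""
    data = " ".join(words).encode("utf-8")
    return 8 * len(zlib.compress(data, 9)) / len(words)

def word_order_information(text, seed=0):
    """Estimate the information carried by word order: the compressed
    size of a shuffled copy minus that of the original. Shuffling keeps
    word frequencies intact but erases ordering, so a positive value
    indicates structure due to word arrangement."""
    words = text.split()
    shuffled = words[:]
    random.Random(seed).shuffle(shuffled)
    return compressed_bits_per_word(shuffled) - compressed_bits_per_word(words)

# Highly ordered text compresses much better than its shuffled version.
sample = ("the quick brown fox jumps over the lazy dog " * 50).strip()
print(word_order_information(sample) > 0)
```

On repetitive, structured text the ordered version compresses far better than the shuffled one, so the estimate comes out positive; the study's finding is that, with proper estimators, this word-order information is roughly constant across unrelated languages.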

Ref.: Marcelo A. Montemurro and Damián H. Zanette, “Universal Entropy of Word Ordering Across Linguistic Families,” PLoS ONE, Vol. 6, Issue 5, May 13, 2011. DOI: 10.1371/journal.pone.0019875.