Grammatical features vary widely across languages, and this variation has been documented in detail. The functions of grammatical features, however, are not entirely clear, and a number of puzzles remain. For example, why do some languages have rich feature inventories while others have few, if any, grammatical features? Why do many languages have features that appear to encode semantic information (e.g. animacy) that is already known to the listener? We present a computational framework that addresses questions like these by formalizing one way in which grammatical features aid communication. We use the model to illustrate how morpho-syntactic feature inventories help solve the problem of communicating semantic structures under cognitive pressures.