Towards semantically rich and recursive word learning models

Abstract

Current models of word learning focus on the mapping between words and their referents and remain silent on conceptual representation. We develop a cross-situational model of word learning that captures word-concept mapping by jointly inferring the referents and the underlying concept for each word. We also develop a variant of the model that incorporates recursion, capturing the idea that children can use already-learned words to aid future learning. We demonstrate both models' ability to learn kinship terms and show that adding recursion to the model speeds acquisition.
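The core intuition behind cross-situational learning, as described in the abstract, can be sketched with a toy count-based learner: across many individually ambiguous situations, each word's true referent co-occurs with it more often than any distractor. This is only an illustrative simplification, not the paper's joint-inference model; the `situations` data and all names here are invented for the example.

```python
from collections import defaultdict

def cross_situational_learn(situations):
    """Tally word-referent co-occurrences across ambiguous situations
    and map each word to its most frequently co-present referent."""
    counts = defaultdict(lambda: defaultdict(int))
    for words, referents in situations:
        for w in words:
            for r in referents:
                counts[w][r] += 1
    # Each situation alone is ambiguous; aggregating across situations
    # lets the consistent word-referent pairing win out.
    return {w: max(rs, key=rs.get) for w, rs in counts.items()}

# Toy data: each situation pairs an utterance with the visible referents.
situations = [
    ({"mama", "ball"}, {"MOTHER", "BALL"}),
    ({"mama", "dog"},  {"MOTHER", "DOG"}),
    ({"ball", "dog"},  {"BALL", "DOG"}),
]
lexicon = cross_situational_learn(situations)
```

The paper's model goes further than this word-referent tally: it jointly infers the concept underlying each word, and its recursive variant reuses learned words during later inference, which this sketch does not capture.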

Publication
Proceedings of the 37th Annual Conference of the Cognitive Science Society