A Computational Simulation of Children’s Language Acquisition (Crazy New Idea)

Author: Ben Ambridge


Author Details

Ben Ambridge
  • ESRC International Centre for Language and Communicative Development (LuCiD), University of Liverpool, UK

Cite As

Ben Ambridge. A Computational Simulation of Children’s Language Acquisition (Crazy New Idea). In 3rd Conference on Language, Data and Knowledge (LDK 2021). Open Access Series in Informatics (OASIcs), Volume 93, pp. 4:1-4:3, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Many modern NLP models are already close to simulating children's language acquisition; the main thing they currently lack is a "real world" representation of semantics that would allow them to map from form to meaning and vice versa. The aim of this "Crazy Idea" is to spark a discussion about how we might get there.
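One candidate route to such a form-meaning mapping is cross-situational learning, in which a learner tracks which candidate referents co-occur with which word forms across many ambiguous scenes. The sketch below is purely illustrative, not the author's proposal: the `learn` and `best_referent` functions, the episode format, and the referent labels are all hypothetical, and real-world semantics would of course be far richer than a set of symbols.

```python
# A minimal, illustrative sketch of cross-situational word learning:
# the learner sees utterances paired with sets of candidate referents
# and accumulates co-occurrence counts. This is a toy stand-in for the
# "real world" representation of semantics discussed in the abstract.
from collections import defaultdict

def learn(episodes):
    """episodes: list of (words, referents) pairs observed together."""
    counts = defaultdict(lambda: defaultdict(int))
    for words, referents in episodes:
        for w in words:
            for r in referents:
                counts[w][r] += 1  # count each form-meaning co-occurrence
    return counts

def best_referent(counts, word):
    """Map a word form to its most frequently co-occurring referent."""
    return max(counts[word], key=counts[word].get)

# Hypothetical learning episodes: each utterance is ambiguous on its
# own, but the ambiguity resolves across situations.
episodes = [
    (["the", "dog", "runs"], {"DOG", "PARK"}),
    (["a", "dog", "barks"], {"DOG", "CAT"}),
    (["the", "cat", "sleeps"], {"CAT", "SOFA"}),
]
model = learn(episodes)
print(best_referent(model, "dog"))  # prints DOG (co-occurs twice)
```

Even this trivially simple learner disambiguates "dog" after two episodes; the open question the abstract raises is how to scale this kind of grounding to the representations modern language models actually use.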

Subject Classification

ACM Subject Classification
  • Theory of computation → Grammars and context-free languages

Keywords and Phrases
  • child language acquisition
  • language development
  • deep learning
  • BERT
  • ELMo
  • GPT-3


