Efficient and Equitable Natural Language Processing in the Age of Deep Learning (Dagstuhl Seminar 22232)

Authors Jesse Dodge, Iryna Gurevych, Roy Schwartz, Emma Strubell, Betty van Aken and all authors of the abstracts in this report



File

DagRep.12.6.14.pdf
  • Filesize: 1.46 MB
  • 14 pages

Author Details

Jesse Dodge
  • AI2 - Seattle, US
Iryna Gurevych
  • TU Darmstadt, DE
Roy Schwartz
  • The Hebrew University of Jerusalem, IL
Emma Strubell
  • Carnegie Mellon University - Pittsburgh, US
Betty van Aken
  • Berliner Hochschule für Technik, DE
and all authors of the abstracts in this report

Cite As

Jesse Dodge, Iryna Gurevych, Roy Schwartz, Emma Strubell, and Betty van Aken. Efficient and Equitable Natural Language Processing in the Age of Deep Learning (Dagstuhl Seminar 22232). In Dagstuhl Reports, Volume 12, Issue 6, pp. 14-27, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)
https://doi.org/10.4230/DagRep.12.6.14

Abstract

This report documents the program and the outcomes of Dagstuhl Seminar 22232 "Efficient and Equitable Natural Language Processing in the Age of Deep Learning". Since 2012, the field of artificial intelligence (AI) has reported remarkable progress on a broad range of capabilities, including object recognition, game playing, speech recognition, and machine translation. Much of this progress has been achieved by increasingly large and computationally intensive deep learning models: training costs for state-of-the-art deep learning models increased 300,000-fold between 2012 and 2018 [1]. Perhaps the epitome of this trend is the subfield of natural language processing (NLP), which over the past three years has experienced even sharper growth in model size and corresponding computational requirements of the word embedding approaches (e.g., ELMo, BERT, openGPT-2, Megatron-LM, T5, and GPT-3, one of the largest models ever trained, with 175B dense parameters) that are now the basic building blocks of nearly all NLP models. Recent studies indicate that this trend is both environmentally unfriendly and prohibitively expensive, raising barriers to participation in NLP research [2,3]. The goal of this seminar was to mitigate these concerns and promote equity of access in NLP.

References
[1] D. Amodei and D. Hernandez. 2018. AI and Compute. https://openai.com/blog/ai-and-compute
[2] R. Schwartz, J. Dodge, N. A. Smith, and O. Etzioni. 2020. Green AI. Communications of the ACM (CACM).
[3] E. Strubell, A. Ganesh, and A. McCallum. 2019. Energy and Policy Considerations for Deep Learning in NLP. In Proc. of ACL.

Subject Classification

ACM Subject Classification
  • Computing methodologies → Natural language processing
  • Computing methodologies → Neural networks
  • Social and professional topics → Sustainability
Keywords
  • deep learning
  • efficiency
  • equity
  • natural language processing (NLP)
