BibTeX Export for Loss Minimization Yields Multicalibration for Large Neural Networks


@InProceedings{blasiok_et_al:LIPIcs.ITCS.2024.17,
  author =	{B{\l}asiok, Jaros{\l}aw and Gopalan, Parikshit and Hu, Lunjia and Kalai, Adam Tauman and Nakkiran, Preetum},
  title =	{{Loss Minimization Yields Multicalibration for Large Neural Networks}},
  booktitle =	{15th Innovations in Theoretical Computer Science Conference (ITCS 2024)},
  pages =	{17:1--17:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-309-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{287},
  editor =	{Guruswami, Venkatesan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2024.17},
  URN =		{urn:nbn:de:0030-drops-195452},
  doi =		{10.4230/LIPIcs.ITCS.2024.17},
  annote =	{Keywords: Multi-group fairness, loss minimization, neural networks}
}
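To cite this paper in a LaTeX document, save the entry above to a bibliography file and reference it by its key. A minimal sketch follows; the file name references.bib is illustrative and the plain bibliography style is only one possible choice:

\documentclass{article}
\begin{document}
As shown by B{\l}asiok et al.~\cite{blasiok_et_al:LIPIcs.ITCS.2024.17},
loss minimization yields multicalibration for large neural networks.

\bibliographystyle{plain}
\bibliography{references} % references.bib contains the entry above
\end{document}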

The metadata provided by Dagstuhl Publishing on its webpages, as well as the export formats (such as XML or BibTeX) available on its website, is released under the CC0 1.0 Public Domain Dedication. You are free to copy, distribute, use, modify, transform, build upon, and produce derived works from this data, even for commercial purposes, without asking permission. A link back to Dagstuhl Publishing as the source of the data is appreciated.

Read the full CC0 1.0 legal code for the exact terms that apply: https://creativecommons.org/publicdomain/zero/1.0/legalcode
