Surbhi Goel

Assistant Professor

University of Pennsylvania, Philadelphia

I am the Magerman Term Assistant Professor of Computer and Information Science at the University of Pennsylvania. I am affiliated with the theory group, the ASSET Center on safe, explainable, and trustworthy AI systems, and the Warren Center for Network and Data Sciences.

My research interests lie at the intersection of theoretical computer science and machine learning, with a focus on developing theoretical foundations for modern machine learning paradigms, especially deep learning.

Prior to this, I was a postdoctoral researcher in the Machine Learning group at Microsoft Research NYC. I obtained my Ph.D. from the Department of Computer Science at the University of Texas at Austin, advised by Adam Klivans. My dissertation received UTCS's Bert Kay Dissertation Award. My Ph.D. research was generously supported by the JP Morgan AI Fellowship and several fellowships from UT Austin. During my Ph.D., I visited the IAS for the Theoretical Machine Learning program and the Simons Institute for the Theory of Computing at UC Berkeley for the Foundations of Deep Learning program (supported by a Simons-Berkeley Research Fellowship). Before that, I received my Bachelor's degree in Computer Science and Engineering from the Indian Institute of Technology (IIT) Delhi.

For prospective students who are interested in working with me: send me an email with your CV, an overview of your research interests, and a brief description of 1-2 recent papers (not mine) you have read and enjoyed.

I am currently co-teaching CIS 5200: Machine Learning with Eric Wong.

Download my resumé.

Interests
  • Theory
  • Machine Learning
Education
  • PhD in Computer Science, 2020

    University of Texas at Austin

  • MS in Computer Science, 2019

    University of Texas at Austin

  • BTech in Computer Science and Engineering, 2015

    Indian Institute of Technology, Delhi

Recent Publications & Preprints

(2022). Transformers Learn Shortcuts to Automata. ICLR 2023 [notable-top-5%].

(2022). Recurrent Convolutional Neural Networks Learn Succinct Learning Algorithms. NeurIPS 2022.

(2022). Hidden Progress in Deep Learning: SGD Learns Parities Near the Computational Limit. NeurIPS 2022.

(2022). Inductive Biases and Variable Creation in Self-Attention Mechanisms. ICML 2022.

(2022). Understanding Contrastive Learning Requires Incorporating Inductive Biases. ICML 2022.

(2022). Anti-Concentrated Confidence Bonuses for Scalable Exploration. ICLR 2022.

Outreach

  • Co-organizer: Co-founded a community-building and mentorship initiative for the learning theory community. Co-organized mentorship workshops at ALT 2021, COLT 2021, ALT 2022, and in Fall 2022, as well as a graduate applications support program in collaboration with WiML-T.
  • Mentor

Professional Services

  • Program Committee
  • Program Committee
  • Program Committee
  • Virtual Experience Chair: Co-organized the virtual portion of the hybrid conference, including the two-day virtual-only program.
  • Program Committee
  • Program Committee
  • Treasurer