Andy J Yang

杨加锋


PhD student at the University of Notre Dame, advised by Dr. David Chiang in the NLP lab. Co-organizer for FLaNN. Supported by a Notre Dame Deans' Fellowship and an NSF Graduate Research Fellowship.

I study theoretical aspects of machine learning: proving what different kinds of machines can and cannot do.

Understanding small complexity classes will reveal the secrets of efficient natural language processing.

Publications
The Transformer Cookbook
Andy Yang, Christopher Watson, Anton Xue, Satwik Bhattamishra, Jose Llarena, William Merrill, Emile Dos Santos Ferreira, Anej Svete, and David Chiang.
Knee-deep in C-RASP: a transformer depth hierarchy
Andy Yang, Michaël Cadilhac, and David Chiang.
In Proc. NeurIPS 38. 2025. To appear.
Simulating hard attention using soft attention
Andy Yang, Lena Strobl, David Chiang, and Dana Angluin.
Transactions of the Association for Computational Linguistics, 2025. To appear.
A Formal Framework for Understanding Length Generalization in Transformers
Xinting Huang, Andy Yang, Satwik Bhattamishra, Yash Sarrof, Andreas Krebs, Hattie Zhou, Preetum Nakkiran, and Michael Hahn.
In Proc. ICLR. 2025.
Masked hard-attention transformers recognize exactly the star-free languages
Andy Yang, David Chiang, and Dana Angluin.
In Proc. NeurIPS 37, 10202–10235. 2024.
Other Projects
Notes and Such
Origami