Andy J Yang

杨加锋

PhD student at the University of Notre Dame, advised by Dr. David Chiang in the NLP lab. Co-organizer for FLaNN. Supported by a Notre Dame Deans' Fellowship and an NSF Graduate Research Fellowship.

Imagine breaking down a difficult cooking recipe into a series of easier steps to teach your younger cousin. I study how complex computations are decomposed into simpler ones inside neural networks.

Publications
The Transformer Cookbook
Andy Yang, Christopher Watson, Anton Xue, Satwik Bhattamishra, Jose Llarena, William Merrill, Emile Dos Santos Ferreira, Anej Svete, and David Chiang.
Knee-deep in C-RASP: a transformer depth hierarchy
Andy Yang, Michaël Cadilhac, and David Chiang.
In Proc. NeurIPS 38. 2025. To appear.
Simulating hard attention using soft attention
Andy Yang, Lena Strobl, David Chiang, and Dana Angluin.
Transactions of the Association for Computational Linguistics, 2025. To appear.
A Formal Framework for Understanding Length Generalization in Transformers
Xinting Huang, Andy Yang, Satwik Bhattamishra, Yash Sarrof, Andreas Krebs, Hattie Zhou, Preetum Nakkiran, and Michael Hahn.
In Proc. ICLR. 2025.
Masked hard-attention transformers recognize exactly the star-free languages
Andy Yang, David Chiang, and Dana Angluin.
In Proc. NeurIPS 37, 10202–10235. 2024.
Other Projects
Talk Slides
Notes and Such
More sprouting soon...
Origami