Jerry Liu

Program Year:
3
University:
Stanford University
Field of Study:
Computational Math
Advisor:
Chris Ré
Degree(s):
B.S. Mathematics and B.S. Computer Science, Duke University, 2022

Practicum Experience(s)

Lawrence Berkeley National Laboratory (2023)

Practicum Supervisor(s):
Michael Mahoney
Practicum Title:
Does In-Context Operator Learning Generalize to Domain-Shifted Settings?

Summary of Research

Foundation models have revolutionized language and vision modeling, but so far they have shown limited effectiveness on regression-type tasks and continuous-valued data, such as time series and PDEs. I am broadly interested in understanding the fundamental shortcomings of existing machine learning techniques using tools from numerics, statistics, and complexity theory, toward developing proper foundation models for numerical tasks. Some questions driving my recent work:
- Do existing machine learning methods only perform pattern matching, or can they learn proper (numerical) algorithms directly from data?
- Why do existing architectures struggle to perform numerical operations precisely, and can we do better?
- What does generalization mean for continuous-valued regression tasks, e.g. for differential equations?
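The precision question above can be made concrete with a toy NumPy sketch (my own illustration here, not an experiment from the publications): a classical least-squares solver reaches near machine precision, while a short run of gradient descent, the kind of iterative computation a sequence model might implicitly learn to emulate, plateaus at far lower accuracy.

```python
import numpy as np

# Toy illustration: exact least-squares vs. a few gradient steps.
rng = np.random.default_rng(0)
A = rng.standard_normal((32, 4))
x_true = rng.standard_normal(4)
b = A @ x_true  # consistent system, so the true residual is zero

# Classical solver (QR-based): accurate to near machine precision.
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

# Ten gradient-descent steps on (1/2)||Ax - b||^2 with a stable step size.
x_gd = np.zeros(4)
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / sigma_max(A)^2
for _ in range(10):
    x_gd -= step * (A.T @ (A @ x_gd - b))

err_exact = np.linalg.norm(x_exact - x_true)
err_gd = np.linalg.norm(x_gd - x_true)
print(f"exact solver error: {err_exact:.2e}")
print(f"gradient descent error: {err_gd:.2e}")
```

The gap of many orders of magnitude between the two errors is the kind of precision deficit that standard learned architectures, which behave more like the truncated iteration than the exact solver, tend to exhibit on numerical tasks.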

Publications

Jerry Weihong Liu, Jessica Grogan, Owen M. Dugan, Ashish Rao, Simran Arora, Atri Rudra, Christopher Ré. Towards Learning High-Precision Least Squares Algorithms with Sequence Models. ICLR 2025.

Roberto Garcia, Jerry Weihong Liu, Daniel Sorvisto, Sabri Eyuboglu. Adaptive Rank Allocation: Speeding Up Modern Transformers with RaNA Adapters. ICLR 2025.

Christopher Fifty, Ronald Guenther Junkins, Dennis Duan, Aniketh Iyengar, Jerry Weihong Liu, Ehsan Amid, Sebastian Thrun, Christopher Ré. Restructuring Vector Quantization with the Rotation Trick. ICLR 2025.

Zhenshuo Zhang, Jerry Weihong Liu, Christopher Ré, Hongyang R. Zhang. A Hessian View of Grokking in Mathematical Reasoning. The 4th Workshop on Mathematical Reasoning and AI, NeurIPS 2024.

Jerry Weihong Liu, Jessica Grogan, Owen M. Dugan, Simran Arora, Atri Rudra, Christopher Ré. Can Transformers Solve Least Squares to High Precision? 1st Workshop on In-Context Learning, ICML 2024.

Jerry Weihong Liu, N. Benjamin Erichson, Kush Bhatia, Michael W. Mahoney, Christopher Ré. Does In-Context Operator Learning Generalize to Domain-Shifted Settings? The Symbiosis of Deep Learning and Differential Equations III Workshop, NeurIPS 2023.

Jerry Liu, Jin Yao. Non-Diffusive Volume Advection with a High Order Interface Reconstruction Method. USNCCM, 2021.

Jerry Liu, Jin Yao. Non-Diffusive Volume Advection with a High Order Interface Reconstruction Method. Lawrence Livermore National Laboratory, United States, 2021.

Jin Yao, Jerry Liu. Volume-of-Fluids Interface Reconstruction with Curvature and Corner Definition. WCCM-ECCOMAS, 2020.

Jerry Liu, Nathan O'Hara, Alexander Rubin, Rachel Draelos, Cynthia Rudin. Metaphor Detection Using Contextual Word Embeddings From Transformers. Second Workshop on Figurative Language Processing, NAACL, 2020.

Awards

Alex Vasilos Award, Duke University: 2022.
NSF Graduate Research Fellowship: 2022 (declined).
CRA Outstanding Undergraduate Researcher, honorable mention: 2022.
Phi Beta Kappa (honor society): 2021.
Karl Menger Award, Duke University: 2019.
William Lowell Putnam Mathematical Competition (top 250 participants): 2018, 2019.