I am a first-year M.Phil. student at the University of Hong Kong, advised by Yingyu Liang. My research interests center on the mathematical principles underlying large language models (LLMs) and general intelligence, including representation, optimization, generalization, and reasoning.
I received my B.S. in Mathematics from the Department of Mathematical Sciences at Middle Tennessee State University (MTSU). I welcome research discussions and collaborations.

Publications
* denotes equal contribution.
FedEL: Federated Elastic Learning for Heterogeneous Devices
NeurIPS 2025
[pdf]
Circuit Complexity Bounds for RoPE-based Transformer Architecture*
EMNLP 2025
[pdf]
HSR-Enhanced Sparse Attention Acceleration*
CPAL 2025
[pdf]
Bypassing the Exponential Dependency: Looped Transformers Efficiently Learn In-context by Multi-step Gradient Descent*
AISTATS 2025
[pdf]
This website is adapted from Gregory Gundersen's website.