Chang Lan's Homepage
Chang Lan is a researcher at Apple, where he works on scalable and efficient LLMs. Previously, he was at Google DeepMind and ByteDance.
He received a Ph.D. from UC Berkeley and a bachelor's degree from Tsinghua University.
- Projects
- Apple Foundation Model
- AXLearn: A codebase and framework for large-scale deep learning models
- Gemini and PaLM-2
- BytePS: A distributed DNN training framework
- Blogs
- 2025-11-14 Optimal Checkpointing Frequency
- 2025-08-02 Sequence Sharding: How to train long-context LLMs
- Papers
- Contact