About
I am a member of technical staff at xAI. My research interests include machine learning systems, large language models, compilers, and distributed systems.
Previously, I completed my Ph.D. at UC Berkeley, where I was advised by Ion Stoica and Joseph E. Gonzalez. I was honored to receive the Meta PhD Fellowship and the a16z Open Source AI Grant in recognition of my research and open-source projects. I also co-founded the non-profit organization LMSYS.org to advance open large language model research. We have developed open models with millions of downloads, crowdsourced platforms with millions of users, and serving systems that are orders of magnitude faster.
The easiest way to reach me is via the SGLang Slack. We’re looking for open-source enthusiasts and learners to help build the SGLang project and community. Contact us on Slack, GitHub, or by email. If you’re a beginner and can commit 10-20 hours per week for at least six months, we can provide mentorship in machine learning systems.
Research
- Systems for training and serving large models
- Open large language models, benchmarks, and datasets
- Projects: Vicuna (lead), Chatbot Arena (lead), LMSYS-Chat-1M (lead)
- First-author publications: NeurIPS 23, ICML 24, ICLR 24, Preprint 23
- Deep learning compilers and auto-tuning
- Projects: TVM, Ansor (lead)
- First-author publications: OSDI 20, NeurIPS 21
- Other publications: OSDI 18, NeurIPS 20, ASPLOS 23
*(lead) indicates a project I led or co-led.