Ruihang Lai

“It's the start of a new journey.”


I am Ruihang Lai (赖睿航), a third-year Ph.D. student in the Computer Science Department at Carnegie Mellon University. I'm a member of the CMU Catalyst research group and am fortunate to be co-advised by Prof. Tianqi Chen and Prof. Todd Mowry.

Prior to joining CMU, I completed my undergraduate studies in the ACM Honors Class at Shanghai Jiao Tong University in June 2022, advised by Prof. Yong Yu.

I was previously a research intern at OctoML and Catalyst. I am a PMC member of the Apache TVM project.

Email: ruihangl [at] cs [dot] cmu [dot] edu

GitHub: @MasterJH5574

News

Apr 29, 2023 We just released MLC LLM, a universal solution that allows any language model to be deployed natively on a diverse set of hardware backends, as well as a productive framework for everyone to optimize model performance for their own use cases. It is the companion project of Web LLM, which brings large language models and stable diffusion models entirely into people's web browsers. Come check out the Web LLM demo and the MLC LLM applications (including an iOS app for your iPhone)!
Mar 8, 2023 We are excited to share Web Stable Diffusion, the world's first stable diffusion model running completely in the browser. Stable diffusion models are heavy and usually require a server to run, but this time we bring them entirely to the browser side. We have a runnable demo that you can try out on the website, and you are welcome to check out our GitHub repo for more details.
Jan 19, 2023 Our work SparseTIR will appear at ASPLOS 2023! Check out the paper, code, and documentation if you are interested :-)
Aug 29, 2022 Officially becoming a CMU student! I’m excited to continue focusing my research on machine learning compilation!
Jun 22, 2022 Graduated from SJTU today! So excited to see my undergraduate life come to an end 😆 🎓. Can't wait to join CMU and keep pushing the limits!

Research

My research interests lie at the intersection of computer systems and machine learning, especially systems for emerging machine learning workloads. It is exciting to build systems that keep up with, or even guide, the trends of machine learning. Currently my research focuses on dynamism in machine learning compilation: optimizing the compilation of machine learning models that involve sparsity, dynamic shapes, and dynamic control flow, in both inference and training settings. Check out my CV to see the projects :-)