We just released MLC LLM, a universal solution that allows any language model to be deployed natively on a diverse set of hardware backends, as well as a productive framework for everyone to optimize model performance for their own use cases. It is the companion project of Web LLM, which brings large language models and stable diffusion models entirely into people's web browsers. Check out the Web LLM demo and try the MLC LLM applications (including an iOS app for your iPhone)!