OpenLLM: Operating LLMs in Production

Tim Liu is Head of Product at BentoML.

Learn about OpenLLM, an open platform for operating LLMs in production, and see how it makes it easy to experiment with different models and configurations. The talk tackles the challenges of building production-ready LLM applications -- including cost, latency, compute, data privacy, scalability, evaluation, and the ecosystem of developer tools -- and shows how to run open-source LLMs with ease.

This lecture was originally delivered at “From Toy To Production: Building LLM-Powered Systems that Work in the Real World,” an event in New York City dedicated to scaling LLM-powered systems from experimental stages to real-world production environments.
