vLLM: Easy, Fast, and Cheap LLM Serving for Everyone
September 14 • 10:50 - 11:25
Location: Venue 4 - 338
vLLM is a fast and easy-to-use library for LLM inference and serving. In this talk, I will briefly trace the evolution of the vLLM project, introduce the open-source community behind it, and highlight features of interest to many users.
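To give a flavor of the "easy-to-use" claim, here is a minimal sketch of vLLM's offline batch inference API; the model name, prompts, and sampling settings are illustrative assumptions, not part of the talk:

```python
# A minimal sketch of offline batch inference with vLLM.
# The model name and sampling settings are illustrative choices.
from vllm import LLM, SamplingParams

prompts = ["Hello, my name is", "The capital of France is"]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

# Load a model and generate completions for all prompts in one batch.
llm = LLM(model="facebook/opt-125m")
outputs = llm.generate(prompts, sampling_params)

for output in outputs:
    print(output.prompt, "->", output.outputs[0].text)
```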