Published On Sep 21, 2024
In this keynote, Janakiram explains how the explosion of interest in generative AI has brought to light the critical need for scalable, reliable infrastructure to support these advanced technologies. Many of the model-serving and inference endpoints for generative AI's foundational models, especially those behind commercial platforms such as OpenAI, depend on Kubernetes.
This session delves into the practical aspects of deploying open-source generative AI models and applications within a Kubernetes and cloud-native framework. A key focus will be on how enterprises can establish end-to-end pipelines to develop context-rich, Large Language Model (LLM)-based applications in-house.
Designed for those seeking to understand the intersection of modern AI and cloud-native technology, this hands-on session will share insights and best practices for running cutting-edge AI applications on Kubernetes.
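To make the idea of running open-source generative AI models on Kubernetes concrete, here is a minimal, illustrative manifest (not taken from the talk) that serves an open-source LLM via a vLLM container exposing an OpenAI-compatible API. The deployment name, image tag, and model are assumptions chosen for the sketch; a real setup would pin versions and size GPU resources for the model.

```yaml
# Illustrative sketch: serving an open-source LLM on Kubernetes with vLLM.
# Names, image tag, and model are hypothetical examples.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: llm-server
spec:
  replicas: 1
  selector:
    matchLabels:
      app: llm-server
  template:
    metadata:
      labels:
        app: llm-server
    spec:
      containers:
        - name: vllm
          image: vllm/vllm-openai:latest   # assumed image; pin a version in practice
          args: ["--model", "mistralai/Mistral-7B-Instruct-v0.2"]
          ports:
            - containerPort: 8000          # vLLM's OpenAI-compatible HTTP endpoint
          resources:
            limits:
              nvidia.com/gpu: 1            # requires GPU nodes and the NVIDIA device plugin
---
apiVersion: v1
kind: Service
metadata:
  name: llm-server
spec:
  selector:
    app: llm-server
  ports:
    - port: 80
      targetPort: 8000
```

With this in place, in-cluster applications can call the model at `http://llm-server/v1/chat/completions` using any OpenAI-compatible client, which is one way the end-to-end, in-house pipelines described above can be wired together.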
Join our community:
KCD Hyderabad:
Twitter: https://x.com/kcdhyderabad
LinkedIn: /kcdhyd
CNCF Hyderabad:
Twitter: https://x.com/cncfhyd
LinkedIn: /cncf-hyderabad
Community: https://community.cncf.io/hyderabad/