Preserve privacy using local RAG with Tonic.ai + LlamaIndex
Chiara Colombi
Jerry Liu
Adam Kamor, PhD
Retrieval Augmented Generation (RAG) has taken the generative AI world by storm, but sharing private data with third-party services is a no-go for many companies. Local deployments can solve the data privacy concern, and several approaches have recently emerged to leverage the power of generative AI while keeping your data squarely in your hands.
Join us for a webinar in which we’ll guide you through building an Ollama-based local RAG system using LlamaIndex and Tonic Validate. Adam Kamor, Co-founder and Head of Engineering at Tonic.ai, and Jerry Liu, Co-founder and CEO of LlamaIndex, will take us through the privacy implications of generative AI tools and how to overcome these challenges using local deployments.
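To give you a taste of what we'll build, here's a minimal sketch of a fully local RAG pipeline using LlamaIndex with Ollama-served models. It assumes you already have Ollama running locally with the llama3 and nomic-embed-text models pulled, and a ./data folder of your own documents; the model names and paths are illustrative, not the exact configuration we'll use in the session.

```python
# pip install llama-index llama-index-llms-ollama llama-index-embeddings-ollama
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.ollama import OllamaEmbedding

# Point LlamaIndex at models served by your local Ollama instance,
# so neither your documents nor your queries leave your machine.
Settings.llm = Ollama(model="llama3", request_timeout=120.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# Load private documents from a local folder and build an in-memory index.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Query your own data, answered entirely by local models.
query_engine = index.as_query_engine()
print(query_engine.query("What does our internal security policy say about PII?"))
```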
Once your private data is integrated with your RAG system in your own cloud using LlamaIndex, Tonic Validate is there for you: deployed locally, it benchmarks your RAG system’s performance without ever sending your data to a third-party LLM.
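As a rough illustration of that benchmarking step, the sketch below scores a RAG system against a small question-and-answer benchmark with the tonic-validate Python package. The get_rag_response function, the sample question, and the reference answer are placeholders for your own pipeline, and the scorer is shown with its defaults; configuring evaluation so it runs entirely in your environment is exactly the kind of detail the webinar will walk through.

```python
# pip install tonic-validate
from tonic_validate import Benchmark, ValidateScorer

# A tiny benchmark: questions paired with known-good reference answers.
benchmark = Benchmark(
    questions=["What does our vacation policy say about carryover days?"],
    answers=["Up to five unused vacation days may be carried into the next year."],
)

# Placeholder: call your local LlamaIndex/Ollama query engine here and
# return the generated answer plus the retrieved context passages.
def get_rag_response(question):
    return {
        "llm_answer": "Up to five unused days carry over each year.",
        "llm_context_list": ["Employees may carry over up to five unused vacation days."],
    }

# Score the RAG responses against the benchmark and print the overall metrics.
scorer = ValidateScorer()
run = scorer.score(benchmark, get_rag_response)
print(run.overall_scores)
```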
We’ll cover:
- Privacy implications of RAG systems and third-party LLMs
- Configuring a local instance of Ollama
- Integrating your private data locally using LlamaIndex
- Harnessing continuous performance monitoring locally through Tonic Validate
- ...and more!
See you there!