
Anthropic makes RAG smarter with Contextual Retrieval, achieving up to a 67% reduction in retrieval failures.

Oct 4, 2024
RESEARCH

Anthropic has introduced Contextual Retrieval, a new method that helps AI systems recall crucial details that traditional retrieval pipelines often lose. By adding context back into each chunk before it is indexed, the approach cuts missed information by nearly half. Think of Contextual Retrieval as giving your AI model a better set of sticky notes for keeping track of important facts.

How it works:

  • Contextual Retrieval enhances RAG: It builds upon Retrieval-Augmented Generation (RAG) by adding chunk-specific context before embedding information.
  • Two key techniques: Uses Contextual Embeddings and Contextual BM25 to improve search accuracy and relevance.
  • Significant reduction in retrieval failures: Cuts down failed information retrievals by 49%, and up to 67% when combined with reranking methods.
  • Automated context generation: Leverages AI models like Claude to automatically generate brief context summaries for each data chunk.
  • Easily deployable: You can implement Contextual Retrieval in your own projects with available cookbooks and minimal hassle.
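The preprocessing step behind the bullets above can be sketched in a few lines: generate a short situating context for each chunk and prepend it before embedding or BM25 indexing. This is a minimal illustration, not Anthropic's implementation; `situate_chunk` is a hypothetical stand-in for the call to a model like Claude that would write the real context summary.

```python
def situate_chunk(document_title: str, chunk: str) -> str:
    """Hypothetical stand-in for an LLM call (e.g. Claude) that returns a
    brief sentence situating `chunk` within the overall document."""
    return f"This chunk is from the document '{document_title}'."


def contextualize(document_title: str, chunks: list[str]) -> list[str]:
    # Prepend the generated context to each chunk, so that both the
    # embedding model and the BM25 index see the situating information.
    return [f"{situate_chunk(document_title, c)}\n\n{c}" for c in chunks]


chunks = [
    "Revenue grew 3% over the previous quarter.",
    "Operating costs fell due to supplier renegotiation.",
]
for contextualized in contextualize("ACME Q2 2023 Financial Report", chunks):
    print(contextualized)
    print("---")
```

In a real deployment the contextualized chunks would then be embedded and indexed as usual; the only change from standard RAG is this one preprocessing pass over the corpus.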

Why it matters: Accuracy is essential for enterprise adoption and for building trust in AI. In use cases like customer support, legal review, and medical assistance, retrieving the right information is critical. Contextual Retrieval helps ensure AI models don't miss the mark by forgetting important details, leading to more accurate responses and better performance in downstream tasks. It's a significant step forward from traditional RAG pipelines, which often dropped the ball on context.