
Entropy-Based Dynamic Hybrid Retrieval for Adaptive Query Weighting in RAG Pipelines

December 1, 2025

Accepted to VecDB @ ICML 2025

Authors: John Richard Perez, James Zhou, Radley Le, Alexander Menchtchikov, Ryan Lagasse

We introduce Dynamic Alpha Tuning (DAT), a method that dynamically adjusts the weighting coefficient between sparse and dense retrievers based on model confidence. DAT employs meta-learned schemes to adaptively skew contributions between the two retrieval methods. Experimental results on the HotPotQA and TriviaQA benchmarks show that this yields coverage and answer-diversity advantages over static hybrid approaches. We also evaluate Bayesian methods, such as variational retrieval confidence and Monte Carlo dropout, as alternatives for estimating principled query-specific weighting.
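To make the core idea concrete, here is a minimal sketch of entropy-based query weighting: the entropy of the dense retriever's score distribution serves as a confidence signal, and a confident (low-entropy) dense retriever earns a higher mixing weight alpha. The function names and the specific entropy-to-alpha mapping below are illustrative assumptions, not the paper's meta-learned formulation.

```python
import math

def softmax(scores):
    # Normalize raw retriever scores into a probability distribution.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def entropy(probs):
    # Shannon entropy in nats; high entropy means low retriever confidence.
    return -sum(p * math.log(p) for p in probs if p > 0)

def dynamic_alpha(dense_scores):
    # Map normalized entropy of the dense score distribution to a weight
    # alpha in [0, 1]: a peaked (confident) distribution gives alpha near 1,
    # a near-uniform (uncertain) one gives alpha near 0.
    probs = softmax(dense_scores)
    h_max = math.log(len(probs))  # entropy of the uniform distribution
    return 1.0 - entropy(probs) / h_max if h_max > 0 else 0.5

def hybrid_scores(dense_scores, sparse_scores):
    # Convex combination of normalized dense and sparse scores per document,
    # with alpha chosen per query from dense-retriever confidence.
    alpha = dynamic_alpha(dense_scores)
    return [alpha * d + (1 - alpha) * s
            for d, s in zip(softmax(dense_scores), softmax(sparse_scores))]
```

For example, a query whose dense scores are sharply peaked on one document would lean almost entirely on the dense retriever, while a flat dense score profile would defer to the sparse (e.g. BM25) side.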

Begin Your Journey

The application takes 10 minutes and is reviewed on a rolling basis. We look for strong technical signal (projects, coursework, or competition results) and a genuine curiosity to do real research.

If admitted, you will join a structured pipeline with direct mentorship to take your work from ideation to top conference submission at venues like NeurIPS, ACL, and EMNLP.