Hierarchical Abstract Tree for Cross-Document Retrieval-Augmented Generation
Abstract
Ψ-RAG addresses limitations in tree-based retrieval-augmented generation for cross-document multi-hop questions through a hierarchical abstract tree index and multi-granular retrieval agent.
Retrieval-augmented generation (RAG) enhances large language models with external knowledge, and tree-based RAG organizes documents into hierarchical indexes to support queries at multiple granularities. However, existing Tree-RAG methods designed for single-document retrieval face critical challenges in scaling to cross-document multi-hop questions: (1) poor distribution adaptability, where k-means clustering introduces noise due to rigid distribution assumptions; (2) structural isolation, as tree indexes lack explicit cross-document connections; and (3) coarse abstraction, which obscures fine-grained details. To address these limitations, we propose Ψ-RAG, a tree-RAG framework with two key components. First, a hierarchical abstract tree index built through an iterative "merging and collapse" process that adapts to data distributions without a priori assumptions. Second, a multi-granular retrieval agent that intelligently interacts with the knowledge base via reorganized queries and an agent-powered hybrid retriever. Ψ-RAG supports diverse tasks from token-level question answering to document-level summarization. On cross-document multi-hop QA benchmarks, it outperforms RAPTOR by 25.9% and HippoRAG 2 by 7.4% in average F1 score. Code is available at https://github.com/Newiz430/Psi-RAG.
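To make the "merging and collapse" idea concrete, the following is a minimal, hypothetical sketch of a distribution-adaptive tree build: instead of k-means with a fixed cluster count, it greedily merges the most similar pair of nodes and collapses each merged group under a summary parent, stopping when no pair clears a similarity threshold. All names (`Node`, `summarize`, `build_tree`, the threshold value) are illustrative assumptions, not the paper's actual algorithm; a real system would produce parent summaries with an LLM rather than by concatenation.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    text: str
    embedding: list[float]
    children: list["Node"] = field(default_factory=list)

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def summarize(nodes):
    # Placeholder abstraction step: concatenate texts and average embeddings.
    return Node(
        text=" | ".join(n.text for n in nodes),
        embedding=[sum(v) / len(nodes) for v in zip(*(n.embedding for n in nodes))],
        children=list(nodes),
    )

def build_tree(leaves, threshold=0.5):
    """Greedily merge the most similar pair until no pair clears the
    threshold, then collapse the survivors under a single root."""
    layer = list(leaves)
    while len(layer) > 1:
        best, pair = -1.0, None
        for i in range(len(layer)):
            for j in range(i + 1, len(layer)):
                s = cosine(layer[i].embedding, layer[j].embedding)
                if s > best:
                    best, pair = s, (i, j)
        if best < threshold:
            break  # distribution-adaptive stop: no fixed number of clusters
        i, j = pair
        merged = summarize([layer[i], layer[j]])
        layer = [n for k, n in enumerate(layer) if k not in (i, j)] + [merged]
    return layer[0] if len(layer) == 1 else summarize(layer)
```

The key contrast with k-means is that the tree's branching and depth emerge from pairwise similarities in the data rather than from a preset cluster count.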
Community
We introduce Ψ-RAG, an efficient and powerful hierarchical tree-based RAG framework designed to tackle complex information-seeking scenarios. It features a hierarchical abstract tree index with different abstraction strategies, enabling efficient and precise retrieval in logarithmic time. It employs a multi-granular agentic retriever, including a powerful reading & answering agent with a hybrid retrieval pipeline, to serve diverse user requests.
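The logarithmic-time claim can be illustrated with a hedged sketch of greedy top-down retrieval over a summary tree: at each level the query descends into the most similar child, visiting O(branching factor × depth) nodes rather than scanning every leaf. The node structure and scoring here are illustrative assumptions, not Ψ-RAG's actual retriever.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    text: str
    embedding: tuple[float, ...]
    children: list["Node"] = field(default_factory=list)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def retrieve(root, query_embedding):
    """Greedy root-to-leaf descent; returns the visited path, which reads
    as coarse-to-fine context (summaries first, the matching leaf last)."""
    node, path = root, [root]
    while node.children:
        node = max(node.children, key=lambda c: dot(c.embedding, query_embedding))
        path.append(node)
    return path
```

For a balanced tree over N leaves the path length is O(log N), which is where the logarithmic retrieval cost comes from; a production retriever would typically keep a beam of candidates per level instead of a single node.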