Dense Passage Retrieval (DPR)
- Karpukhin+'20 - Dense Passage Retrieval for Open-Domain Question Answering (EMNLP) [ACL Anthology][arXiv][GitHub][huggingface]
Open-domain question answering relies on efficient passage retrieval to select candidate contexts, where traditional sparse vector space models, such as TF-IDF or BM25, are the de facto method. In this work, we show that retrieval can be practically implemented using dense representations alone, where embeddings are learned from a small number of questions and passages by a simple dual-encoder framework. When evaluated on a wide range of open-domain QA datasets, our dense retriever outperforms a strong Lucene-BM25 system greatly by 9%-19% absolute in terms of top-20 passage retrieval accuracy, and helps our end-to-end QA system establish new state-of-the-art on multiple open-domain QA benchmarks.
1. What is it?
The de facto standard model for open-domain question answering. It answers questions with a retriever-reader system consisting of (1) a retrieval module (retriever) that finds passages relevant to the question and (2) an answer module (reader) that extracts the answer span from the set of retrieved passages. The retriever is a dual-encoder architecture with separate question and passage encoders; it is trained with a contrastive objective that raises the inner product between a question vector and its relevant passage vectors (and lowers it for irrelevant ones). Because relevance matching happens outside the encoders, inference can run at high speed with faiss. Unlike surface-level matching such as TF-IDF or BM25, dense vector representations enable semantic matching.
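The contrastive objective above can be sketched in a few lines; this is a minimal numpy illustration (function name and shapes are mine, not the authors' code) using DPR's in-batch negative setup, where each question's gold passage is the positive and the other passages in the batch serve as negatives:

```python
import numpy as np

def in_batch_contrastive_loss(q_vecs: np.ndarray, p_vecs: np.ndarray) -> float:
    """Mean negative log-likelihood of each question's gold passage.

    q_vecs: (B, d) question embeddings; p_vecs: (B, d) passage embeddings.
    Row i of p_vecs is the positive passage for question i; the other
    B-1 rows act as in-batch negatives.
    """
    scores = q_vecs @ p_vecs.T                    # (B, B) inner products
    scores = scores - scores.max(axis=1, keepdims=True)  # stabilize softmax
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))    # gold passage on the diagonal
```

Training pushes the diagonal (question, gold passage) scores above the off-diagonal ones, which is exactly the "raise the inner product with relevant passages" behavior described above.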
2. What makes it better than prior work?
- Compared with surface-matching retrieval methods such as TF-IDF or BM25, it performs semantic search with BERT-based contextualized embeddings, so it can be expected to handle spelling variants and paraphrases more robustly.
- Unlike a cross-encoder, using two separate BERT encoders lets passage embeddings be indexed offline, enabling fast inference.
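Because relevance reduces to an inner product, searching an offline-built index is just one matrix product plus a top-k selection; this is a minimal numpy sketch of the brute-force computation that a flat faiss inner-product index performs (function name is illustrative):

```python
import numpy as np

def retrieve_top_k(query_vec: np.ndarray, passage_vecs: np.ndarray, k: int = 5):
    """Exhaustive maximum-inner-product search over precomputed passage vectors.

    query_vec: (d,) question embedding; passage_vecs: (N, d) passage matrix
    encoded once, offline. Returns the indices of the k highest-scoring
    passages in descending order of inner-product score (assumes k < N).
    """
    scores = passage_vecs @ query_vec              # (N,) inner products
    top_k = np.argpartition(-scores, k - 1)[:k]    # unordered top-k indices
    return top_k[np.argsort(-scores[top_k])]       # sort by descending score
```

In practice faiss replaces this O(N·d) scan with approximate nearest-neighbor structures, but the scoring function is the same inner product.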
3. What is the key technique?
4. How was it evaluated?
5. Any discussion points?
Negative sampling
- Negatives are drawn from within the mini-batch → the large gap from the number of candidates at inference time hurts retrieval performance
- Only query-passage relations are modeled → similarity relations among passages themselves are not considered
- Hard negatives are mined by lexical matching → false-negative passages may end up used as negatives
Memory efficiency of the search index
- The number of indexed passages is huge → memory cost is large
Stronger query-passage matching
6. What to read next?
Negative sampling
- Xiong+'20 - Approximate Nearest Neighbor Negative Contrastive Learning for Dense Text Retrieval [arXiv][GitHub]
- Qu+'21 - RocketQA: An Optimized Training Approach to Dense Passage Retrieval for Open-Domain Question Answering (NAACL) [ACL Anthology][arXiv][GitHub]
- Ren+'21 - PAIR: Leveraging Passage-Centric Similarity Relation for Improving Dense Passage Retrieval (ACL) [ACL Anthology][arXiv][GitHub]
- Zhan+'21 - Optimizing Dense Retrieval Model Training with Hard Negatives (SIGIR) [arXiv][GitHub]
- Lu+'21 - Multi-stage Training with Improved Negative Contrast for Neural Passage Retrieval (EMNLP) [ACL Anthology]
- Wu+'22 - Sentence-aware Contrastive Learning for Open-Domain Passage Retrieval (ACL) - catshun’s blog
Memory-efficient vectors
- Izacard+'20 - A Memory Efficient Baseline for Open Domain Question Answering [arXiv]
- Yamada+'21 - Efficient Passage Retrieval with Hashing for Open-domain Question Answering (ACL/IJCNLP) [ACL Anthology][GitHub]
- Santhanam+'21 - ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction [arXiv]
- Ma+'21 - Simple and Effective Unsupervised Redundancy Elimination to Compress Dense Vectors for Passage Retrieval (EMNLP) [ACL Anthology][GitHub]
- Zhan+'22 - Learning Discrete Representations via Constrained Clustering for Effective and Efficient Dense Retrieval (WSDM) [arXiv]
Stronger query-passage matching
- Liu+'21 - Dense Hierarchical Retrieval for Open-domain Question Answering (EMNLP) [ACL Anthology][arXiv][GitHub]
- Khattab+'21 - Relevance-guided Supervision for OpenQA with ColBERT (TACL) [arXiv][GitHub]
- Gao+'21 - Condenser: a Pre-training Architecture for Dense Retrieval (EMNLP) [ACL Anthology][GitHub]
- Wang+'21 - Enhancing Dual-Encoders with Question and Answer Cross-Embeddings for Answer Retrieval (EMNLP)
- Wu+'22 - Sentence-aware Contrastive Learning for Open-Domain Passage Retrieval (ACL) - catshun’s blog
Improving generalization
- Sciavolino+'21 - Simple Entity-Centric Questions Challenge Dense Retrievers (EMNLP) [arXiv]
- Zhuang+'21 - Dealing with Typos for BERT-based Passage Retrieval and Ranking (EMNLP) [ACL Anthology][arXiv][GitHub]
- Ni+'21 - Large Dual Encoders Are Generalizable Retrievers [arXiv]
- Liu+'21 - Improving Embedding-based Large-scale Retrieval via Label Enhancement (EMNLP) [ACL Anthology]
- Chen+'21 - Salient Phrase Aware Dense Retrieval: Can a Dense Retriever Imitate a Sparse One? [arXiv]
- Wang+'21 - GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval [arXiv][GitHub]
Data and query augmentation
- Lee+'19 - Latent Retrieval for Weakly Supervised Open Domain Question Answering (ACL) [ACL Anthology][arXiv]
- Guu+'20 - REALM: Retrieval-Augmented Language Model Pre-Training (ICML) [Google AI Blog][arXiv][GitHub]
- Qu+'21 - RocketQA: An Optimized Training Approach to Dense Passage Retrieval for Open-Domain Question Answering (NAACL) [ACL Anthology][arXiv][GitHub]
- Izacard+'21 - Towards Unsupervised Dense Information Retrieval with Contrastive Learning [arXiv][GitHub]
- Mao+'21 - Generation-Augmented Retrieval for Open-Domain Question Answering (ACL/IJCNLP) [ACL Anthology][GitHub]
Architectural improvements
- Izacard+'20 - Leveraging Passage Retrieval with Generative Models for Open Domain Question Answering (EACL) [ACL Anthology][arXiv][GitHub]
- Cheng+'21 - UnitedQA: A Hybrid Approach for Open Domain Question Answering (ACL/IJCNLP) [ACL Anthology][arXiv]
- Lee+'21 - You Only Need One Model for Open-domain Question Answering [arXiv]
- Tay+'22 - Transformer Memory as a Differentiable Search Index [arXiv]