Update README.md
README.md
pipeline_tag: feature-extraction
---
DRAGON+ is a BERT-base sized dense retriever initialized from [RetroMAE](https://huggingface.co/Shitao/RetroMAE) and further trained on data augmented from the MS MARCO corpus, following the approach described in [How to Train Your DRAGON: Diverse Augmentation Towards Generalizable Dense Retrieval](\url).

The associated GitHub repository is available at https://github.com/facebookresearch/dpr-scale/tree/dragon. We use an asymmetric dual encoder, with two distinctly parameterized encoders.

| Model | Initialization | Query Encoder Path | Context Encoder Path |
|---|---|---|---|
| DRAGON+ | Shitao/RetroMAE | facebook/dragon-plus-query-encoder | facebook/dragon-plus-context-encoder |

## Usage (HuggingFace Transformers)
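Below is a minimal sketch of how the two encoders can be loaded and used to score passages against a query with HuggingFace Transformers. The [CLS]-token pooling and dot-product scoring shown here are assumptions about the intended usage, and the query and passages are purely illustrative.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# The asymmetric dual encoder: one model for queries, one for contexts/passages.
tokenizer = AutoTokenizer.from_pretrained("facebook/dragon-plus-query-encoder")
query_encoder = AutoModel.from_pretrained("facebook/dragon-plus-query-encoder")
context_encoder = AutoModel.from_pretrained("facebook/dragon-plus-context-encoder")

# Illustrative query and candidate passages (not from the original card).
query = "where was marie curie born"
contexts = [
    "Maria Sklodowska, later known as Marie Curie, was born on November 7, 1867.",
    "Pierre Curie was a French physicist and a pioneer in crystallography.",
]

query_input = tokenizer(query, return_tensors="pt")
ctx_input = tokenizer(contexts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    # Assumption: the [CLS] token's last hidden state is used as the embedding.
    query_emb = query_encoder(**query_input).last_hidden_state[:, 0, :]
    ctx_emb = context_encoder(**ctx_input).last_hidden_state[:, 0, :]

# Assumption: relevance is scored by the dot product between query and passage embeddings.
scores = query_emb @ ctx_emb.T
print(scores)  # higher score = more relevant passage
```

Because the query and context encoders are separately parameterized, passage embeddings can be precomputed and indexed offline, with only the query encoder run at search time.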