Update README.md
README.md CHANGED
@@ -24,7 +24,8 @@ It has state-of-the-art performance among multimodal models with a similar size
 You can find all models in the Molmo family [here](https://huggingface.co/collections/allenai/molmo-66f379e6fe3b8ef090a8ca19).
 **Learn more** about the Molmo family [in our announcement blog post](https://molmo.allenai.org/blog) or the [paper](https://huggingface.co/papers/2409.17146).
 
-Molmo 7B-O is based on [OLMo-7B-
+Molmo 7B-O is based on [OLMo-7B-1024](https://huggingface.co/allenai/OLMo-7B-1024-preview) (a **preview** of the next generation of OLMo models)
+and uses [OpenAI CLIP](https://huggingface.co/openai/clip-vit-large-patch14-336) as its vision backbone.
 It performs comfortably between GPT-4V and GPT-4o on both academic benchmarks and human evaluation.
 
 This checkpoint is a **preview** of the Molmo release. All artifacts used in creating Molmo (PixMo dataset, training code, evaluations, intermediate checkpoints) will be made available at a later date, furthering our commitment to open-source AI development and reproducibility.
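
For readers of the updated model card, here is a minimal usage sketch for the checkpoint described above. It assumes the standard `transformers` remote-code loading path; the repository id `allenai/Molmo-7B-O-0924` is an assumption (it is not stated in this diff), so check the Molmo collection linked above for the exact identifier.

```python
# Minimal sketch: load the Molmo 7B-O preview checkpoint with transformers.
# The repo id below is an assumption, not taken from this diff.
from transformers import AutoModelForCausalLM, AutoProcessor

repo_id = "allenai/Molmo-7B-O-0924"

# trust_remote_code=True is needed because Molmo ships custom modeling code.
processor = AutoProcessor.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    trust_remote_code=True,
    torch_dtype="auto",   # pick the checkpoint's native dtype
    device_map="auto",    # requires accelerate; place weights automatically
)
```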