---
license: odc-by
size_categories:
- 1B<n<10B
configs:
- config_name: default
  data_files:
  - split: train
    path: data/*
task_categories:
- text-generation
language:
- en
pretty_name: open-web-math-pro
tags:
- math
- reasoning
---

# 📚 Open-Web-Math-Pro

<p align="center">
  <img src="prox-teaser.png">
</p>

[ArXiv](https://arxiv.org/abs/2409.17115) | [Models](https://huggingface.co/collections/gair-prox/prox-math-models-66e92c3e5d54b27612286eb9) | [Code](https://github.com/GAIR-NLP/ProX)

Open-Web-Math-Pro is refined from [open-web-math](https://huggingface.co/datasets/open-web-math/open-web-math) using the **ProX** refining framework.
It contains about 5B high-quality, math-related tokens, ready for pre-training.
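
To get started, the corpus can be loaded with the Hugging Face `datasets` library. Below is a minimal sketch, assuming the dataset is hosted under the repository id `gair-prox/open-web-math-pro` and that each record carries a `text` field, as in the upstream open-web-math schema:

```python
from datasets import load_dataset

# Stream the train split to avoid downloading the full ~5B-token corpus up front.
ds = load_dataset("gair-prox/open-web-math-pro", split="train", streaming=True)

# Peek at the first document. The `text` field name is an assumption
# inherited from the upstream open-web-math schema.
example = next(iter(ds))
print(example["text"][:500])
```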


## License
Open-Web-Math-Pro is based on open-web-math, which is made available under the ODC-By 1.0 license. Users should also abide by the CommonCrawl Terms of Use: https://commoncrawl.org/terms-of-use/. We do not alter the license of any of the underlying data.


## Citation
```
@article{zhou2024programming,
  title={Programming Every Example: Lifting Pre-training Data Quality like Experts at Scale},
  author={Zhou, Fan and Wang, Zengzhi and Liu, Qian and Li, Junlong and Liu, Pengfei},
  journal={arXiv preprint arXiv:2409.17115},
  year={2024}
}
```