---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- web
- common crawl
size_categories:
- 10B<n<100B
---

[ArXiv](https://arxiv.org/abs/2409.17115) | [Models](https://huggingface.co/collections/gair-prox/prox-general-models-65f1674f0607712c4d6eec76) | [Code](https://github.com/GAIR-NLP/ProX)

RedPajama-pro is refined from [RedPajama-Data-V2](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-V2) using the **ProX** refining framework. It contains about 30B high-quality tokens, ready for general language model pre-training.

## License

RedPajama-pro is based on RedPajama-Data-V2, which is made available under the Apache 2.0 license; users should also abide by the Common Crawl Terms of Use: https://commoncrawl.org/terms-of-use/. We do not alter the license of any of the underlying data.

### Citation

```
@article{zhou2024programming,
  title={Programming Every Example: Lifting Pre-training Data Quality like Experts at Scale},
  author={Zhou, Fan and Wang, Zengzhi and Liu, Qian and Li, Junlong and Liu, Pengfei},
  journal={arXiv preprint arXiv:2409.17115},
  year={2024}
}
```
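### Usage

A minimal sketch for inspecting the data with the `datasets` library, using streaming so the ~30B-token corpus is not downloaded in full. The repo ID `gair-prox/RedPajama-pro` and the `text` field name are assumptions based on this card and on common web-corpus conventions; adjust them to match the actual dataset layout.

```python
from datasets import load_dataset

# Stream the dataset instead of downloading it entirely
# (repo ID "gair-prox/RedPajama-pro" is assumed from this card).
ds = load_dataset("gair-prox/RedPajama-pro", split="train", streaming=True)

# Peek at the first example; a "text" field is assumed,
# as is standard for web-text pre-training corpora.
for example in ds:
    print(example["text"][:200])
    break
```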