|
---
inference: false
---
|
|
|
<br>

<br>
|
|
|
# LWM-Text-1M-Jax Model Card |
|
|
|
## Model details |
|
|
|
**Model type:** |
|
LWM-Text-1M-Jax is an open-source model trained from LLaMA-2 on a filtered subset of the Books3 dataset. It is an auto-regressive language model based on the transformer architecture.
|
|
|
The model is distributed as a Jax checkpoint. Inference code and instructions can be found at: https://github.com/LargeWorldModel/lwm
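For orientation, below is a minimal sketch of greedy auto-regressive decoding in JAX. The names `apply_fn`, `greedy_decode`, and the toy parameters are hypothetical stand-ins for illustration only; they are not the repo's actual loading or model code, so refer to the linked repository for real instructions.

```python
import jax
import jax.numpy as jnp

VOCAB_SIZE = 32000  # LLaMA-2 vocabulary size
DIM = 64            # toy embedding width for the stand-in model

def apply_fn(params, tokens):
    # Stand-in for the transformer forward pass: returns logits of
    # shape (seq_len, VOCAB_SIZE). A real model would run the full
    # LWM transformer with the loaded checkpoint parameters.
    embed = params["embed"][tokens]   # (seq_len, DIM)
    return embed @ params["unembed"]  # (seq_len, VOCAB_SIZE)

def greedy_decode(params, prompt, max_new_tokens=8):
    # Append the argmax token one step at a time (greedy decoding).
    tokens = jnp.asarray(prompt, dtype=jnp.int32)
    for _ in range(max_new_tokens):
        logits = apply_fn(params, tokens)
        next_token = jnp.argmax(logits[-1]).astype(jnp.int32)
        tokens = jnp.concatenate([tokens, next_token[None]])
    return tokens

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = {
    "embed": jax.random.normal(k1, (VOCAB_SIZE, DIM)),
    "unembed": jax.random.normal(k2, (DIM, VOCAB_SIZE)),
}
print(greedy_decode(params, [1, 2, 3]))
```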
|
|
|
**Model date:** |
|
LWM-Text-1M-Jax was trained in December 2023. |
|
|
|
**Paper or resources for more information:** |
|
https://largeworldmodel.github.io/ |
|
|
|
## License |
|
Llama 2 is licensed under the LLAMA 2 Community License, |
|
Copyright (c) Meta Platforms, Inc. All Rights Reserved. |
|
|
|
**Where to send questions or comments about the model:** |
|
https://github.com/LargeWorldModel/lwm/issues |
|
|
|
## Training dataset |
|
- A subset of 800 Books3 documents, each with 1M+ tokens (a sketch of the length filter follows below)
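
Such a length filter could be reproduced roughly as sketched below. The tokenizer choice (the Hugging Face LLaMA-2 tokenizer) and the helper name are assumptions; the LWM repo's actual preprocessing pipeline may differ.

```python
from transformers import AutoTokenizer

# Assumption: LLaMA-2 tokenizer; the repo's preprocessing may differ.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

def has_min_tokens(text: str, min_tokens: int = 1_000_000) -> bool:
    # Tokenize without truncation and compare the sequence length.
    ids = tokenizer(text, truncation=False)["input_ids"]
    return len(ids) >= min_tokens

corpus = ["..."]  # placeholder: iterable of raw Books3 documents
long_docs = [doc for doc in corpus if has_min_tokens(doc)]
```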