---
license: apache-2.0
datasets:
- pankajmathur/orca_mini_v1_dataset
- pankajmathur/WizardLM_Orca
- pankajmathur/dolly-v2_orca
- pankajmathur/alpaca_orca
language:
- en
library_name: transformers
---
# Mistral-7B-model_45k6e2e4

This model is trained on top of Mistral-7B-v0.1 using the Orca-style datasets listed above.
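A minimal usage sketch with the `transformers` library named in the metadata. The Orca-style `### System / ### User / ### Assistant` prompt template below is an assumption based on the author's earlier orca_mini models, not confirmed by this card; adjust it to the model's actual template if it differs.

```python
MODEL_ID = "pankajmathur/Mistral-7B-model_45k6e2e4"


def build_prompt(system: str, user: str) -> str:
    """Assemble an Orca-style prompt (assumed template, see note above)."""
    return f"### System:\n{system}\n\n### User:\n{user}\n\n### Assistant:\n"


def generate(user_message: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate a reply (downloads ~14 GB of weights)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    prompt = build_prompt("You are a helpful assistant.", user_message)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

For example, `generate("Explain photosynthesis in one sentence.")` returns the full decoded text, with the model's reply following the final `### Assistant:` marker.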
## License Disclaimer
This model is released under Apache 2.0, and comes with no warranty or guarantees of any kind.
## Limitations & Biases
While this model aims for accuracy, it can occasionally produce inaccurate or misleading results.
Despite diligent efforts to refine the training data, it may still generate inappropriate, biased, or offensive content.
Exercise caution and cross-check information when necessary. This is an uncensored model.
## Citation

Please cite this model using the following BibTeX:
```bibtex
@misc{Mistral-7B-model_45k6e2e4,
  author = {Pankaj Mathur},
  title = {Mistral-7B-model_45k6e2e4: An Orca style Mistral-7B-v0.1 model},
  year = {2023},
  publisher = {HuggingFace},
  journal = {HuggingFace repository},
  howpublished = {\url{https://huggingface.co/pankajmathur/Mistral-7B-model_45k6e2e4}},
}
```