This is a low-rank adapter for OpenCALM-3B trained on a 134K-example Japanese dataset.
It does not include the foundation model itself, so the adapter can be distributed under the Apache 2.0 license.
You can try it here.
colab notebook
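
If you want to load the adapter locally instead of using the notebook, the following is a minimal sketch using transformers and peft. The base-model ID, the adapter repository name, and the example prompt are assumptions for illustration, not values taken from this card.

```python
# Hypothetical sketch of running inference with this adapter on top of
# OpenCALM-3B. Repository IDs below are assumptions, not from this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_id = "cyberagent/open-calm-3b"        # assumed Hugging Face ID of OpenCALM-3B
adapter_id = "your-username/your-lora-adapter"   # placeholder for this adapter's repo

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
model = AutoModelForCausalLM.from_pretrained(
    base_model_id, torch_dtype=torch.float16, device_map="auto"
)
# Attach the low-rank adapter weights on top of the frozen base model.
model = PeftModel.from_pretrained(model, adapter_id)

prompt = "日本で一番高い山は？"  # "What is the highest mountain in Japan?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```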
The QLoRA fine-tuning code is here.
colab notebook
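
For reference, a QLoRA-style setup generally looks like the sketch below: the base model is loaded in 4-bit NF4 via bitsandbytes and trainable low-rank adapters are attached via peft. The hyperparameters, target modules, and model ID shown are illustrative assumptions; see the notebook for the exact configuration used for this adapter.

```python
# Minimal QLoRA-style setup sketch (transformers + peft + bitsandbytes).
# Hyperparameters and the model ID are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base_model_id = "cyberagent/open-calm-3b"  # assumed base model ID

# Load the base model in 4-bit NF4, as QLoRA does.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    base_model_id, quantization_config=bnb_config, device_map="auto"
)
model = prepare_model_for_kbit_training(model)

# Attach trainable low-rank adapters; the target module name assumes the
# GPT-NeoX-style attention layers used by OpenCALM.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["query_key_value"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```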