
Quantized Cedille/fr-boris with 8-bit weights

This is a version of Cedille's fr-boris, a 6-billion-parameter GPT-J model, modified so you can generate text with and fine-tune it in Colab or on an equivalent desktop GPU (e.g. a single 1080 Ti). Inspired by GPT-J 8-bit.

Here's how to run it: Colab

This model can easily be loaded using the GPTJForCausalLM class:

```python
from transformers import GPTJForCausalLM

# Load the 8-bit quantized fr-boris checkpoint from the Hugging Face Hub
model = GPTJForCausalLM.from_pretrained("gustavecortal/fr-boris-8bit")
```
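As a minimal generation sketch once the model is loaded: the tokenizer repository (Cedille/fr-boris), the French prompt, and the sampling parameters below are illustrative assumptions, not part of this card.

```python
import torch
from transformers import AutoTokenizer, GPTJForCausalLM

# Assumption: the quantized weights share the vocabulary of the original Cedille/fr-boris model
tokenizer = AutoTokenizer.from_pretrained("Cedille/fr-boris")
model = GPTJForCausalLM.from_pretrained("gustavecortal/fr-boris-8bit")
model.eval()

# Illustrative prompt and sampling settings
prompt = "La poésie est"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=50,
        do_sample=True,
        top_p=0.9,
        temperature=0.8,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```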

fr-boris

Boris is a 6B parameter autoregressive language model based on the GPT-J architecture and trained using the mesh-transformer-jax codebase.

Boris was trained on around 78B tokens of French text from the C4 dataset.

Links
