---
license: mit
datasets:
  - EleutherAI/pile
language:
  - en
---

These sparse autoencoders (SAEs) were trained on the output of each MLP layer in EleutherAI/pythia-160m. We used 8.2 billion tokens from the Pile training set at a context length of 2049. Each SAE has 32,768 latents.
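
Below is a minimal sketch of how MLP outputs like the ones these SAEs were trained on can be captured from pythia-160m, and how a standard SAE encoder would map them into the 32,768-dimensional latent space. The layer index, hook placement, and the encoder weights are illustrative assumptions: the placeholder `W_enc`/`b_enc` here are random, and the actual parameter names and loading convention for this repo may differ (EleutherAI's SAE tooling may provide a proper loader).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the base model whose MLP outputs the SAEs were trained on.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-160m")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/pythia-160m")
model.eval()

# Capture the output of one MLP with a forward hook. Layer 6 is chosen
# arbitrarily; Pythia-160m exposes its MLPs at model.gpt_neox.layers[i].mlp.
captured = {}

def hook(module, inputs, output):
    captured["mlp_out"] = output.detach()

handle = model.gpt_neox.layers[6].mlp.register_forward_hook(hook)

tokens = tokenizer("The quick brown fox", return_tensors="pt")
with torch.no_grad():
    model(**tokens)
handle.remove()

mlp_out = captured["mlp_out"]  # shape: (batch, seq_len, d_model)

# Hypothetical SAE encode step: a standard SAE maps activations into a
# 32,768-dim sparse latent space via an encoder matrix, bias, and ReLU.
# These weights are random placeholders, not the trained parameters.
d_model, n_latents = mlp_out.shape[-1], 32_768
W_enc = torch.randn(d_model, n_latents) / d_model**0.5
b_enc = torch.zeros(n_latents)
latents = torch.relu(mlp_out @ W_enc + b_enc)
print(latents.shape)  # (batch, seq_len, 32768)
```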