norabelrose committed on
Commit 87bb983
1 Parent(s): cab7c56

Create README.md

Files changed (1)
  1. README.md +9 -0
---
license: mit
datasets:
- EleutherAI/pile
language:
- en
---

These sparse autoencoders (SAEs) were trained on the outputs of each of the MLPs in [EleutherAI/pythia-160m](https://huggingface.co/EleutherAI/pythia-160m), using 8.2 billion tokens from the Pile training set at a context length of 2049. Each SAE has 32,768 latents.
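To make the shapes concrete, here is a minimal sketch of an SAE forward pass with the dimensions stated above (768-dimensional pythia-160m MLP outputs, 32,768 latents). The ReLU encoder/decoder parameterization below is a common choice for SAEs and is an illustrative assumption, not necessarily the exact architecture of these checkpoints.

```python
import numpy as np

d_model = 768        # width of a pythia-160m MLP output
n_latents = 32_768   # latents per SAE, as stated in this card

# Random weights stand in for a trained checkpoint (illustration only).
rng = np.random.default_rng(0)
W_enc = rng.standard_normal((d_model, n_latents), dtype=np.float32)
W_enc *= 0.01
b_enc = np.zeros(n_latents, dtype=np.float32)
W_dec = rng.standard_normal((n_latents, d_model), dtype=np.float32)
W_dec *= 0.01
b_dec = np.zeros(d_model, dtype=np.float32)

def sae_forward(x):
    """Encode an MLP activation into sparse latents, then reconstruct it."""
    latents = np.maximum(x @ W_enc + b_enc, 0.0)  # ReLU keeps latents sparse/non-negative
    recon = latents @ W_dec + b_dec               # linear decode back to d_model
    return latents, recon

x = rng.standard_normal(d_model).astype(np.float32)  # one fake MLP activation
latents, recon = sae_forward(x)
print(latents.shape, recon.shape)  # (32768,) (768,)
```

In practice you would load the trained encoder/decoder weights from this repository instead of random matrices, and training would add a sparsity penalty (or a top-k constraint) on `latents` alongside the reconstruction loss.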