---
language: en
library_name: mlsae
license: mit
tags:
- model_hub_mixin
- pytorch_model_hub_mixin
datasets:
- monology/pile-uncopyrighted
---
A Multi-Layer Sparse Autoencoder (MLSAE) trained on the residual stream activations of EleutherAI/pythia-70m-deduped over the monology/pile-uncopyrighted dataset, with an expansion factor of 1 and $k = 16$. For more details, see the paper and the Weights & Biases project.
This model has been pushed to the Hub using the PytorchModelHubMixin integration.