---
language: en
library_name: mlsae
license: mit
tags:
- model_hub_mixin
- pytorch_model_hub_mixin
datasets:
- monology/pile-uncopyrighted
---

A Multi-Layer Sparse Autoencoder (MLSAE) trained on [EleutherAI/pythia-70m-deduped](https://huggingface.co/EleutherAI/pythia-70m-deduped) and [monology/pile-uncopyrighted](https://huggingface.co/datasets/monology/pile-uncopyrighted), with an expansion factor of 1 and $k = 16$.

For more details, see the [paper](https://arxiv.org/submit/5837813) and the [Weights & Biases project](https://wandb.ai/timlawson-/mlsae).

This model has been pushed to the Hub using the [PyTorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration:

- Library:
- Docs:
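To illustrate the hyperparameters above, here is a minimal sketch of a top-k sparse autoencoder in PyTorch. The class and variable names are hypothetical (not the `mlsae` library's actual API); it only shows how an expansion factor of 1 and $k = 16$ shape the latent space: the hidden width equals the input width, and at most 16 latents are active per input.

```python
import torch
from torch import nn


class TopKSAE(nn.Module):
    """Hypothetical sketch of a k-sparse autoencoder (names assumed, not the mlsae API)."""

    def __init__(self, d_model: int, expansion_factor: int = 1, k: int = 16):
        super().__init__()
        # expansion factor of 1 means the latent dimension equals the input dimension
        d_hidden = d_model * expansion_factor
        self.k = k
        self.encoder = nn.Linear(d_model, d_hidden)
        self.decoder = nn.Linear(d_hidden, d_model)

    def forward(self, x: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
        pre = self.encoder(x)
        # keep only the k largest pre-activations per input; zero out the rest
        topk = torch.topk(pre, self.k, dim=-1)
        latents = torch.zeros_like(pre).scatter_(-1, topk.indices, topk.values)
        return self.decoder(latents), latents


# d_model = 512 is the residual-stream width of pythia-70m-deduped
sae = TopKSAE(d_model=512, expansion_factor=1, k=16)
recon, z = sae(torch.randn(2, 512))
```

Here `z` has at most 16 nonzero entries per row, and `recon` matches the input shape.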