# Update README.md
_Fast Inference with Customization:_ Once trained, the ELM model architecture permits flexible inference strategies at runtime depending on the deployment needs. For instance, the ELM model can be _decomposed_ into smaller slices, i.e., smaller (or larger) models can be extracted from the original model to create multiple inference endpoints. Alternatively, the original (single) ELM model can be loaded _as is_ for inference and different slices within the model can be queried directly to power faster inference. This provides an additional level of flexibility for users to make compute/memory tradeoffs depending on their application and runtime needs.
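
As a rough mental model only (this is a generic sketch, not ELM's actual code or API), slicing a decoder-only transformer can be pictured as keeping a prefix of its blocks and serving that smaller copy as its own inference endpoint. The snippet below assumes a standard Hugging Face GPT-2-style model as a stand-in checkpoint; the `extract_slice` helper and the choice of 6 layers are purely illustrative.

```python
# Illustrative sketch of the "slice" idea -- NOT the ELM codebase or its API.
# Assumes a standard Hugging Face GPT-2-style decoder as a stand-in checkpoint;
# `extract_slice` is a hypothetical helper, not part of this repository.
import copy

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer


def extract_slice(model, num_layers: int):
    """Return a smaller copy of `model` that keeps only its first `num_layers` blocks."""
    sliced = copy.deepcopy(model)
    sliced.transformer.h = torch.nn.ModuleList(sliced.transformer.h[:num_layers])
    sliced.config.n_layer = num_layers  # keep the config consistent with the new depth
    return sliced


tokenizer = AutoTokenizer.from_pretrained("gpt2")                 # stand-in checkpoint
full_model = AutoModelForCausalLM.from_pretrained("gpt2").eval()  # stand-in for an ELM model

# "Decompose": a 6-layer slice can be deployed as its own, faster inference endpoint,
# while the full model remains available for higher-quality answers.
small_model = extract_slice(full_model, num_layers=6).eval()

inputs = tokenizer("ELM slices trade quality for latency:", return_tensors="pt")
with torch.no_grad():
    logits = small_model(**inputs).logits        # single forward pass through the slice
next_token_id = int(logits[0, -1].argmax())
print(tokenizer.decode([next_token_id]))
```

In practice a slice would be re-validated (or fine-tuned) before deployment; the sketch only shows the mechanics of extracting a smaller model from a larger one.
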
- **Blog:** [Medium](https://medium.com/sujith-ravi/introducing-elm-efficient-customizable-privacy-preserving-llms-cea56e4f727d)
- **Github:** https://github.com/slicex-ai/elm-v0.1
- **Demo** (try it out): https://huggingface.co/spaces/slicexai/elm-demo-v1
- **HuggingFace** (access ELM Model cards, code & app from HF): https://huggingface.co/slicexai
## ELM-v0.1 Model Release
Models are located in the `models` folder. The ELM models in this repository come in three sizes (elm-1.0, elm-0.75, and elm-0.25) and support the following use cases:
- news_classification