Fix typo: announcment -> announcement
#3
by tomaarsen (HF staff) - opened
README.md CHANGED
@@ -19,6 +19,6 @@ Eagle 7B is a 7.52B parameter model that:
 - All while being an “Attention-Free Transformer”
 - Is a foundation model, with a very small instruct tune - further fine-tuning is required for various use cases!
 
-Find out more at our model announcment: https://blog.rwkv.com/p/eagle-7b-soaring-past-transformers
+Find out more at our model announcement: https://blog.rwkv.com/p/eagle-7b-soaring-past-transformers
 
 Or our wiki: https://wiki.rwkv.com