umarbutler committed • Commit d7d2ef7 • 1 parent: 22ed005

Update README.md
README.md CHANGED

@@ -11,10 +11,30 @@ tags:
 - generated_from_trainer
 datasets:
 - umarbutler/open-australian-legal-corpus
+metrics:
+- perplexity
 widget:
 - text: "Under the Crimes Act"
 - text: "Section 51 of the Constitution provides"
 - text: '"Unsatisfactory professional conduct" includes'
+model-index:
+- name: open-australian-legal-gpt2
+  results:
+  - task:
+      type: text-generation
+      name: Text generation
+    dataset:
+      type: umarbutler/open-australian-legal-qa
+      name: Open Australian Legal QA
+      split: train
+      revision: b53a24f8edf5eb33d033a53b5b53d0a4a220d4ae
+    metrics:
+    - type: perplexity
+      value: 16.37054905087502
+      name: Perplexity
+    source:
+      name: lmppl
+      url: https://github.com/asahi417/lmppl
 ---
 
 # Open Australian Legal GPT2 ⚖️
@@ -28,8 +48,6 @@ To ensure its accessibility to as wide an audience as possible, the model is iss
 
 Those interested in learning more about the model are encouraged to read Umar Butler's accompanying article, [How I built the first open LLM for Australian law](https://umarbutler.com/how-i-built-the-first-open-llm-for-australian-law/).
 
-
-
 A smaller, distilled version of the model trained on the same dataset may be found [here](https://huggingface.co/umarbutler/open-australian-legal-distilgpt2), and a larger model trained on a greater number of Australian legal documents is available [here](https://huggingface.co/umarbutler/open-australian-legal-phi-1_5).
 
 ## Usage 👩‍💻
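The new `model-index` block reports a perplexity of roughly 16.37 on the train split of Open Australian Legal QA (pinned to the revision above), citing [lmppl](https://github.com/asahi417/lmppl) as the source. The snippet below is a minimal sketch of how such a score could be checked; the dataset's text column name and the exact formatting and batching behind the reported figure are not specified in this commit, so they are assumptions here.

```python
# Sketch of scoring perplexity with lmppl on the pinned QA split.
# Assumptions: the column name "text" and the averaging scheme are guesses;
# the commit does not record how the reported 16.37 was computed.
import lmppl
from datasets import load_dataset

# Use the same split and revision declared in the model-index block.
dataset = load_dataset(
    "umarbutler/open-australian-legal-qa",
    split="train",
    revision="b53a24f8edf5eb33d033a53b5b53d0a4a220d4ae",
)

# lmppl.LM wraps causal language models (such as GPT2) and scores raw text.
scorer = lmppl.LM("umarbutler/open-australian-legal-gpt2")

texts = dataset["text"]  # hypothetical column name -- check the dataset card
perplexities = scorer.get_perplexity(texts)

print(sum(perplexities) / len(perplexities))  # mean perplexity over the split
```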