---
license: cc-by-nc-nd-4.0
language:
- en
- de
- ko
library_name: transformers
---
# 🌞🚀 SOLAR-polyglot-4x10.7

A multilingual mixture-of-experts experiment based on my Mixtral collection, [Polyglot](https://huggingface.co/collections/macadeliccc/polyglot-65a2027a90b5e87bcdaa5e12).

![solar](solar-polyglot.png)

The model is proficient in:
+ English
+ German
+ Korean
## 🌅 Code Example

An example with an evaluation script is also available in [Colab](https://colab.research.google.com/drive/10FWCLODU_EFclVOFOlxNYMmSiLilGMBZ?usp=sharing).
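Running the example locally requires `transformers` plus `accelerate` and `bitsandbytes` for the 4-bit loading used below (e.g. `pip install transformers accelerate bitsandbytes`).
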
```python
from transformers import AutoModelForCausalLM, AutoTokenizer


def generate_response(prompt):
    """
    Generate a response from the model based on the input prompt.

    Args:
        prompt (str): Prompt for the model.

    Returns:
        str: The generated response from the model.
    """
    # Tokenize the input prompt and move it to the model's device
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

    # Generate output tokens
    outputs = model.generate(
        **inputs,
        max_new_tokens=512,
        eos_token_id=tokenizer.eos_token_id,
        pad_token_id=tokenizer.pad_token_id,
    )

    # Decode the generated tokens to a string
    response = tokenizer.decode(outputs[0], skip_special_tokens=True)

    return response


# Load the model and tokenizer in 4-bit precision
# (repo id assumed to match this card's title)
model_id = "macadeliccc/SOLAR-polyglot-4x10.7"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, load_in_4bit=True)

prompt = "Explain the proof of Fermat's Last Theorem and its implications in number theory."

print("Response:")
print(generate_response(prompt), "\n")
```
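
Since the card lists English, German, and Korean, the same helper can be reused for multilingual generation. A minimal sketch, reusing `generate_response` and the model loaded above; the prompts are illustrative placeholders, not from the original card:

```python
# Reuse generate_response() and the loaded model/tokenizer from the example above.
prompts = [
    "Explain photosynthesis in simple terms.",               # English
    "Erkläre die Relativitätstheorie in einfachen Worten.",  # German
    "김치의 역사에 대해 설명해 주세요.",                     # Korean
]

for prompt in prompts:
    print(generate_response(prompt), "\n")
```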

## Evaluations

TODO
## 📚 Citations

```bibtex
@misc{kim2023solar,
      title={SOLAR 10.7B: Scaling Large Language Models with Simple yet Effective Depth Up-Scaling},
      author={Dahyun Kim and Chanjun Park and Sanghoon Kim and Wonsung Lee and Wonho Song and Yunsu Kim and Hyeonwoo Kim and Yungi Kim and Hyeonju Lee and Jihoo Kim and Changbae Ahn and Seonghoon Yang and Sukyung Lee and Hyunbyung Park and Gyoungjin Gim and Mikyoung Cha and Hwalsuk Lee and Sunghun Kim},
      year={2023},
      eprint={2312.15166},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```