# PhiMiX-2x2B
PhiMiX-2x2B is a Mixture of Experts (MoE) made with the following models using mergekit:

- cognitivecomputations/dolphin-2_6-phi-2
- rhysjones/phi-2-orange
## ©️ Credits
- mlabonne's phixtral for the PhiConfig and inference code.
- mergekit, which I tweaked (you can find the PhiConfig there) by mainly adding the config to the `mixtral_moe.py` script on the `mixtral` branch.
## ⏱️ Benchmarks
| Model | AGIEval | GPT4All | TruthfulQA | Bigbench | Average |
|---|---|---|---|---|---|
| **PhiMiX-2x2B** | 33.34 | **71.75** | 49.25 | **37.62** | **47.99** |
| phixtral-4x2_8 | 33.91 | 70.44 | 48.78 | 37.68 | 47.7 |
| phixtral-2x2_8 | 34.1 | 70.44 | 48.78 | 37.82 | 47.78 |
| *phi-2-orange* | 33.37 | 71.33 | 49.87 | 37.3 | 47.97 |
| *dolphin-2_6-phi-2* | 33.12 | 69.85 | 47.39 | 37.2 | 46.89 |
I have used **bold** to highlight this merge in the list, *italics* to highlight its base models used in the merge, and **bold** in the cells where the merge exceeds the performance of either base model.
## 🧩 Configuration
```yaml
base_model: rhysjones/phi-2-orange
gate_mode: cheap_embed
dtype: float16
experts:
  - source_model: cognitivecomputations/dolphin-2_6-phi-2
    positive_prompts: ["research, logic, math, science"]
  - source_model: rhysjones/phi-2-orange
    positive_prompts: ["programming, reasoning"]
```
## 💻 Usage
```python
!pip install -qU transformers bitsandbytes accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "paulilioaica/PhiMiX-2x2B"

# trust_remote_code is required to load the custom Phi MoE code.
tokenizer = AutoTokenizer.from_pretrained(model, trust_remote_code=True)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    trust_remote_code=True,
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)

# Phi-2 style instruction format.
prompt = "How many continents are there?"
text = f"Instruct: {prompt}\nOutput:"
outputs = pipeline(text, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
```
Instruct: How many continents are there?
Output: There are seven continents: Africa, Antarctica, Asia, Europe, North America, Australia, and South America. The total number of continents on Earth is seven, including Antarctica, which is sometimes considered part of the continent of Antarctica or as its own continent.
It's important to note that the number of continents in popular education and geography is seven, but some sources may include Antarctica as its own continent, while others include it as part of the continent of Antarctica. Regardless of the exact categorization, there are seven continents that collectively make up the Earth's landmass.
The continents can be divided into several subregions, such as islands, archipelagos, and microcontinents, which are smaller land masses surrounded by water. These subregions can be considered part of the continents or their own unique entities, depending on the context.
Each continent has its own unique geography, climate, flora, fauna, and human cultures. The continents are interconnected through various landforms, bodies of water, and global trade routes.
In summary, there are seven continents on Earth, each with its own distinct characteristics and unique contributions to the world's diversity. While the number may vary depending on the categorization of Antarctica, all seven continents together make
```
## ♻️ Replicate this repo
Beware: this will only work with two Phis; you might have to tinker with the naming scheme for more layers.

After all the file modifications and the merge run, you need to replace `config.json` with the one from this repo. After that, you need to add `modeling_phi.py` and `configuration_phi.py` from this repo to your repo.
### Steps

- Modify `mixtral_moe.py` at `/content/mergekit/mergekit/scripts/mixtral_moe.py` to point to your HF repo.
- Modify `architecture.py` at `/content/mergekit/mergekit/architecture.py` (you can take this from the commit I link in the description).
- Replace `config.json` with the one from this repo.
- Add `modeling_phi.py` and `configuration_phi.py` from this repo to your repo (see the upload sketch after this list).
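For the last two steps, a minimal upload sketch using `huggingface_hub` (the repo id is a placeholder, and this assumes you have already downloaded `config.json`, `modeling_phi.py`, and `configuration_phi.py` from this repo into your working directory):

```python
from huggingface_hub import HfApi

repo_id = "your-username/your-phi-moe"  # placeholder: your merged model repo

api = HfApi()
# Overwrite config.json and add the remote-code files, mirroring the steps above.
for filename in ["config.json", "modeling_phi.py", "configuration_phi.py"]:
    api.upload_file(
        path_or_fileobj=filename,
        path_in_repo=filename,
        repo_id=repo_id,
    )
```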