microsoft/Phi-3-mini-4k-instruct
Tags: Text Generation · Transformers · Safetensors · English · French · phi3 · nlp · code · conversational · custom_code · text-generation-inference · Inference Endpoints
License: MIT
9 contributors · History: 11 commits
Latest commit 4eea1a7 (verified, 6 months ago) by gugarosa: fix(tokenizer_config): Adjusts `rstrip` of special tokens. (#53)
| File | Size | Last commit message | Last updated |
|---|---|---|---|
| .gitattributes | 1.52 kB | initial commit | 7 months ago |
| CODE_OF_CONDUCT.md | 444 Bytes | chore(root): Initial files upload. | 7 months ago |
| LICENSE | 1.08 kB | chore(root): Initial files upload. | 7 months ago |
| NOTICE.md | 1.77 kB | chore(root): Initial files upload. | 7 months ago |
| README.md | 17.2 kB | Update README.md | 6 months ago |
| SECURITY.md | 2.66 kB | chore(root): Initial files upload. | 7 months ago |
| added_tokens.json | 293 Bytes | fix(readme): Adds information about placeholder tokens. | 7 months ago |
| config.json | 904 Bytes | chore(root): Initial files upload. | 7 months ago |
| configuration_phi3.py | 10.4 kB | chore(root): Initial files upload. | 7 months ago |
| generation_config.json | 172 Bytes | chore(root): Initial files upload. | 7 months ago |
| model-00001-of-00002.safetensors | 4.97 GB (LFS) | chore(root): Initial files upload. | 7 months ago |
| model-00002-of-00002.safetensors | 2.67 GB (LFS) | chore(root): Initial files upload. | 7 months ago |
| model.safetensors.index.json | 16.3 kB | chore(root): Initial files upload. | 7 months ago |
| modeling_phi3.py | 73.8 kB | chore(root): Initial files upload. | 7 months ago |
| sample_finetune.py | 6.34 kB | Update docstrings | 7 months ago |
| special_tokens_map.json | 568 Bytes | chore(root): Initial files upload. | 7 months ago |
| tokenizer.json | 1.84 MB | chore(root): Initial files upload. | 7 months ago |
| tokenizer.model | 500 kB (LFS) | chore(root): Initial files upload. | 7 months ago |
| tokenizer_config.json | 3.17 kB | fix(tokenizer_config): Adjusts `rstrip` of special tokens. (#53) | 6 months ago |
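The latest commit adjusts `rstrip` handling for the special tokens defined in `added_tokens.json` and `tokenizer_config.json`. As a rough illustration of how those tokens are used, below is a minimal sketch of assembling a chat prompt by hand, assuming the common Phi-3 `<|user|>` / `<|assistant|>` / `<|end|>` template; `build_phi3_prompt` is a hypothetical helper, not part of this repo, and in practice `tokenizer.apply_chat_template` should be preferred since it reads the repo's own template.

```python
# Hand-rolled Phi-3-style chat prompt (illustrative only).
# Assumption: the <|user|>/<|assistant|>/<|end|> special tokens from
# added_tokens.json delimit turns, each turn ending with <|end|>.

def build_phi3_prompt(messages):
    """Render a list of {"role", "content"} dicts into one prompt string."""
    parts = []
    for m in messages:
        parts.append(f"<|{m['role']}|>\n{m['content']}<|end|>\n")
    parts.append("<|assistant|>\n")  # cue the model to generate a reply
    return "".join(parts)

prompt = build_phi3_prompt([{"role": "user", "content": "What is 2 + 2?"}])
```

For actual inference, the `custom_code` tag indicates the repo ships its own modeling files (`modeling_phi3.py`, `configuration_phi3.py`), so loading via `AutoModelForCausalLM.from_pretrained("microsoft/Phi-3-mini-4k-instruct", trust_remote_code=True)` may be required on older Transformers versions.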