FINGU-AI / Fingu-M-v1
Tags: Sentence Similarity, sentence-transformers, Safetensors, qwen2, feature-extraction, Generated from Trainer, dataset_size:693000, loss:MatryoshkaLoss, loss:MultipleNegativesRankingLoss, custom_code, Eval Results, text-embeddings-inference, Inference Endpoints

Papers: arxiv:1908.10084, arxiv:2205.13147, arxiv:1705.00652
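The sentence-transformers and custom_code tags above suggest the model is meant to be loaded through the Sentence Transformers library. A minimal sketch, assuming the repo id FINGU-AI/Fingu-M-v1 resolves on the Hub and that trust_remote_code=True is needed because of the custom_code tag; the example sentences and the cosine helper are illustrative only:

```python
# Sketch: loading FINGU-AI/Fingu-M-v1 for sentence similarity.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two 1-D embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


if __name__ == "__main__":
    # trust_remote_code=True is assumed from the repo's custom_code tag;
    # note the download is large (model.safetensors is 3.09 GB).
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("FINGU-AI/Fingu-M-v1", trust_remote_code=True)
    embeddings = model.encode([
        "What is the market outlook?",
        "How will the market perform?",
    ])
    print(cosine_similarity(embeddings[0], embeddings[1]))
```

Since the model was trained with MatryoshkaLoss, its embeddings are presumably usable at truncated dimensionalities as well, trading accuracy for storage.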
Fingu-M-v1 (branch: main)
1 contributor (FINGU-AI) · History: 2 commits
Latest commit: 2a0cf32 (verified), "Upload folder using huggingface_hub", 4 months ago
All entries below were added in the "Upload folder using huggingface_hub" commit 4 months ago, except .gitattributes, which dates to the initial commit.

Folders:
1_Pooling/
2_Dense/

File                                 Scan                 Size
.gitattributes                       Safe                 1.52 kB
README.md                            Safe                 21.3 kB
added_tokens.json                    Safe                 80 Bytes
config.json                          Safe                 1.01 kB
config_sentence_transformers.json    Safe                 397 Bytes
merges.txt                           Safe                 1.67 MB
model.safetensors                    Safe (LFS)           3.09 GB
modules.json                         Safe                 341 Bytes
optimizer.pt                         Safe, pickle (LFS)   6.19 GB
rng_state_0.pth                      pickle (LFS)         15 kB
rng_state_1.pth                      pickle (LFS)         15 kB
rng_state_2.pth                      pickle (LFS)         15 kB
rng_state_3.pth                      pickle (LFS)         15 kB
scheduler.pt                         Safe, pickle (LFS)   1.06 kB
sentence_bert_config.json            Safe                 54 Bytes
special_tokens_map.json              Safe                 370 Bytes
tokenizer.json                       Safe                 7.03 MB
tokenizer_config.json                Safe                 1.38 kB
trainer_state.json                   Safe                 6.81 kB
training_args.bin                    pickle (LFS)         5.37 kB
vocab.json                           Safe                 2.78 MB

Detected pickle imports:
optimizer.pt (4): torch.BFloat16Storage, collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.FloatStorage
rng_state_0.pth through rng_state_3.pth (7 each, identical set): _codecs.encode, numpy.core.multiarray._reconstruct, numpy.ndarray, torch._utils._rebuild_tensor_v2, torch.ByteStorage, collections.OrderedDict, numpy.dtype
scheduler.pt: no problematic imports detected
training_args.bin (11): transformers.trainer_pt_utils.AcceleratorConfig, sentence_transformers.training_args.SentenceTransformerTrainingArguments, torch.device, accelerate.utils.dataclasses.DistributedType, sentence_transformers.training_args.MultiDatasetBatchSamplers, transformers.trainer_utils.SchedulerType, transformers.training_args.OptimizerNames, accelerate.state.PartialState, transformers.trainer_utils.HubStrategy, sentence_transformers.training_args.BatchSamplers, transformers.trainer_utils.IntervalStrategy
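Several of the checkpoint files above (optimizer.pt, the rng_state_*.pth files, training_args.bin) are Python pickles, and unpickling executes the constructors named in their GLOBAL/STACK_GLOBAL opcodes, which is why the Hub lists their imports. A rough sketch of that kind of scan using only the standard library's pickletools; this is a simplified heuristic, not the Hub's actual scanner, and the OrderedDict example stands in for a real checkpoint:

```python
import pickle
import pickletools
from collections import OrderedDict


def detect_pickle_imports(data: bytes) -> set:
    """Collect the module.name pairs a pickle would import when loaded,
    without actually unpickling it."""
    imports = set()
    prev_strings = []  # recent string arguments, used to resolve STACK_GLOBAL
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # Protocols <= 3: the argument is "module name" in one string.
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif opcode.name == "STACK_GLOBAL" and len(prev_strings) >= 2:
            # Protocols >= 4: module and name were pushed as the two
            # preceding string opcodes.
            imports.add(f"{prev_strings[-2]}.{prev_strings[-1]}")
        if isinstance(arg, str):
            prev_strings.append(arg)
    return imports


data = pickle.dumps(OrderedDict(a=1))
print(detect_pickle_imports(data))  # expect collections.OrderedDict to appear
```

Files flagged "Safe" or "No problematic imports detected" above reference only allow-listed constructors; model.safetensors avoids the issue entirely, since the safetensors format stores raw tensors and cannot execute code on load.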