BERT base model (uncased)

Model description

This model was pretrained on English text using a masked language modeling (MLM) objective. It was introduced in the paper BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding and first released in the google-research/bert repository. This model is uncased: it does not make a difference between english and English.
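To illustrate the uncased behaviour, the tokenizer lowercases text during normalization, so differently cased spellings map to the same token ids. A minimal sketch using the standard bert-base-uncased tokenizer:

from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# The uncased tokenizer lowercases input before encoding,
# so "english" and "English" produce identical input ids.
assert tokenizer("english")["input_ids"] == tokenizer("English")["input_ids"]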

Original implementation

The original implementation is available in the google-research/bert repository at https://github.com/google-research/bert.

How to use

Download the model by cloning the repository via git clone https://huggingface.co/OWG/bert-base-uncased. Since the ONNX weights are large binary files stored with Git LFS, make sure git-lfs is installed before cloning.

Then you can use the model with the following code:

from onnxruntime import InferenceSession, SessionOptions, GraphOptimizationLevel
from transformers import BertTokenizer


tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# Enable all graph optimizations (constant folding, node fusion, ...).
options = SessionOptions()
options.graph_optimization_level = GraphOptimizationLevel.ORT_ENABLE_ALL

session = InferenceSession("path/to/model.onnx", sess_options=options)
# Raise an error instead of silently falling back to another execution provider.
session.disable_fallback()

text = "Replace me by any text you want to encode."
# Return NumPy arrays directly; ONNX Runtime consumes NumPy, not torch tensors.
encoding = tokenizer(text, return_tensors="np", return_attention_mask=True)

# ONNX Runtime expects a dict mapping graph input names to NumPy arrays.
inputs = dict(encoding)
output_name = session.get_outputs()[0].name

outputs = session.run(output_names=[output_name], input_feed=inputs)
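The call above returns raw logits. As a minimal sketch of turning them into a fill-mask prediction, assuming the exported graph's first output is token-level logits of shape (batch_size, sequence_length, vocab_size):

import numpy as np

text = "The capital of France is [MASK]."
encoding = tokenizer(text, return_tensors="np", return_attention_mask=True)
logits = session.run(output_names=[output_name], input_feed=dict(encoding))[0]

# Locate the [MASK] position and take the highest-scoring vocabulary id there.
mask_position = int(np.argmax(encoding["input_ids"][0] == tokenizer.mask_token_id))
predicted_id = int(logits[0, mask_position].argmax())
print(tokenizer.decode([predicted_id]))  # expected to print something like "paris"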

Datasets used to train OWG/bert-base-uncased

Like the original bert-base-uncased checkpoint this model was converted from, it was pretrained on the bookcorpus and wikipedia datasets.