---
license: apache-2.0
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - accuracy
  - f1
language:
  - en
widget:
  - text: >-
      Broadcom agreed to acquire cloud computing company VMware in a $61 billion
      (€57bn) cash-and-stock deal, massively diversifying the chipmaker’s
      business and almost tripling its software-related revenue to about 45% of
      its total sales. By the numbers: VMware shareholders will receive either
      $142.50 in cash or 0.2520 of a Broadcom share for each VMware stock.
      Broadcom will also assume $8 billion of VMware's net debt.
  - text: >-
      Canadian Natural Resources Minister Jonathan Wilkinson told Bloomberg that
      the country could start supplying Europe with liquefied natural gas (LNG)
      in as soon as three years by converting an existing LNG import facility on
      Canada’s Atlantic coast into an export terminal. Bottom line: Wilkinson
      said what Canada cares about is that the new LNG facility uses a
      low-emission process for the gas and is capable of transitioning to
      exporting hydrogen later on.
  - text: >-
      Google is being investigated by the UK’s antitrust watchdog for its
      dominance in the "ad tech stack," the set of services that facilitate the
      sale of online advertising space between advertisers and sellers. Google
      has strong positions at various levels of the ad tech stack and charges
      fees to both publishers and advertisers. A step back: UK Competition and
      Markets Authority has also been investigating whether Google and Meta
      colluded over ads, probing into the advertising agreement between the two
      companies, codenamed Jedi Blue.
  - text: >-
      Shares in Twitter closed 6.35% up after an SEC 13D filing revealed that
      Elon Musk pledged to put up an additional $6.25 billion of his own wealth
      to fund the $44 billion takeover deal, lifting the total to $33.5 billion
      from an initial $27.25 billion. In other news: Former Twitter CEO Jack
      Dorsey announced he's stepping down, but would stay on Twitter’s board
      "until his term expires at the 2022 meeting of stockholders."
model-index:
  - name: bert-keyword-discriminator
    results: []
---

# bert-keyword-discriminator

This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.1310
- Precision: 0.8522
- Recall: 0.8868
- Accuracy: 0.9732
- F1: 0.8692
- Ent/precision: 0.8874
- Ent/accuracy: 0.9246
- Ent/f1: 0.9056
- Con/precision: 0.8011
- Con/accuracy: 0.8320
- Con/f1: 0.8163
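As a quick sanity check, the reported F1 scores are the harmonic means of the corresponding precision figures and the figures in the "accuracy" slots (which therefore appear to be recall values for the per-class Ent/Con metrics):

```python
# Verify the reported F1 scores against the harmonic-mean identity
# F1 = 2 * P * R / (P + R), using the numbers from this card.

def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

print(round(f1_score(0.8522, 0.8868), 4))  # 0.8692 — matches the overall F1
print(round(f1_score(0.8874, 0.9246), 4))  # 0.9056 — matches Ent/f1
print(round(f1_score(0.8011, 0.8320), 4))  # 0.8163 — matches Con/f1
```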

## Model description

More information needed

## Intended uses & limitations

More information needed
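The split into Ent/* and Con/* metrics above suggests the model tags keyword beginnings and continuations separately (a BIO-style scheme). As a minimal sketch of how token-level predictions could be merged back into keyword phrases — the label names `B-KEY`/`I-KEY` are an assumption, not documented in this card:

```python
# Hedged sketch: grouping begin/continuation tags into keyword phrases.
# The label names below (B-KEY, I-KEY) are assumptions; the card does not
# document the actual tag set.

def merge_keywords(tokens, tags, begin="B-KEY", inside="I-KEY"):
    """Group consecutive begin/inside tags into keyword phrases."""
    phrases, current = [], []
    for token, tag in zip(tokens, tags):
        if tag == begin:  # a new keyword starts here
            if current:
                phrases.append(" ".join(current))
            current = [token]
        elif tag == inside and current:  # continuation of the current keyword
            current.append(token)
        else:  # token outside any keyword: flush what we have
            if current:
                phrases.append(" ".join(current))
            current = []
    if current:
        phrases.append(" ".join(current))
    return phrases

tokens = ["Broadcom", "to", "acquire", "cloud", "computing", "company", "VMware"]
tags   = ["B-KEY",    "O",  "O",       "B-KEY", "I-KEY",     "O",       "B-KEY"]
print(merge_keywords(tokens, tags))  # ['Broadcom', 'cloud computing', 'VMware']
```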

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
- mixed_precision_training: Native AMP
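These settings imply a learning rate that decays linearly from 2e-05 toward zero over the 15,000 optimizer steps of the run (1,875 steps per epoch × 8 epochs, per the results table below). A small sketch of that schedule, assuming no warmup steps (the card does not specify any):

```python
# Linear LR decay implied by the hyperparameters above.
# Assumption: zero warmup steps, which this card does not specify.

BASE_LR = 2e-05
TOTAL_STEPS = 15000  # 1875 steps/epoch * 8 epochs, from the results table

def linear_lr(step: int) -> float:
    """Learning rate after `step` optimizer steps under linear decay to 0."""
    return BASE_LR * max(0.0, (TOTAL_STEPS - step) / TOTAL_STEPS)

print(linear_lr(0))      # 2e-05 at the start of training
print(linear_lr(7500))   # 1e-05 halfway through (end of epoch 4)
print(linear_lr(15000))  # 0.0 at the final step
```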

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Precision | Recall | Accuracy | F1     | Ent/precision | Ent/accuracy | Ent/f1 | Con/precision | Con/accuracy | Con/f1 |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:--------:|:------:|:-------------:|:------------:|:------:|:-------------:|:------------:|:------:|
| 0.1744        | 1.0   | 1875  | 0.1261          | 0.7176    | 0.7710 | 0.9494   | 0.7433 | 0.7586        | 0.8503       | 0.8018 | 0.6514        | 0.6561       | 0.6537 |
| 0.1261        | 2.0   | 3750  | 0.1041          | 0.7742    | 0.8057 | 0.9600   | 0.7896 | 0.8083        | 0.8816       | 0.8433 | 0.7185        | 0.6957       | 0.7070 |
| 0.0878        | 3.0   | 5625  | 0.0979          | 0.8176    | 0.8140 | 0.9655   | 0.8158 | 0.8518        | 0.8789       | 0.8651 | 0.7634        | 0.7199       | 0.7410 |
| 0.0625        | 4.0   | 7500  | 0.0976          | 0.8228    | 0.8643 | 0.9696   | 0.8430 | 0.8515        | 0.9182       | 0.8836 | 0.7784        | 0.7862       | 0.7823 |
| 0.0456        | 5.0   | 9375  | 0.1047          | 0.8304    | 0.8758 | 0.9704   | 0.8525 | 0.8758        | 0.9189       | 0.8968 | 0.7655        | 0.8133       | 0.7887 |
| 0.0342        | 6.0   | 11250 | 0.1207          | 0.8363    | 0.8887 | 0.9719   | 0.8617 | 0.8719        | 0.9274       | 0.8988 | 0.7846        | 0.8327       | 0.8080 |
| 0.0256        | 7.0   | 13125 | 0.1241          | 0.8480    | 0.8892 | 0.9731   | 0.8681 | 0.8791        | 0.9299       | 0.9038 | 0.8019        | 0.8302       | 0.8158 |
| 0.0205        | 8.0   | 15000 | 0.1310          | 0.8522    | 0.8868 | 0.9732   | 0.8692 | 0.8874        | 0.9246       | 0.9056 | 0.8011        | 0.8320       | 0.8163 |

### Framework versions

- Transformers 4.19.2
- Pytorch 1.11.0+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1