Raj-Sanjay-Shah committed
Commit 4b55848
1 Parent(s): 0d7b0a8

Update README.md

Files changed (1)
  1. README.md +37 -3
README.md CHANGED
@@ -4,14 +4,48 @@ language: "en"
 tags:
 - Financial Language Modelling
 widget:
-- text: "Stocks rallied and the British pound [MASK]."
+- text: "Stocks rallied and the British pound <mask>."
 ---
+## FLANG
+FLANG is a set of large language models for Financial LANGuage tasks. These models use domain specific pre-training with preferential masking to build more robust representations for the domain. The models in the set are:\
+[FLANG-BERT](https://huggingface.co/SALT-NLP/FLANG-BERT)\
+[FLANG-SpanBERT](https://huggingface.co/SALT-NLP/FLANG-SpanBERT)\
+[FLANG-DistilBERT](https://huggingface.co/SALT-NLP/FLANG-DistilBERT)\
+[FLANG-Roberta](https://huggingface.co/SALT-NLP/FLANG-Roberta)\
+[Flang-ELECTRA](https://huggingface.co/SALT-NLP/FLANG-ELECTRA)
 
+## FLANG-DistilBERT
 FLANG-DistilBERT is a pre-trained language model which uses financial keywords and phrases for preferential masking of domain specific terms. It is built by further training the DistilBERT language model in the finance domain with improved performance over previous models due to the use of domain knowledge and vocabulary.
+## Citation
+Please cite the model with the following citation:
+```bibtex
+@INPROCEEDINGS{shah-etal-2022-flang,
+author = {Shah, Raj Sanjay and
+Chawla, Kunal and
+Eidnani, Dheeraj and
+Shah, Agam and
+Du, Wendi and
+Chava, Sudheer and
+Raman, Natraj and
+Smiley, Charese and
+Chen, Jiaao and
+Yang, Diyi},
+title = {When FLUE Meets FLANG: Benchmarks and Large Pretrained Language Model for Financial Domain},
+booktitle = {Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
+year = {2022},
+publisher = {Association for Computational Linguistics}
+}
+```
+
+## Contact information
+Please contact Raj Sanjay Shah (rajsanjayshah[at]gatech[dot]edu) or Sudheer Chava (schava6[at]gatech[dot]edu) or Diyi Yang (diyiy[at]stanford[dot]edu) about any FLANG-DistilBERT related issues and questions.
+
+
+---
+license: afl-3.0
+---
 
-Contact information
 
-Please contact Raj Sanjay Shah (rajsanjayshah[at]gatech[dot]edu) or Sudheer Chava (schava6[at]gatech[dot]edu) or Diyi Yang (diyiy[at]stanford[dot]edu) about any FLANG-BERT related issues and questions.
 
 
 ---
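
The preferential masking the README describes — masking financial keywords and phrases at a higher rate than ordinary tokens during continued pre-training — can be sketched in a few lines. This is a toy illustration, not the released training code: the keyword set, the masking probabilities, and the `preferential_mask` function name are all assumptions made here for clarity.

```python
import random

# Illustrative stand-in for FLANG's financial lexicon (assumption: the
# actual keyword/phrase list used for pre-training is far larger).
FINANCIAL_KEYWORDS = {"stocks", "pound", "rallied", "dividend", "equity"}

def preferential_mask(tokens, keyword_prob=0.3, base_prob=0.15, seed=0):
    """Replace tokens with "[MASK]", selecting financial keywords at a
    higher rate than other tokens. The two rates are hypothetical; only
    the keyword-vs-base asymmetry reflects the idea in the README."""
    rng = random.Random(seed)
    out = []
    for tok in tokens:
        is_keyword = tok.lower().strip(".,") in FINANCIAL_KEYWORDS
        p = keyword_prob if is_keyword else base_prob
        out.append("[MASK]" if rng.random() < p else tok)
    return out

tokens = "Stocks rallied and the British pound fell .".split()
print(preferential_mask(tokens))
```

Averaged over many seeds, domain terms such as "pound" end up masked roughly twice as often as function words such as "the", which is what biases the model toward learning robust representations of in-domain vocabulary.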