Raj-Sanjay-Shah committed
Commit af6dfad
Parent: 6a6e179

Update README.md

Files changed (1):
  1. README.md +23 -3
README.md CHANGED
@@ -6,7 +6,7 @@ tags:
 widget:
 - text: "Stocks rallied and the British pound <mask>."
 ---
- ##FLANG
+ ## FLANG
 FLANG is a set of large language models for Financial LANGuage tasks. These models use domain-specific pre-training with preferential masking to build more robust representations for the domain. The models in the set are:\
 [FLANG-BERT](https://huggingface.co/SALT-NLP/FLANG-BERT)\
 [FLANG-SpanBERT](https://huggingface.co/SALT-NLP/FLANG-SpanBERT)\
@@ -14,11 +14,31 @@ FLANG is a set of large language models for Financial LANGuage tasks. These mode
 [FLANG-Roberta](https://huggingface.co/SALT-NLP/FLANG-Roberta)\
 [FLANG-ELECTRA](https://huggingface.co/SALT-NLP/FLANG-ELECTRA)
 
- ##FLANG-ELECTRA
+ ## FLANG-ELECTRA
 FLANG-ELECTRA is a pre-trained language model that uses financial keywords and phrases for preferential masking of domain-specific terms. It is built by further training the ELECTRA language model on finance-domain text, and it improves on previous models through its use of domain knowledge and vocabulary.
 
- Contact information
+ ## Citation
+ Please cite the model with the following citation:
+ ```bibtex
+ @INPROCEEDINGS{shah-etal-2022-flang,
+   author = {Shah, Raj Sanjay and
+             Chawla, Kunal and
+             Eidnani, Dheeraj and
+             Shah, Agam and
+             Du, Wendi and
+             Chava, Sudheer and
+             Raman, Natraj and
+             Smiley, Charese and
+             Chen, Jiaao and
+             Yang, Diyi},
+   title = {When FLUE Meets FLANG: Benchmarks and Large Pretrained Language Model for Financial Domain},
+   booktitle = {Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
+   year = {2022},
+   publisher = {Association for Computational Linguistics}
+ }
+ ```
 
+ ## Contact information
 Please contact Raj Sanjay Shah (rajsanjayshah[at]gatech[dot]edu) or Sudheer Chava (schava6[at]gatech[dot]edu) or Diyi Yang (diyiy[at]stanford[dot]edu) about any FLANG-ELECTRA related issues and questions.
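
For reference, the widget line in the README front matter corresponds to a fill-mask query against the published checkpoint. Below is a minimal sketch of running the same query locally with the Hugging Face Transformers `pipeline`; it assumes the `SALT-NLP/FLANG-ELECTRA` checkpoint exposes a fill-mask head, as the widget suggests, and it reads the mask token from the tokenizer rather than hard-coding `<mask>`:

```python
# Minimal sketch (not part of the commit): querying FLANG-ELECTRA the way
# the model card's fill-mask widget does. Assumes the checkpoint exposes a
# fill-mask head; the model id is taken from the links in the README.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="SALT-NLP/FLANG-ELECTRA")

# Use the tokenizer's own mask token instead of hard-coding "<mask>", since
# the token differs by tokenizer ("[MASK]" for BERT/ELECTRA-style vocab,
# "<mask>" for RoBERTa-style vocab).
mask = fill_mask.tokenizer.mask_token
for pred in fill_mask(f"Stocks rallied and the British pound {mask}."):
    # Each prediction carries the candidate token and its probability score.
    print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```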