Eiki committed
Commit 4295191
1 Parent(s): 1bf4c99

Update README.md

Files changed (1)
  1. README.md +19 -8
README.md CHANGED
@@ -10,7 +10,7 @@ license: cc-by-nc-4.0
 # TaCOMET_ja
 
 This is the Japanese TaCOMET model: [COMET](https://huggingface.co/nlp-waseda/comet-gpt2-xl-japanese) finetuned on the Japanese version of [TimeATOMIC](https://github.com/nlp-waseda/TaCOMET) with a causal language modeling (CLM) objective.
-The data and this model are introduced in [this paper](https://www.anlp.jp/proceedings/annual_meeting/2024/pdf_dir/P3-19.pdf) and LREC-COLING2024 (TBA).
+The data and this model are introduced in [this paper](https://www.anlp.jp/proceedings/annual_meeting/2024/pdf_dir/P3-19.pdf) and [LREC-COLING2024](https://aclanthology.org/2024.lrec-main.1405/).
 
 ### Preprocessing
 
@@ -19,12 +19,23 @@ The texts are segmented into words using Juman++ and tokenized using SentencePie
 ### BibTeX entry and citation info
 
 ```bibtex
-@InProceedings{murata_nlp2024_tacomet,
-    author = "村田栄樹 and 河原大輔",
-    title = "TaCOMET: 時間を考慮したイベント常識生成モデル",
-    booktitle = "言語処理学会第30回年次大会",
-    year = "2024",
-    url = "https://www.anlp.jp/proceedings/annual_meeting/2024/pdf_dir/P3-19.pdf"
-    note = "in Japanese"
+@inproceedings{murata-kawahara-2024-time-aware,
+    title = "Time-aware {COMET}: A Commonsense Knowledge Model with Temporal Knowledge",
+    author = "Murata, Eiki and
+      Kawahara, Daisuke",
+    editor = "Calzolari, Nicoletta and
+      Kan, Min-Yen and
+      Hoste, Veronique and
+      Lenci, Alessandro and
+      Sakti, Sakriani and
+      Xue, Nianwen",
+    booktitle = "Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)",
+    month = may,
+    year = "2024",
+    address = "Torino, Italia",
+    publisher = "ELRA and ICCL",
+    url = "https://aclanthology.org/2024.lrec-main.1405",
+    pages = "16162--16174",
+    abstract = "To better handle commonsense knowledge, which is difficult to acquire in ordinary training of language models, commonsense knowledge graphs and commonsense knowledge models have been constructed. The former manually and symbolically represents commonsense, and the latter stores these graphs{'} knowledge in the models{'} parameters. However, the existing commonsense knowledge models that deal with events do not consider granularity or time axes. In this paper, we propose a time-aware commonsense knowledge model, TaCOMET. The construction of TaCOMET consists of two steps. First, we create TimeATOMIC using ChatGPT, which is a commonsense knowledge graph with time. Second, TaCOMET is built by continually finetuning an existing commonsense knowledge model on TimeATOMIC. TimeATOMIC and continual finetuning let the model make more time-aware generations with rich commonsense than the existing commonsense models. We also verify the applicability of TaCOMET on a robotic decision-making task. TaCOMET outperformed the existing commonsense knowledge model when proper times are input. Our dataset and models will be made publicly available.",
 }
 ```
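
For context on the CLM setup the README describes, here is a minimal, hypothetical sketch of how a COMET-style prompt is assembled before generation. The README does not specify the exact input layout used by TaCOMET_ja; the `head relation [GEN]` format, the `[GEN]` token, and the example relation `xEffect` below are assumptions borrowed from the original COMET line of work, shown only to illustrate the idea.

```python
# Hypothetical sketch of a COMET-style prompt for a causal LM.
# NOTE: the concrete format used by TaCOMET_ja is an assumption here,
# not documented in this README.

def build_prompt(head: str, relation: str, gen_token: str = "[GEN]") -> str:
    """Concatenate a head event and a relation; the finetuned model is
    expected to continue the sequence with the tail event."""
    return f"{head} {relation} {gen_token}"

# In the actual pipeline, this string would be segmented with Juman++
# and tokenized with SentencePiece (see the "Preprocessing" section)
# before being fed to the finetuned GPT-2 model.
print(build_prompt("Xが朝食を食べる", "xEffect"))
```

The segmentation and tokenization steps are deliberately left out of the sketch, since they depend on the Juman++ and SentencePiece setup of the base COMET checkpoint.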