---
language:
- ja
library_name: transformers
widget:
- text: X が 部屋 で ゲームするxNeed 1 分 前 、
---

# TaCOMET_ja

This is the Japanese TaCOMET model, a COMET model finetuned on the Japanese version of [TimeATOMIC](https://github.com/nlp-waseda/TaCOMET) with a causal language modeling (CLM) objective.
The data and this model are introduced in [this paper](https://www.anlp.jp/proceedings/annual_meeting/2024/pdf_dir/P3-19.pdf).
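
Since this is a standard `transformers` causal language model, it can be loaded with `AutoTokenizer` and `AutoModelForCausalLM`. The snippet below is only a minimal sketch, not the authors' reference code: the repository ID is a placeholder to be replaced with this repo's actual name, and the prompt reuses the widget example above verbatim.

```python
# Minimal generation sketch (assumptions: the repo ID below is a placeholder,
# and greedy decoding with 16 new tokens is an arbitrary choice).
from transformers import AutoTokenizer, AutoModelForCausalLM

repo_id = "nlp-waseda/tacomet-ja"  # placeholder; use this repository's actual ID

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Prompt taken verbatim from the widget example in this card.
prompt = "X が 部屋 で ゲームするxNeed 1 分 前 、"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```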

### Preprocessing

The texts are segmented into words using Juman++ and tokenized using SentencePiece.
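
For reference, the sketch below shows one common way to apply this segmentation step in Python via `pyknp` (a Juman++ binding); `pyknp` itself is not mentioned in this card and is only an assumed choice, and a local Juman++ installation is required. The whitespace-joined surface forms are then passed to the model's SentencePiece-based tokenizer.

```python
# Sketch of Juman++ word segmentation via pyknp (assumed binding; requires a
# local Juman++ installation). The segmented string is then given to the
# SentencePiece-based tokenizer shipped with the model.
from pyknp import Juman

jumanpp = Juman()
raw_text = "Xが部屋でゲームする"  # hypothetical unsegmented input
segmented = " ".join(m.midasi for m in jumanpp.analysis(raw_text).mrph_list())
print(segmented)  # whitespace-separated surface forms
```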

### BibTeX entry and citation info

```bibtex
@InProceedings{murata_nlp2023_tacomet,
    author    = "村田栄樹 and 河原大輔",
    title     = "TaCOMET: 時間を考慮したイベント常識生成モデル",
    booktitle = "言語処理学会第30回年次大会",
    year      = "2024",
    url       = "https://www.anlp.jp/proceedings/annual_meeting/2024/pdf_dir/P3-19.pdf",
    note      = "in Japanese"
}
```