---
license: apache-2.0
---

## Model Card: dstc11-simmc2.1-iflytek

|
## Recent Update

- 2022.10.10: The repository `dstc11-simmc2.1-iflytek` for [DSTC11 Track1](https://github.com/facebookresearch/simmc2) was created.

|
## Overview

The [SIMMC2.1](https://github.com/facebookresearch/simmc2) challenge aims to lay the foundations for real-world assistant agents that can handle multimodal inputs and perform multimodal actions. It comprises four tasks: Ambiguous Candidate Identification, Multimodal Coreference Resolution, Multimodal Dialog State Tracking, and Response Generation. We treat the joint input of the textual context, tokenized objects, and the scene as the multimodal input, and compare the performance of single-task training against multi-task joint training.

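To make the serialized multimodal input concrete, the sketch below flattens a dialogue turn, tokenized objects, and a scene identifier into one text sequence. The `<OBJ>`/`<SCENE>` markers and the field order are illustrative assumptions, not the exact format used by our models.

```python
def build_multimodal_input(dialog_context, object_tokens, scene_id):
    """Flatten dialogue text, tokenized objects, and the scene id into a
    single sequence for a text-to-text model.
    The <OBJ>/<SCENE> markers are illustrative, not the actual format."""
    objects = " ".join(f"<OBJ>{tok}</OBJ>" for tok in object_tokens)
    return f"{dialog_context} <SCENE>{scene_id}</SCENE> {objects}"

example = build_multimodal_input(
    "User: Do you have that jacket in red?",
    ["obj_12", "obj_47"],
    "cloth_store_1416238",
)
```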
|
## Model Date

The model was originally released in October 2022.

|
## Model Type

The **mt-bart**, **mt-bart-sys**, and **mt-bart-sys-nvattr** models share the same framework (a Transformer with a multi-task head) and are fine-tuned on [SIMMC2.1](https://github.com/facebookresearch/simmc2) from the pretrained [BART-Large](https://huggingface.co/facebook/bart-large) model. This [repository](https://github.com/scutcyr/dstc11-simmc2.1-iflytek) also contains the code to fine-tune the models.

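A multi-task head, in the sense used above, is a set of task-specific output layers applied to one shared encoder representation. The sketch below illustrates the idea with random placeholder weights and a toy hidden size; the task names and output dimensions are illustrative assumptions, not the repository's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden_size = 16  # BART-Large actually uses 1024; kept small for illustration

# Stand-in for the shared encoder output at one position
shared_repr = rng.standard_normal(hidden_size)

# One linear head per task; weights here are random placeholders
heads = {
    "ambiguous_candidates": rng.standard_normal((2, hidden_size)),
    "coreference": rng.standard_normal((2, hidden_size)),
    "dialog_state": rng.standard_normal((4, hidden_size)),
}

# Each head projects the same shared representation to its own logits
logits = {task: weights @ shared_repr for task, weights in heads.items()}
```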
|
## Using with Transformers

(1) First, download the model from Hugging Face using the following commands:

```bash
cd ~
mkdir pretrained_model
cd pretrained_model
git lfs install
git clone https://huggingface.co/scutcyr/dstc11-simmc2.1-iflytek
```
|
(2) Then clone our code using the following commands:

```bash
cd ~
git clone https://github.com/scutcyr/dstc11-simmc2.1-iflytek.git
```
|
(3) Follow the [README](https://github.com/scutcyr/dstc11-simmc2.1-iflytek#readme) to use the model.
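After the download in step (1), the checkpoint can in principle be loaded with the standard Transformers BART classes, as sketched below. Note this is an assumption: the repository's own model classes may be needed to use the task-specific heads, so treat this as a starting point rather than the supported loading path.

```python
import os
from pathlib import Path

# Local path created by the `git clone` in step (1)
CHECKPOINT = Path.home() / "pretrained_model" / "dstc11-simmc2.1-iflytek"

def load_model(checkpoint=CHECKPOINT):
    """Load the tokenizer and a plain BART model from a local checkpoint
    directory. The repository's multi-task heads are not restored here."""
    from transformers import BartForConditionalGeneration, BartTokenizerFast
    tokenizer = BartTokenizerFast.from_pretrained(str(checkpoint))
    model = BartForConditionalGeneration.from_pretrained(str(checkpoint))
    return tokenizer, model

# Only attempt to load when the checkpoint has actually been downloaded
if os.path.isdir(CHECKPOINT):
    tokenizer, model = load_model()
```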
|