Collecting and annotating task-oriented dialogues is time-consuming and costly. Thus, few-shot learning for dialogue tasks presents an exciting opportunity. In this work, we propose an in-context (IC) learning framework for few-shot dialogue state tracking (DST), where a large pre-trained language model (LM) takes a test instance and a few annotated examples as input, and directly decodes the dialogue states without any parameter updates. This makes the LM more flexible and scalable compared to prior few-shot DST work when adapting to new domains and scenarios. We study ways to formulate dialogue context into prompts for LMs and propose an efficient approach to retrieve dialogues as exemplars given a test instance and a selection pool of few-shot examples. To better leverage the pre-trained LMs, we also reformulate DST into a text-to-SQL problem. Empirical results on MultiWOZ 2.1 and 2.4 show that our method IC-DST outperforms previous fine-tuned state-of-the-art models in few-shot settings.

To address the above challenges, we propose the IC-DST model to solve the DST problem with the in-context learning paradigm (Brown et al., 2020a), in which a large language model makes predictions based on retrieved exemplars from a small set of labeled training data. A key motivation behind this framework is that it requires no finetuning (i.e., no parameter updates), which makes in-context learning models flexible and scalable in that they can handle queries in a new domain via the exemplar retrieval process without re-training. This enables developers to quickly prototype DST systems in new domains and rapidly leverage newly collected data. Moreover, compared to traditional few-shot finetuning approaches, in-context learning allows us to rapidly control the behavior of the DST system and correct its errors by simply updating the in-context examples, again without re-training. This approach has proven to be successful in semantic parsing (Pasupat et al., 2021), especially in few-shot scenarios.
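To make the framework above concrete, here is a minimal sketch of one IC-DST-style prediction step under stated assumptions: `embed` stands in for any sentence encoder, `lm_complete` for any completion-style LM API, and the `hotel` schema and prompt layout are illustrative placeholders, not the paper's exact retriever or prompt format.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two 1-D embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve_exemplars(test_context, pool, embed, k=5):
    """Rank the labeled selection pool by embedding similarity to the
    test dialogue context and keep the top-k turns as exemplars."""
    query = embed(test_context)
    ranked = sorted(pool,
                    key=lambda ex: cosine(embed(ex["context"]), query),
                    reverse=True)
    return ranked[:k]

def build_prompt(test_context, exemplars, schema_sql):
    """Lay out a text-to-SQL prompt: table schema, then retrieved
    (dialogue, SQL) exemplar pairs, then the unlabeled test turn."""
    parts = [schema_sql]
    for ex in exemplars:
        parts.append(f"-- Dialogue: {ex['context']}\n{ex['sql']}")
    parts.append(f"-- Dialogue: {test_context}\nSELECT * FROM hotel WHERE")
    return "\n\n".join(parts)

def predict_state(test_context, pool, embed, lm_complete, schema_sql, k=5):
    """One in-context DST step: retrieve exemplars, prompt the LM once,
    and read the completed WHERE clause back as slot-value constraints.
    No model parameters are updated anywhere in this loop."""
    exemplars = retrieve_exemplars(test_context, pool, embed, k)
    completion = lm_complete(build_prompt(test_context, exemplars, schema_sql))
    # e.g. completion == ' area = "centre" AND stars = 4;' (hypothetical)
    return completion.strip().rstrip(";")
```

Note how the design choice shows up in the code: the labeled pool is consulted only at retrieval time, so correcting a systematic error or supporting a new domain amounts to editing or extending the pool; no gradient step is involved.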