[Paper Collection] Must-Read Papers on Natural Language Understanding

Published: 2020-08-29

Natural-language-understanding-papers: a list of recent papers regarding natural language understanding (NLU) and spoken language understanding (SLU).

It covers sequence labelling, sentence classification, dialogue act classification, dialogue state tracking, and more. A review of NLU datasets for task-oriented dialogue is here.

• There is an implementation of joint training of slot filling and intent detection for SLU, which is evaluated on the ATIS and SNIPS datasets.

Bookmarks

• Variant networks for different semantic representations
• Robustness to ASR-error
• Zero-shot learning and domain adaptation
• Universal Language Representation
• Which may inspire us

1 Variant networks for different semantic representations

1.1 Domain-intent-slot

• Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding. Grégoire Mesnil, et al. TASLP, 2015. [Code+data]
• Attention-based recurrent neural network models for joint intent detection and slot filling. Bing Liu and Ian Lane. InterSpeech, 2016. [Code1] [Code2]
• Encoder-decoder with Focus-mechanism for Sequence Labelling Based Spoken Language Understanding. Su Zhu and Kai Yu. ICASSP, 2017. [Code]
• Neural Models for Sequence Chunking. Feifei Zhai, et al. AAAI, 2017.
• End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF. Xuezhe Ma and Eduard Hovy. ACL, 2016.
• A Bi-model based RNN Semantic Frame Parsing Model for Intent Detection and Slot Filling. Yu Wang, et al. NAACL, 2018.
• A Self-Attentive Model with Gate Mechanism for Spoken Language Understanding. Changliang Li, et al. EMNLP, 2018. [from Kingsoft AI Lab]
• Joint Slot Filling and Intent Detection via Capsule Neural Networks. Chenwei Zhang, et al. 2018.

• BERT for Joint Intent Classification and Slot Filling. Qian Chen, et al. 2019. [ongoing work]

1.2 Dialogue act (act-slot-value triples)

• Improving Slot Filling in Spoken Language Understanding with Joint Pointer and Attention. Lin Zhao and Zhe Feng. ACL, 2018.
• A Hierarchical Decoding Model for Spoken Language Understanding from Unaligned Data. Zijian Zhao, et al. ICASSP, 2019. [SJTU]

1.3 Hierarchical Representations

• Semantic Parsing for Task Oriented Dialog using Hierarchical Representations. Sonal Gupta, et al. EMNLP, 2018. [from Facebook AI Research]

2 Robustness to ASR-error

• Discriminative spoken language understanding using word confusion networks. Matthew Henderson, et al. SLT, 2012. [Data]
• Using word confusion networks for slot filling in spoken language understanding. Xiaohao Yang and Jia Liu. Interspeech, 2015.
• Joint Online Spoken Language Understanding and Language Modeling with Recurrent Neural Networks. Bing Liu and Ian Lane. SIGDIAL, 2016. [Code]
• Robust Spoken Language Understanding with unsupervised ASR-error adaptation. Su Zhu, et al. ICASSP, 2018.
• Neural Confnet Classification: Fully Neural Network Based Spoken Utterance Classification Using Word Confusion Networks. Ryo Masumura, et al. ICASSP, 2018.
• From Audio to Semantics: Approaches to end-to-end spoken language understanding. Parisa Haghani, et al. SLT, 2018. [Google]

3 Zero-shot learning and domain adaptation

3.1 Zero-shot learning

• A model of zero-shot learning of spoken language understanding. Majid Yazdani and James Henderson. EMNLP, 2015.
• Zero-shot Learning Of Intent Embeddings For Expansion By Convolutional Deep Structured Semantic Models. Yun-Nung Chen, et al. ICASSP, 2016.
• Online Adaptative Zero-shot Learning Spoken Language Understanding Using Word-embedding. Emmanuel Ferreira, et al. ICASSP, 2015.
• Label Embedding for Zero-shot Fine-grained Named Entity Typing. Yukun Ma, et al. COLING, 2016.
• Towards Zero-Shot Frame Semantic Parsing for Domain Scaling. Ankur Bapna, et al. Interspeech, 2017.
• Concept Transfer Learning for Adaptive Language Understanding. Su Zhu and Kai Yu. SIGDIAL, 2018.

• An End-to-end Approach for Handling Unknown Slot Values in Dialogue State Tracking. Puyang Xu and Qi Hu. ACL, 2018.
• Large-Scale Multi-Domain Belief Tracking with Knowledge Sharing. Osman Ramadan, et al. ACL, 2018. [Data]
• Zero-Shot Adaptive Transfer for Conversational Language Understanding. Sungjin Lee, et al. arXiv, 2018. [Microsoft]

3.2 Few-shot learning

• Few-shot classification in Named Entity Recognition Task. Alexander Fritzler, et al. SAC, 2019.
• Few-Shot Text Classification with Induction Network. Ruiying Geng, et al. arXiv, 2019.

3.3 Domain adaptation

• Domain Attention with an Ensemble of Experts. Young-Bum Kim, et al. ACL, 2017.
• Adversarial Adaptation of Synthetic or Stale Data. Young-Bum Kim, et al. ACL, 2017.
• Fast and Scalable Expansion of Natural Language Understanding Functionality for Intelligent Agents. Anuj Goyal, et al. NAACL, 2018. [from Amazon Alexa Machine Learning]
• Bag of Experts Architectures for Model Reuse in Conversational Language Understanding. Rahul Jha, et al. NAACL, 2018. [from Microsoft Corporation]

4 Universal Language Representation

• Deep contextualized word representations. Matthew E. Peters, et al. NAACL, 2018. [ELMo]
• BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Jacob Devlin, et al. NAACL, 2019. [from Google AI Language]
• XLNet: Generalized Autoregressive Pretraining for Language Understanding. Zhilin Yang, et al. arXiv, 2019. [CMU & Google Brain]

5 Which may inspire us

• Jointly Predicting Predicates and Arguments in Neural Semantic Role Labeling. Luheng He, et al. ACL, 2018. [Code]
• Sentence-State LSTM for Text Representation. Yue Zhang, et al. ACL, 2018. [Code]
• Chinese NER Using Lattice LSTM. Yue Zhang, et al. ACL, 2018. [Code+data]
• SoPa: Bridging CNNs, RNNs, and Weighted Finite-State Machines. Roy Schwartz, et al. ACL, 2018. [Code]
• Coarse-to-Fine Decoding for Neural Semantic Parsing. Li Dong and Mirella Lapata. ACL, 2018. [Code]

• Generalize Symbolic Knowledge With Neural Rule Engine. Shen Li, Hengru Xu, Zhengdong Lu. arXiv, 2018. [from Deeplycurious.ai]
• Dual Supervised Learning for Natural Language Understanding and Generation. Shang-Yu Su, et al. ACL, 2019.
• Neural Finite State Transducers: Beyond Rational Relations. Chu-Cheng Lin, et al. NAACL, 2019.
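As background for the domain-intent-slot papers in Section 1: in ATIS/SNIPS-style SLU, each utterance gets one intent label, and each token gets a BIO slot tag. A minimal sketch of that representation is below; the keyword rules, slot names, and `tag_utterance` function are hypothetical, for illustration only — the papers above learn this mapping with neural models rather than rules.

```python
# Toy illustration of the domain-intent-slot output format used in
# ATIS/SNIPS-style SLU: one intent per utterance, one BIO slot tag per
# token. The rule-based "tagger" here is purely illustrative.

def tag_utterance(tokens):
    """Return (intent, slot_tags) for a toy flight-booking domain."""
    cities = {"boston", "denver", "seattle"}
    # Intent classification: a single label for the whole utterance.
    intent = "flight_search" if any(t.startswith("flight") for t in tokens) else "unknown"
    slots = []
    for i, tok in enumerate(tokens):
        if tok in cities:
            # B- marks the first token of a slot value; a multi-token
            # value would continue with I- tags.
            role = "fromloc" if i > 0 and tokens[i - 1] == "from" else "toloc"
            slots.append(f"B-{role}.city_name")
        else:
            slots.append("O")  # token carries no slot value
    return intent, slots

tokens = "show flights from boston to denver".split()
intent, slots = tag_utterance(tokens)
# intent == "flight_search"
# slots  == ["O", "O", "O", "B-fromloc.city_name", "O", "B-toloc.city_name"]
```

The joint models listed in Section 1.1 predict both outputs from a shared encoder, since the intent and the slot values of an utterance are strongly correlated.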

