【Github】nlp-paper: A Comprehensive List of NLP Papers Organized by Topic

Project link (click "Read the original" to go there directly):

https://github.com/changwookjun/nlp-paper


A quick look suggests the project's author, changwookjun, is Korean. The project organizes NLP papers into detailed, topic-based lists covering the BERT series, the Transformer series, transfer learning, text summarization, sentiment analysis, question answering, machine translation, text generation, and more, plus NLP subtasks such as word segmentation, named entity recognition, syntactic parsing, and word sense disambiguation. It is quite comprehensive, and interested readers may want to follow it. The content below comes from the project's README; click "Read the original" to reach the corresponding paper links directly.



NLP Paper

natural language processing paper list

Contents

  • Bert Series

  • Transformer Series

  • Transfer Learning

  • Text Summarization

  • Sentiment Analysis

  • Question Answering

  • Machine Translation

  • Survey paper

  • Downstream task

    • QA MC Dialogue

    • Slot filling

    • Analysis

    • Word segmentation parsing NER

    • Pronoun coreference resolution

    • Word sense disambiguation

    • Sentiment analysis

    • Relation extraction

    • Knowledge base

    • Text classification

    • WSC WNLI NLI

    • Commonsense

    • Extractive summarization

    • IR

  • Generation

  • Quality evaluator

  • Modification (multi-task, masking strategy, etc.)

  • Probe

  • Multi-lingual

  • Other than English models

  • Domain specific

  • Multi-modal

  • Model compression

  • Misc

Bert Series

  • BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (NAACL 2019)

  • ERNIE 2.0: A Continual Pre-training Framework for Language Understanding (arXiv 2019)

  • StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding (arXiv 2019)

  • RoBERTa: A Robustly Optimized BERT Pretraining Approach (arXiv 2019)

  • ALBERT: A Lite BERT for Self-supervised Learning of Language Representations (arXiv 2019)

  • Multi-Task Deep Neural Networks for Natural Language Understanding (arXiv 2019)

  • What does BERT learn about the structure of language? (ACL2019)

  • Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned (ACL2019) [github]

  • Open Sesame: Getting Inside BERT's Linguistic Knowledge (ACL2019 WS)

  • Analyzing the Structure of Attention in a Transformer Language Model (ACL2019 WS)

  • What Does BERT Look At? An Analysis of BERT's Attention (ACL2019 WS)

  • Do Attention Heads in BERT Track Syntactic Dependencies?

  • Blackbox meets blackbox: Representational Similarity and Stability Analysis of Neural Language Models and Brains (ACL2019 WS)

  • Inducing Syntactic Trees from BERT Representations (ACL2019 WS)

  • A Multiscale Visualization of Attention in the Transformer Model (ACL2019 Demo)

  • Visualizing and Measuring the Geometry of BERT

  • How Contextual are Contextualized Word Representations? Comparing the Geometry of BERT, ELMo, and GPT-2 Embeddings (EMNLP2019)

  • Are Sixteen Heads Really Better than One? (NeurIPS2019)

  • On the Validity of Self-Attention as Explanation in Transformer Models

  • Visualizing and Understanding the Effectiveness of BERT (EMNLP2019)

  • Attention Interpretability Across NLP Tasks

  • Revealing the Dark Secrets of BERT (EMNLP2019)

  • Investigating BERT's Knowledge of Language: Five Analysis Methods with NPIs (EMNLP2019)

  • The Bottom-up Evolution of Representations in the Transformer: A Study with Machine Translation and Language Modeling Objectives (EMNLP2019)

  • A Primer in BERTology: What we know about how BERT works

  • Do NLP Models Know Numbers? Probing Numeracy in Embeddings (EMNLP2019)

  • How Does BERT Answer Questions? A Layer-Wise Analysis of Transformer Representations (CIKM2019)

  • Whatcha lookin' at? DeepLIFTing BERT's Attention in Question Answering

  • What does BERT Learn from Multiple-Choice Reading Comprehension Datasets?

  • Calibration of Pre-trained Transformers

  • exBERT: A Visual Analysis Tool to Explore Learned Representations in Transformers Models [github]
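
As a quick orientation to the Bert Series above (not part of the original list): BERT-style models are pre-trained with a masked language modeling objective, predicting tokens hidden behind a [MASK] placeholder from their bidirectional context. A minimal sketch, assuming the Hugging Face transformers package is installed; the prompt sentence is made up for illustration:

```python
# Minimal sketch of BERT's masked language modeling behavior (illustrative only).
# Assumption: the Hugging Face "transformers" package is installed.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill_mask("The goal of NLP is to [MASK] human language."):
    # Each prediction carries the filled-in token and its softmax score.
    print(prediction["token_str"], round(prediction["score"], 3))
```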

Transformer Series

  • Attention Is All You Need (arXiv 2017)

  • Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context (arXiv 2019)

  • Universal Transformers (ICLR 2019)

  • Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer (arXiv 2019)

  • Reformer: The Efficient Transformer (ICLR 2020)

  • Adaptive Attention Span in Transformers (ACL2019)

  • Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context (ACL2019) [github]

  • Generating Long Sequences with Sparse Transformers

  • Adaptively Sparse Transformers (EMNLP2019)

  • Compressive Transformers for Long-Range Sequence Modelling

  • The Evolved Transformer (ICML2019)

  • Reformer: The Efficient Transformer (ICLR2020) [github]

  • GRET: Global Representation Enhanced Transformer (AAAI2020)

  • Transformer on a Diet [github]

  • Efficient Content-Based Sparse Attention with Routing Transformers

  • BP-Transformer: Modelling Long-Range Context via Binary Partitioning

  • Recipes for building an open-domain chatbot

  • Longformer: The Long-Document Transformer
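
A side note not found in the original list: the papers in this Transformer Series build on the scaled dot-product attention of "Attention Is All You Need", Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal NumPy sketch with made-up toy shapes:

```python
# Minimal NumPy sketch of scaled dot-product attention (illustrative only).
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)        # (batch, seq_q, seq_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)           # softmax over the key positions
    return weights @ V                                        # (batch, seq_q, d_v)

# Toy example: batch of 1, sequence length 4, model dimension 8.
rng = np.random.default_rng(0)
Q = rng.normal(size=(1, 4, 8))
K = rng.normal(size=(1, 4, 8))
V = rng.normal(size=(1, 4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (1, 4, 8)
```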

Transfer Learning

  • Deep contextualized word representations (NAACL 2018)

  • Universal Language Model Fine-tuning for Text Classification (ACL 2018)

  • Improving Language Understanding by Generative Pre-Training (Alec Radford)

  • BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (NAACL 2019)

  • Cloze-driven Pretraining of Self-attention Networks (arXiv 2019)

  • Unified Language Model Pre-training for Natural Language Understanding and Generation (arXiv 2019)

  • MASS: Masked Sequence to Sequence Pre-training for Language Generation (ICML 2019)
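
These Transfer Learning papers share the same recipe: pre-train a language model on unlabeled text, then fine-tune it on a labeled downstream task. A minimal sketch of the fine-tuning half, assuming the Hugging Face transformers and PyTorch packages; the model name and the two-example batch are illustrative only:

```python
# Minimal sketch of the pre-train / fine-tune recipe (illustrative, not from the repository).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=2,  # new classification head on top of the pre-trained encoder
)

batch = tokenizer(["great movie", "terrible movie"], padding=True, return_tensors="pt")
labels = torch.tensor([1, 0])

outputs = model(**batch, labels=labels)  # forward pass returns the classification loss
outputs.loss.backward()                  # one fine-tuning gradient step (optimizer omitted)
```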

Text Summarization

  • Positional Encoding to Control Output Sequence Length - Sho Takase(2019)

  • Fine-tune BERT for Extractive Summarization - Yang Liu(2019)

  • Language Models are Unsupervised Multitask Learners - Alec Radford(2019)

  • A Unified Model for Extractive and Abstractive Summarization using Inconsistency Loss - Wan-Ting Hsu(2018)

  • A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents - Arman Cohan(2018)

  • Generating Wikipedia by Summarizing Long Sequences - Peter J. Liu(2018)

  • Get To The Point: Summarization with Pointer-Generator Networks - Abigail See(2017)

  • A Neural Attention Model for Sentence Summarization - Alexander M. Rush(2015)

Sentiment Analysis

  • Multi-Task Deep Neural Networks for Natural Language Understanding - Xiaodong Liu(2019)

  • Aspect-level Sentiment Analysis using AS-Capsules - Yequan Wang(2019)

  • On the Role of Text Preprocessing in Neural Network Architectures: An Evaluation Study on Text Categorization and Sentiment Analysis - Jose Camacho-Collados(2018)

  • Learned in Translation: Contextualized Word Vectors - Bryan McCann(2018)

  • Universal Language Model Fine-tuning for Text Classification - Jeremy Howard(2018)

  • Convolutional Neural Networks with Recurrent Neural Filters - Yi Yang(2018)

  • Information Aggregation via Dynamic Routing for Sequence Encoding - Jingjing Gong(2018)

  • Learning to Generate Reviews and Discovering Sentiment - Alec Radford(2017)

  • A Structured Self-attentive Sentence Embedding - Zhouhan Lin(2017)

Question Answering

  • Language Models are Unsupervised Multitask Learners - Alec Radford(2019)

  • Improving Language Understanding by Generative Pre-Training - Alec Radford(2018)

  • Bidirectional Attention Flow for Machine Comprehension - Minjoon Seo(2018)

  • Reinforced Mnemonic Reader for Machine Reading Comprehension - Minghao Hu(2017)

  • Neural Variational Inference for Text Processing - Yishu Miao(2015)

Machine Translation

  • The Evolved Transformer - David R. So(2019)

Survey paper

  • Evolution of transfer learning in natural language processing

  • Pre-trained Models for Natural Language Processing: A Survey

  • A Survey on Contextual Embeddings

Downstream task

QA MC Dialogue

  • A BERT Baseline for the Natural Questions

  • MultiQA: An Empirical Investigation of Generalization and Transfer in Reading Comprehension (ACL2019)

  • Unsupervised Domain Adaptation on Reading Comprehension

  • BERTQA -- Attention on Steroids

  • A Multi-Type Multi-Span Network for Reading Comprehension that Requires Discrete Reasoning (EMNLP2019)

  • SDNet: Contextualized Attention-based Deep Network for Conversational Question Answering

  • Multi-hop Question Answering via Reasoning Chains

  • Select, Answer and Explain: Interpretable Multi-hop Reading Comprehension over Multiple Documents

  • Multi-step Entity-centric Information Retrieval for Multi-Hop Question Answering (EMNLP2019 WS)

  • End-to-End Open-Domain Question Answering with BERTserini (NAACL2019)

  • Latent Retrieval for Weakly Supervised Open Domain Question Answering (ACL2019)

  • Multi-passage BERT: A Globally Normalized BERT Model for Open-domain Question Answering (EMNLP2019)

  • Learning to Retrieve Reasoning Paths over Wikipedia Graph for Question Answering (ICLR2020)

  • Learning to Ask Unanswerable Questions for Machine Reading Comprehension (ACL2019)

  • Unsupervised Question Answering by Cloze Translation (ACL2019)

  • Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation

  • A Recurrent BERT-based Model for Question Generation (EMNLP2019 WS)

  • Learning to Answer by Learning to Ask: Getting the Best of GPT-2 and BERT Worlds

  • Enhancing Pre-Trained Language Representations with Rich Knowledge for Machine Reading Comprehension (ACL2019)

  • Incorporating Relation Knowledge into Commonsense Reading Comprehension with Multi-task Learning (CIKM2019)

  • SG-Net: Syntax-Guided Machine Reading Comprehension

  • MMM: Multi-stage Multi-task Learning for Multi-choice Reading Comprehension

  • Cosmos QA: Machine Reading Comprehension with Contextual Commonsense Reasoning (EMNLP2019)

  • ReClor: A Reading Comprehension Dataset Requiring Logical Reasoning (ICLR2020)

  • Robust Reading Comprehension with Linguistic Constraints via Posterior Regularization

  • BAS: An Answer Selection Method Using BERT Language Model

  • Beat the AI: Investigating Adversarial Human Annotations for Reading Comprehension

  • A Simple but Effective Method to Incorporate Multi-turn Context with BERT for Conversational Machine Comprehension (ACL2019 WS)

  • FlowDelta: Modeling Flow Information Gain in Reasoning for Conversational Machine Comprehension (ACL2019 WS)

  • BERT with History Answer Embedding for Conversational Question Answering (SIGIR2019)

  • GraphFlow: Exploiting Conversation Flow with Graph Neural Networks for Conversational Machine Comprehension (ICML2019 WS)

  • Beyond English-only Reading Comprehension: Experiments in Zero-Shot Multilingual Transfer for Bulgarian (RANLP2019)

  • XQA: A Cross-lingual Open-domain Question Answering Dataset (ACL2019)

  • Cross-Lingual Machine Reading Comprehension (EMNLP2019)

  • Zero-shot Reading Comprehension by Cross-lingual Transfer Learning with Multi-lingual Language Representation Model

  • Multilingual Question Answering from Formatted Text applied to Conversational Agents

  • BiPaR: A Bilingual Parallel Dataset for Multilingual and Cross-lingual Reading Comprehension on Novels (EMNLP2019)

  • MLQA: Evaluating Cross-lingual Extractive Question Answering

  • Investigating Prior Knowledge for Challenging Chinese Machine Reading Comprehension (TACL)

  • SberQuAD - Russian Reading Comprehension Dataset: Description and Analysis

  • Giving BERT a Calculator: Finding Operations and Arguments with Reading Comprehension (EMNLP2019)

  • BERT-DST: Scalable End-to-End Dialogue State Tracking with Bidirectional Encoder Representations from Transformer (Interspeech2019)

  • Dialog State Tracking: A Neural Reading Comprehension Approach

  • A Simple but Effective BERT Model for Dialog State Tracking on Resource-Limited Systems (ICASSP2020)

  • Fine-Tuning BERT for Schema-Guided Zero-Shot Dialogue State Tracking

  • Goal-Oriented Multi-Task BERT-Based Dialogue State Tracker

  • Domain Adaptive Training BERT for Response Selection

  • BERT Goes to Law School: Quantifying the Competitive Advantage of Access to Large Legal Corpora in Contract Understanding

Slot filling

  • BERT for Joint Intent Classification and Slot Filling

  • Multi-lingual Intent Detection and Slot Filling in a Joint BERT-based Model

  • A Comparison of Deep Learning Methods for Language Understanding (Interspeech2019)


......

Author

ChangWookJun / @changwookjun (changwookjun@gmail.com)




About AINLP

AINLP is a fun, AI-focused natural language processing community dedicated to sharing technologies such as AI, NLP, machine learning, deep learning, and recommendation algorithms. Topics include text summarization, question answering, chatbots, machine translation, text generation, knowledge graphs, pre-trained models, recommender systems, computational advertising, job postings, and job-hunting experiences. You are welcome to follow! To join the technical discussion group, add AINLPer (id: ainlper) and note your field of work/research and your reason for joining.

