Mind Map: NLP

1. Topic Modelling

1.1. Labelling of Topics

1.1.1. Automatic Labelling of Topic Models using Word Vectors and Letter Trigram Vectors

1.2. Supervised Topic Modelling

1.2.1. Spectral Learning for Supervised Topic Models

1.2.2. Incorporating Lexical Priors into Topic Models

1.3. Relationship between Topics

1.3.1. The Nested Chinese Restaurant Process and Bayesian Nonparametric Inference of Topic Hierarchies

1.3.2. Correlated Topic Models

1.4. Probabilistic Topic Models

1.5. Gaussian LDA for Topic Models with Word Embeddings
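
To make the topic-modelling branch above concrete, here is a minimal sketch of fitting a plain LDA model (the "Probabilistic Topic Models" line) with gensim. The toy corpus, the choice of gensim, and the hyperparameter values are illustrative assumptions, not taken from any of the papers listed.

```python
# Minimal LDA sketch (toy corpus and hyperparameters are assumptions for illustration).
from gensim.corpora import Dictionary
from gensim.models import LdaModel

docs = [
    ["topic", "model", "latent", "dirichlet", "allocation"],
    ["nested", "topic", "hierarchy", "bayesian", "nonparametric"],
    ["word", "vector", "embedding", "trigram", "label"],
]

dictionary = Dictionary(docs)                   # token -> integer id
corpus = [dictionary.doc2bow(d) for d in docs]  # bag-of-words counts per document

lda = LdaModel(corpus=corpus, id2word=dictionary,
               num_topics=2, passes=20, random_state=0)

# Inspect the top words of each inferred topic.
for topic_id, terms in lda.show_topics(num_topics=2, num_words=5, formatted=False):
    print(topic_id, [term for term, _ in terms])
```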

2. Language Modelling

2.1. Embeddings

2.1.1. Word Embeddings: Explaining their properties

2.1.2. Order-Embeddings of Images and Language

2.1.3. Ultradense Word Embeddings by Orthogonal Transformation

2.1.4. Bilingual Word Representations with Monolingual Quality in Mind

2.1.5. How to Generate a Good Word Embedding?

2.1.6. Swivel: Improving Embeddings by Noticing What’s Missing

2.1.7. Improving Distributional Similarity with Lessons Learned from Word Embeddings

2.2. Exploring the Limits of Language Modeling
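
As a rough companion to the embeddings branch above, this is a minimal sketch of training skip-gram word embeddings with gensim's Word2Vec. The tiny corpus and the hyperparameter settings are assumptions made for illustration; none of the listed papers prescribe them, and results on a corpus this small are not meaningful.

```python
# Minimal skip-gram word2vec sketch (toy corpus and settings are assumptions).
from gensim.models import Word2Vec

sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["a", "king", "and", "a", "queen", "ruled", "the", "land"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # embedding dimensionality
    window=3,        # context window size
    min_count=1,     # keep every token in this toy corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
    seed=0,
)

# Query the learned vectors for nearest neighbours of a word.
print(model.wv.most_similar("cat", topn=3))
```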

3. Named Entity Recognition

3.1. Building a Fine-Grained Entity Typing System Overnight for a New X (X = Language, Domain, Genre)
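
For contrast with the fine-grained typing paper above, here is a minimal sketch of coarse-grained named entity recognition using spaCy's pretrained pipeline; it assumes the `en_core_web_sm` model is installed and is not an implementation of the paper's approach.

```python
# Minimal coarse-grained NER sketch with spaCy
# (assumes: python -m spacy download en_core_web_sm has been run).
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Barack Obama was born in Hawaii and studied at Columbia University.")

# Print each recognised entity span with its coarse type label.
for ent in doc.ents:
    print(ent.text, ent.label_)
```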

4. Miscellaneous

4.1. Contextual LSTM (CLSTM) models for Large scale NLP tasks

4.2. A Case Study of Pointwise Mutual Information
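
Since branch 4.2 concerns pointwise mutual information, here is a small self-contained sketch that computes PMI from co-occurrence counts; the toy sentences and the sentence-window co-occurrence definition are assumptions made for illustration.

```python
# Minimal PMI sketch: PMI(w1, w2) = log2( P(w1, w2) / (P(w1) * P(w2)) ),
# with co-occurrence counted at the sentence level (an illustrative assumption).
import math
from collections import Counter
from itertools import combinations

sentences = [
    ["new", "york", "city", "is", "large"],
    ["new", "york", "has", "many", "people"],
    ["the", "city", "has", "many", "parks"],
]

word_counts = Counter()
pair_counts = Counter()
for sent in sentences:
    word_counts.update(sent)
    # Count each unordered word pair once per sentence.
    pair_counts.update(frozenset(p) for p in combinations(set(sent), 2))

total_words = sum(word_counts.values())
total_pairs = sum(pair_counts.values())

def pmi(w1, w2):
    """PMI of two words that co-occur at least once in the corpus."""
    p_joint = pair_counts[frozenset((w1, w2))] / total_pairs
    p_w1 = word_counts[w1] / total_words
    p_w2 = word_counts[w2] / total_words
    return math.log2(p_joint / (p_w1 * p_w2))

print(pmi("new", "york"))   # strongly associated pair -> higher PMI
print(pmi("city", "many"))  # weaker association -> lower PMI
```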

5. Neural Machine Translation

5.1. A Character-level Decoder without Explicit Segmentation for Neural Machine Translation

6. Sentence Modelling

6.1. A Fast Unified Model for Parsing and Sentence Understanding

7. Terminology Extraction

7.1. Data-driven identification of fixed expressions and their modifiability

8. Document Classification

8.1. Bottom-up Classification (Clustering)

8.1.1. Dimensionality Reduction for Spectral Clustering

8.2. Top-down Classification (Ontology-based Supervised Classification)
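
To ground the bottom-up classification branch (8.1), here is a minimal sketch of clustering documents with TF-IDF features, SVD-based dimensionality reduction, and spectral clustering via scikit-learn; the example documents, the number of components, and the number of clusters are illustrative assumptions rather than the method of the cited paper.

```python
# Minimal bottom-up clustering sketch: TF-IDF -> truncated SVD -> spectral clustering
# (documents and parameter choices are assumptions for illustration).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.cluster import SpectralClustering

documents = [
    "latent dirichlet allocation is a probabilistic topic model",
    "correlated topic models capture dependencies between topics",
    "gaussian lda combines topic models with word embeddings",
    "neural machine translation uses encoder decoder networks",
    "a character level decoder avoids explicit segmentation in translation",
    "attention improves neural machine translation quality",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(documents)
reduced = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

labels = SpectralClustering(n_clusters=2, random_state=0).fit_predict(reduced)
print(labels)  # e.g. two groups: topic-modelling docs vs. translation docs
```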

9. Machine Comprehension

9.1. Toward the Machine Comprehension of Text: An Essay