Natural Language Processing (NLP)
Distributional Hypothesis
Context Windows
Co-occurrence Statistics
Latent Semantic Analysis
Non-Negative Matrix Factorization
Pointwise Mutual Information
Continuous Bag-of-Words
Skip-gram Model
Hierarchical Softmax
Negative Sampling
Global Matrix Factorization
Local Context Windows
Subword Information
Character N-grams
Out-of-Vocabulary Handling
Word Similarity Tasks
Analogy Tasks
Clustering Quality
Downstream Task Performance
Transfer Learning Assessment
Contextualized Embeddings
Multilingual Embeddings
Domain-Specific Embeddings
Temporal Embeddings