1. Foundations of Supervised Learning
2. The Supervised Learning Workflow
3. Linear Models
4. Tree-Based Models
5. Instance-Based Learning
6. Support Vector Machines
7. Probabilistic Models
8. Model Evaluation and Validation
9. Advanced Topics in Supervised Learning
10. Practical Implementation Considerations
Linear Models
Linear Regression
Simple Linear Regression
Mathematical Foundation
Equation of a Line
Interpretation of Coefficients
Slope Interpretation
Intercept Interpretation
Cost Function
Mean Squared Error Derivation
Geometric Interpretation
Parameter Estimation
Gradient Descent for Linear Regression
Analytical Solution
Normal Equation
Least Squares Method
Model Assumptions
Linearity
Independence of Errors
Homoscedasticity
Normality of Residuals
Residual Analysis
Residual Plots
Diagnostic Tests
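The simple linear regression topics above (equation of a line, MSE cost, least-squares estimation, residuals) can be sketched in a few lines of NumPy; the data and true coefficients below are invented for illustration:

```python
import numpy as np

# Hypothetical data: y follows the line y = 2x + 1 plus small noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, size=x.size)

# Closed-form least-squares estimates:
# slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x)
slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
intercept = y.mean() - slope * x.mean()

# Residuals and the mean squared error of the fitted line
residuals = y - (slope * x + intercept)
mse = np.mean(residuals ** 2)
```

Plotting `residuals` against `x` (or against the fitted values) is the basic residual-plot diagnostic: under the model assumptions the points should scatter evenly around zero with no visible pattern.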
Multiple Linear Regression
The Model Equation
Matrix Formulation
Vector Notation
Interpretation of Coefficients
Partial Effects
Holding Other Variables Constant
Extended Assumptions
No Perfect Multicollinearity
Sufficient Sample Size
Multicollinearity
Detection Methods
Variance Inflation Factor
Treatment Strategies
Model Diagnostics
Residual Analysis
Leverage and Influence
Cook's Distance
Feature Selection in Linear Regression
Forward Selection
Backward Elimination
Stepwise Selection
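A minimal sketch of the multiple-regression topics above: the matrix formulation solved via the normal equation, plus a variance inflation factor to detect multicollinearity. The data-generating coefficients and the correlation between predictors are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Hypothetical predictors; x2 is deliberately correlated with x1
# so that the VIF is informative.
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)
y = 3.0 + 1.5 * x1 - 2.0 * x2 + rng.normal(0, 0.1, size=n)

# Design matrix with an intercept column
X = np.column_stack([np.ones(n), x1, x2])

# Normal equation: beta = (X^T X)^{-1} X^T y
# (solve the linear system rather than inverting explicitly)
beta = np.linalg.solve(X.T @ X, X.T @ y)

# VIF for x1: regress x1 on the remaining predictors,
# then VIF = 1 / (1 - R^2).
Z = np.column_stack([np.ones(n), x2])
g = np.linalg.solve(Z.T @ Z, Z.T @ x1)
resid = x1 - Z @ g
r2 = 1 - (resid @ resid) / ((x1 - x1.mean()) @ (x1 - x1.mean()))
vif_x1 = 1 / (1 - r2)
```

A common rule of thumb treats a VIF above 5 (or 10) as a sign of problematic multicollinearity; here the built-in correlation yields a moderate VIF near 3.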
Polynomial Regression
Concept and Motivation
Fitting Non-linear Relationships
Degree of the Polynomial
Choosing Appropriate Degree
Bias-Variance Tradeoff
Feature Expansion
Polynomial Feature Generation
Interaction Terms
Risk of Overfitting
High-degree Polynomials
Validation Strategies
Regularization in Polynomial Regression
Ridge Regression Application
Lasso Regression Application
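The polynomial-regression ideas above (feature expansion, choice of degree, overfitting risk) can be illustrated with a small NumPy sketch; the cubic ground truth and noise level are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-1, 1, 40)
# Hypothetical cubic relationship with noise
y = 1.0 - 2.0 * x + 0.5 * x**3 + rng.normal(0, 0.05, size=x.size)

def poly_fit(x, y, degree):
    """Least-squares fit after expanding x into polynomial features."""
    X = np.vander(x, degree + 1, increasing=True)  # columns 1, x, x^2, ...
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def poly_mse(x, y, coef):
    X = np.vander(x, len(coef), increasing=True)
    return np.mean((y - X @ coef) ** 2)

coef3 = poly_fit(x, y, 3)      # matches the true degree
coef15 = poly_fit(x, y, 15)    # high degree: chases the noise

mse3 = poly_mse(x, y, coef3)
mse15 = poly_mse(x, y, coef15)
```

Training error can only decrease as the degree grows (the degree-3 model is nested inside the degree-15 one), which is exactly why degree selection must rely on held-out validation error rather than training error.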
Regularized Linear Models
Ridge Regression
L2 Regularization Concept
Ridge Penalty Term
Effect on Coefficients
Choosing Regularization Parameter
Geometric Interpretation
Lasso Regression
L1 Regularization Concept
Lasso Penalty Term
Feature Selection Properties
Sparsity Induction
Coordinate Descent Algorithm
Elastic Net
Combination of L1 and L2 Penalties
Balancing Ridge and Lasso
When to Use Elastic Net
Parameter Selection
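Ridge regression, the first of the regularized models listed above, has a closed form that makes the shrinkage effect easy to see. A minimal sketch, assuming centered data with no intercept term; the coefficients and penalty strength are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 100, 5
X = rng.normal(size=(n, p))
true_beta = np.array([2.0, -1.0, 0.0, 0.0, 0.5])
y = X @ true_beta + rng.normal(0, 0.1, size=n)

def ridge(X, y, lam):
    """Closed-form ridge estimate: (X^T X + lam * I)^{-1} X^T y.
    No intercept term, for simplicity (predictors have mean ~0)."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

beta_ols = ridge(X, y, 0.0)     # lam = 0 recovers ordinary least squares
beta_ridge = ridge(X, y, 50.0)  # larger lam shrinks coefficients toward 0
```

Unlike the L2 penalty, the lasso's L1 penalty has no closed form and is typically fit by coordinate descent; its distinguishing property is that it can set coefficients exactly to zero, performing feature selection, while ridge only shrinks them.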
Logistic Regression
Binary Logistic Regression
The Sigmoid Function
Mathematical Definition
Properties and Shape
Logistic Model Formulation
Odds and Log-Odds
Decision Boundary
Maximum Likelihood Estimation
Log Loss
Binary Cross-Entropy Derivation
Gradient Computation
Multiclass Logistic Regression
Softmax Function
Mathematical Definition
Probability Interpretation
One-vs-Rest Approach
Multinomial Logistic Regression
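The softmax function at the heart of multinomial logistic regression can be sketched directly; the scores below are invented for illustration:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax: subtract the max before exponentiating,
    which leaves the output unchanged but avoids overflow."""
    z = z - np.max(z)
    e = np.exp(z)
    return e / e.sum()

# Hypothetical class scores (logits) for a 3-class problem
scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)
```

The outputs are positive and sum to one, so they can be read as class probabilities; the largest logit always maps to the largest probability.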
Regularization in Logistic Regression
L1 Regularized Logistic Regression
L2 Regularized Logistic Regression
Interpretation of Coefficients
Odds Ratios
Marginal Effects
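The binary logistic regression pipeline outlined above (sigmoid, log-odds, maximum likelihood via the log-loss gradient) fits naturally into a short gradient-descent sketch; the generating model and learning rate are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
x = rng.normal(size=(n, 1))
# Hypothetical generating model: log-odds = 1.0 + 2.0 * x
logits = 1.0 + 2.0 * x[:, 0]
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

X = np.column_stack([np.ones(n), x])  # intercept column + feature
w = np.zeros(2)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Gradient descent on the binary cross-entropy (log loss):
# gradient = X^T (sigmoid(Xw) - y) / n
lr = 0.5
for _ in range(2000):
    p = sigmoid(X @ w)
    w -= lr * X.T @ (p - y) / n

# Coefficient interpretation: exp(w[1]) is the odds ratio for a
# one-unit increase in x.
odds_ratio = np.exp(w[1])
```

The fitted weights should land near the generating values (1.0, 2.0) up to sampling error, and the odds ratio near exp(2) ≈ 7.4: each unit increase in x multiplies the odds of the positive class by that factor.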