Predictive Analytics
1. Foundations of Predictive Analytics
2. Data Foundation and Preparation
3. Regression Modeling
4. Classification Modeling
5. Ensemble Methods
6. Neural Networks and Deep Learning
7. Time Series Analysis and Forecasting
8. Unsupervised Learning
9. Model Evaluation and Validation
10. Model Interpretability and Explainability
11. Model Deployment and Production
12. Business Applications and Use Cases
13. Ethics and Responsible AI
3. Regression Modeling
  3.1. Linear Regression Fundamentals
    3.1.1. Simple Linear Regression
      3.1.1.1. Mathematical Foundation
      3.1.1.2. Least Squares Method
      3.1.1.3. Model Assumptions
      3.1.1.4. Coefficient Interpretation
      3.1.1.5. Residual Analysis
    3.1.2. Multiple Linear Regression
      3.1.2.1. Model Specification
      3.1.2.2. Matrix Formulation
      3.1.2.3. Coefficient Estimation
      3.1.2.4. Statistical Inference
    3.1.3. Regression Diagnostics
      3.1.3.1. Linearity Assessment
      3.1.3.2. Independence Testing
      3.1.3.3. Homoscedasticity Evaluation
      3.1.3.4. Normality of Residuals
      3.1.3.5. Multicollinearity Detection
        3.1.3.5.1. Variance Inflation Factor
        3.1.3.5.2. Condition Index
        3.1.3.5.3. Correlation Matrix Analysis
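The topics under 3.1 run from least-squares fitting and coefficient interpretation through residual analysis and multicollinearity checks. The following is a minimal sketch of that workflow, assuming a statsmodels-based fit on a small synthetic dataset; the column names, true coefficients, and noise level are illustrative inventions, not part of the outline.

```python
# Minimal sketch: OLS fit plus basic diagnostics on synthetic data.
# The feature names and generating model are assumptions for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
n = 200
X = pd.DataFrame({
    "sq_feet": rng.normal(1500, 300, n),
    "bedrooms": rng.integers(1, 6, n).astype(float),
})
# Linear ground truth with Gaussian noise, so the OLS assumptions roughly hold.
y = 50_000 + 120 * X["sq_feet"] + 8_000 * X["bedrooms"] + rng.normal(0, 20_000, n)

X_const = sm.add_constant(X)        # adds the intercept column
model = sm.OLS(y, X_const).fit()    # least-squares estimation
print(model.summary())              # coefficients, t-tests, R^2

# Residual analysis: residuals should be centered at zero with constant spread.
residuals = model.resid
print("mean residual:", residuals.mean())

# Multicollinearity detection via the variance inflation factor per predictor.
vifs = {col: variance_inflation_factor(X_const.values, i)
        for i, col in enumerate(X_const.columns) if col != "const"}
print(vifs)
```

A VIF well above roughly 5 to 10 for a predictor is commonly read as a multicollinearity warning, though the cutoff is a convention rather than a formal test.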
  3.2. Advanced Linear Models
    3.2.1. Polynomial Regression
      3.2.1.1. Polynomial Feature Creation
      3.2.1.2. Degree Selection
      3.2.1.3. Overfitting Prevention
    3.2.2. Interaction Models
      3.2.2.1. Two-way Interactions
      3.2.2.2. Higher-order Interactions
      3.2.2.3. Interaction Interpretation
    3.2.3. Piecewise Regression
      3.2.3.1. Breakpoint Identification
      3.2.3.2. Spline Regression
      3.2.3.3. Threshold Models
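Subsection 3.2 extends the linear framework with polynomial terms and interactions, where degree selection guards against overfitting. The sketch below illustrates one common way to pick the degree, assuming scikit-learn's PolynomialFeatures (which generates both powers and two-way interaction terms) and cross-validated R^2; the data-generating function and the degree grid are invented for illustration, and piecewise or spline models are not shown.

```python
# Minimal sketch: polynomial/interaction feature expansion with degree
# selection by cross-validation on synthetic data (assumed setup).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(300, 2))
# Quadratic-plus-interaction ground truth to motivate the expanded basis.
y = (1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 0] ** 2
     + 1.5 * X[:, 0] * X[:, 1] + rng.normal(0, 0.3, 300))

for degree in (1, 2, 3, 4):
    model = make_pipeline(
        PolynomialFeatures(degree=degree, include_bias=False),  # powers + interactions
        LinearRegression(),
    )
    # Higher degrees fit the training data better but risk overfitting;
    # cross-validated R^2 is one way to choose among them.
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"degree={degree}  mean CV R^2={scores.mean():.3f}")
```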
  3.3. Regularized Regression
    3.3.1. Ridge Regression
      3.3.1.1. L2 Penalty Function
      3.3.1.2. Shrinkage Effects
      3.3.1.3. Lambda Parameter Tuning
    3.3.2. Lasso Regression
      3.3.2.1. L1 Penalty Function
      3.3.2.2. Feature Selection Properties
      3.3.2.3. Sparse Solutions
    3.3.3. Elastic Net
      3.3.3.1. Combined L1 and L2 Penalties
      3.3.3.2. Alpha Parameter Selection
      3.3.3.3. Mixing Parameter Optimization
    3.3.4. Regularization Path Analysis
      3.3.4.1. Cross-validation for Parameter Selection
      3.3.4.2. Regularization Strength Effects
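Subsection 3.3 covers the L2 (ridge), L1 (lasso), and combined (elastic net) penalties, along with cross-validation for tuning their strength. A minimal sketch follows, assuming scikit-learn's RidgeCV, LassoCV, and ElasticNetCV on a synthetic dataset; note that scikit-learn names the penalty strength alpha, which plays the role of the lambda in 3.3.1.3, while l1_ratio is the mixing parameter of 3.3.3.3.

```python
# Minimal sketch: ridge vs. lasso vs. elastic net with cross-validated
# penalty selection on synthetic data (assumed setup).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import RidgeCV, LassoCV, ElasticNetCV
from sklearn.preprocessing import StandardScaler

X, y = make_regression(n_samples=200, n_features=30, n_informative=5,
                       noise=10.0, random_state=0)
X = StandardScaler().fit_transform(X)   # penalties assume comparable feature scales

ridge = RidgeCV(alphas=np.logspace(-3, 3, 25)).fit(X, y)
lasso = LassoCV(cv=5).fit(X, y)
enet = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5).fit(X, y)

# The L1 penalty drives many coefficients exactly to zero (sparse solution);
# the L2 penalty only shrinks them toward zero.
print("ridge nonzero coefficients:", np.sum(ridge.coef_ != 0))
print("lasso nonzero coefficients:", np.sum(lasso.coef_ != 0))
print("chosen penalties:", ridge.alpha_, lasso.alpha_, enet.alpha_, enet.l1_ratio_)
```

Comparing the count of nonzero lasso coefficients against ridge is a quick way to see the feature-selection behaviour the L1 penalty induces.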
  3.4. Non-linear Regression
    3.4.1. Generalized Linear Models
      3.4.1.1. Link Functions
      3.4.1.2. Exponential Family Distributions
      3.4.1.3. Maximum Likelihood Estimation
    3.4.2. Kernel Regression
      3.4.2.1. Kernel Functions
      3.4.2.2. Bandwidth Selection
      3.4.2.3. Local Polynomial Regression
    3.4.3. Regression Trees
      3.4.3.1. Tree Construction
      3.4.3.2. Pruning Strategies
      3.4.3.3. Ensemble Applications
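Subsection 3.4 closes the chapter with generalized linear models, kernel methods, and regression trees. The sketch below shows two of those ideas on synthetic data, assuming statsmodels for a Poisson GLM with a log link and scikit-learn's DecisionTreeRegressor with cost-complexity pruning; kernel regression is omitted, and the ccp_alpha value is an arbitrary illustration rather than a recommended setting.

```python
# Minimal sketch: a Poisson GLM (log link, exponential-family likelihood) and
# a pruned regression tree, both on synthetic data (assumed setup).
import numpy as np
import statsmodels.api as sm
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(2)
n = 500
x = rng.uniform(0, 2, size=(n, 1))

# Poisson GLM: counts whose log-mean is linear in x; fitted by maximum likelihood.
counts = rng.poisson(np.exp(0.5 + 1.2 * x[:, 0]))
glm = sm.GLM(counts, sm.add_constant(x), family=sm.families.Poisson()).fit()
print(glm.params)   # estimates on the link (log) scale

# Regression tree: piecewise-constant fit; ccp_alpha controls cost-complexity pruning.
y = np.sin(3 * x[:, 0]) + rng.normal(0, 0.1, n)
tree = DecisionTreeRegressor(ccp_alpha=0.005, random_state=0).fit(x, y)
print("leaves after pruning:", tree.get_n_leaves())
```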
Previous: 2. Data Foundation and Preparation
Next: 4. Classification Modeling