Time Series Analysis and Forecasting
Machine Learning for Time Series
Feature Engineering
Lag Features
Autoregressive features
Lag selection methods
Cross-correlation features
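A minimal sketch of lag-feature construction, assuming a pandas DataFrame `df` with a numeric target column `y` and a DatetimeIndex (the column and function names here are illustrative):

```python
import pandas as pd

def add_lag_features(df: pd.DataFrame, target: str = "y", lags=(1, 2, 7)) -> pd.DataFrame:
    """Append columns holding the target value observed k steps earlier."""
    out = df.copy()
    for k in lags:
        out[f"{target}_lag_{k}"] = out[target].shift(k)
    # The first rows lack history for the largest lag, so drop them.
    return out.dropna()
```

Which lags to include can be guided by the autocorrelation and partial autocorrelation functions of the target, or by cross-correlation with candidate predictor series.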
Rolling Window Statistics
Moving averages
Rolling standard deviation
Rolling minimum and maximum
Rolling quantiles
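A sketch of rolling-window statistics under the same assumptions; the one-step shift ensures each statistic is computed only from past observations, so the current value never leaks into its own feature:

```python
import pandas as pd

def add_rolling_features(df: pd.DataFrame, target: str = "y", window: int = 7) -> pd.DataFrame:
    out = df.copy()
    past = out[target].shift(1)            # exclude the current observation
    roll = past.rolling(window=window)
    out[f"{target}_roll_mean_{window}"] = roll.mean()
    out[f"{target}_roll_std_{window}"] = roll.std()
    out[f"{target}_roll_min_{window}"] = roll.min()
    out[f"{target}_roll_max_{window}"] = roll.max()
    out[f"{target}_roll_q90_{window}"] = roll.quantile(0.9)
    return out.dropna()
```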
Date and Time Features
Day of week effects
Month effects
Quarter effects
Year effects
Holiday indicators
Special events encoding
Cyclical Feature Encoding
Sine and cosine transformations
Fourier features
Periodic encoding
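Plain integer encodings put December (12) far from January (1); sine and cosine transforms map each periodic field onto a circle so adjacent periods stay close, and a few Fourier harmonics capture smoother annual patterns. A sketch assuming a DatetimeIndex:

```python
import numpy as np
import pandas as pd

def add_cyclical_features(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    dow = out.index.dayofweek
    month = out.index.month - 1
    out["dow_sin"] = np.sin(2 * np.pi * dow / 7)
    out["dow_cos"] = np.cos(2 * np.pi * dow / 7)
    out["month_sin"] = np.sin(2 * np.pi * month / 12)
    out["month_cos"] = np.cos(2 * np.pi * month / 12)
    # Fourier features: a few harmonics of the annual cycle.
    doy = out.index.dayofyear
    for k in (1, 2, 3):
        out[f"year_sin_{k}"] = np.sin(2 * np.pi * k * doy / 365.25)
        out[f"year_cos_{k}"] = np.cos(2 * np.pi * k * doy / 365.25)
    return out
```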
Technical Indicators
Financial market indicators
Momentum indicators
Volatility indicators
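For financial series, a few simple indicators illustrate the idea, assuming a hypothetical `close` price column (dedicated libraries such as TA-Lib provide many more):

```python
import pandas as pd

def add_indicator_features(df: pd.DataFrame, price: str = "close") -> pd.DataFrame:
    out = df.copy()
    returns = out[price].pct_change()
    out["momentum_10"] = out[price] / out[price].shift(10) - 1        # 10-step rate of change
    out["volatility_20"] = returns.rolling(20).std()                  # rolling return volatility
    out["sma_ratio_20"] = out[price] / out[price].rolling(20).mean()  # price vs. 20-step moving average
    return out.dropna()
```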
Regression-Based Approaches
Linear Regression
Time series regression
Trend and seasonal variables
Assumptions and diagnostics
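A minimal sketch of time series regression on a deterministic trend plus monthly dummies using statsmodels, assuming `y` is a pandas Series with a monthly DatetimeIndex:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def fit_trend_seasonal_ols(y: pd.Series):
    X = pd.DataFrame(index=y.index)
    X["trend"] = np.arange(len(y))                                    # linear trend
    months = pd.get_dummies(y.index.month, prefix="m", drop_first=True)
    months.index = y.index
    X = pd.concat([X, months], axis=1).astype(float)
    model = sm.OLS(y, sm.add_constant(X)).fit()
    # model.summary() reports R^2, coefficient t-tests and the Durbin-Watson
    # statistic; autocorrelated residuals violate the usual OLS assumptions
    # and are the main diagnostic concern in time series regression.
    return model
```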
Polynomial Regression
Higher-order trends
Overfitting issues
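A polynomial trend can be fit as a scikit-learn pipeline; high-degree polynomials track the sample closely but extrapolate erratically beyond it, so low degrees and out-of-sample checks are advisable:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

def fit_polynomial_trend(y, degree: int = 3):
    t = np.arange(len(y)).reshape(-1, 1)     # time index as the single regressor
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(t, y)
    return model
```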
Regularized Regression
Ridge regression
Lasso regression
Elastic net
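With many correlated lag and rolling features, regularization keeps the regression stable. A sketch that tunes the penalty strength with time-ordered cross-validation (no shuffling), assuming a feature matrix `X` and target `y` are already built:

```python
from sklearn.linear_model import ElasticNet, Lasso, Ridge
from sklearn.model_selection import GridSearchCV, TimeSeriesSplit

def fit_regularized(X, y):
    cv = TimeSeriesSplit(n_splits=5)
    searches = {
        "ridge": GridSearchCV(Ridge(), {"alpha": [0.1, 1.0, 10.0]}, cv=cv),
        "lasso": GridSearchCV(Lasso(max_iter=10_000), {"alpha": [0.01, 0.1, 1.0]}, cv=cv),
        "elastic_net": GridSearchCV(
            ElasticNet(max_iter=10_000),
            {"alpha": [0.01, 0.1, 1.0], "l1_ratio": [0.2, 0.5, 0.8]}, cv=cv),
    }
    return {name: gs.fit(X, y) for name, gs in searches.items()}
```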
Time-Varying Coefficient Models
Rolling window regression
Recursive estimation
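Rolling window regression re-estimates the coefficients on the most recent observations at each step, letting them drift over time. A minimal NumPy sketch (statsmodels' RollingOLS offers a more complete implementation):

```python
import numpy as np

def rolling_ols_coefs(X: np.ndarray, y: np.ndarray, window: int = 60) -> np.ndarray:
    n, p = X.shape
    coefs = np.full((n, p), np.nan)
    for t in range(window, n + 1):
        Xw, yw = X[t - window:t], y[t - window:t]
        beta, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
        coefs[t - 1] = beta        # coefficients estimated using data up to time t-1
    return coefs
```

Recursive estimation is the expanding-window variant: start the slice at 0 instead of `t - window`, so each new observation is added without dropping old ones.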
Tree-Based Models
Decision Trees
Splitting criteria
Pruning methods
Time series adaptations
Random Forests
Bootstrap aggregating
Feature importance
Out-of-bag error
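A sketch of a random forest used as a forecaster over lag and rolling features (the tabular framing from the feature engineering section). Note that the out-of-bag score resamples rows without regard to time order, so a forward-chaining validation split remains the more trustworthy check:

```python
from sklearn.ensemble import RandomForestRegressor

def fit_forest(X_train, y_train, feature_names):
    model = RandomForestRegressor(
        n_estimators=500,
        oob_score=True,        # quick generalization estimate from out-of-bag samples
        random_state=0,
        n_jobs=-1,
    )
    model.fit(X_train, y_train)
    print("OOB R^2:", model.oob_score_)
    for name, imp in sorted(zip(feature_names, model.feature_importances_),
                            key=lambda t: -t[1]):
        print(f"{name}: {imp:.3f}")
    return model
```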
Gradient Boosting
Boosting algorithms
XGBoost implementation
LightGBM implementation
CatBoost implementation
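A sketch with scikit-learn's histogram-based gradient boosting; XGBoost, LightGBM and CatBoost expose very similar fit/predict interfaces, so swapping them in is mostly a change of import (exact parameter names differ by library):

```python
from sklearn.ensemble import HistGradientBoostingRegressor

model = HistGradientBoostingRegressor(
    max_iter=300,            # number of boosting iterations
    learning_rate=0.05,
    max_depth=6,
    early_stopping=True,     # stop when an internal validation score stops improving
)
# model.fit(X_train, y_train)
# y_pred = model.predict(X_valid)
```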
Model Interpretation
Feature importance measures
Partial dependence plots
SHAP values
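A sketch of interpretation on a held-out window using scikit-learn's permutation importance; SHAP values (via the separate `shap` package and its tree explainer) give per-prediction attributions as a complement:

```python
from sklearn.inspection import permutation_importance

def report_importance(model, X_valid, y_valid, feature_names):
    result = permutation_importance(model, X_valid, y_valid,
                                    n_repeats=10, random_state=0)
    for name, score in sorted(zip(feature_names, result.importances_mean),
                              key=lambda t: -t[1]):
        print(f"{name}: {score:.4f}")
```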
Neural Network Models
Feedforward Networks
Multi-layer perceptron
Input window design
Architecture selection
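For neural models the input window design is the key decision: each training example is a fixed-length slice of history paired with the next value. A minimal PyTorch sketch of the windowing step and a small multi-layer perceptron (window length and layer sizes are illustrative):

```python
import numpy as np
import torch
from torch import nn

def make_windows(series: np.ndarray, window: int = 24):
    """Turn a 1-D series into (input window, next value) training pairs."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return torch.tensor(X, dtype=torch.float32), torch.tensor(y, dtype=torch.float32)

mlp = nn.Sequential(
    nn.Linear(24, 64), nn.ReLU(),
    nn.Linear(64, 32), nn.ReLU(),
    nn.Linear(32, 1),              # one-step-ahead forecast
)

def train(model, X, y, epochs: int = 50):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X).squeeze(-1), y)
        loss.backward()
        opt.step()
```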
Recurrent Neural Networks
Basic RNN architecture
Vanishing gradient problem
Training challenges
Long Short-Term Memory Networks
LSTM architecture
Gate mechanisms
Memory cell operations
Bidirectional LSTM
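A sketch of an LSTM one-step forecaster in PyTorch; the gate and memory-cell machinery lives inside `nn.LSTM`, and `bidirectional=True` doubles the output dimension (useful for encoding, though a bidirectional model cannot be applied causally to live forecasting):

```python
import torch
from torch import nn

class LSTMForecaster(nn.Module):
    def __init__(self, n_features: int = 1, hidden: int = 64, bidirectional: bool = False):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True,
                            bidirectional=bidirectional)
        out_dim = hidden * (2 if bidirectional else 1)
        self.head = nn.Linear(out_dim, 1)

    def forward(self, x):                  # x: (batch, seq_len, n_features)
        output, _ = self.lstm(x)           # output: (batch, seq_len, out_dim)
        return self.head(output[:, -1])    # forecast from the last time step
```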
Gated Recurrent Units
GRU architecture
Comparison with LSTM
Computational efficiency
Convolutional Neural Networks
1D convolutions
Temporal convolutions
Dilated convolutions
WaveNet architecture
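A sketch of a small temporal CNN: stacked causal 1D convolutions with exponentially growing dilation, the core idea behind WaveNet-style models. Left-only padding keeps each output from seeing future time steps; input is shaped (batch, channels, time):

```python
import torch
from torch import nn

class CausalConvBlock(nn.Module):
    def __init__(self, channels: int, dilation: int, kernel_size: int = 2):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation       # amount of left padding
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):
        x = nn.functional.pad(x, (self.pad, 0))       # pad the past side only
        return torch.relu(self.conv(x))

tcn = nn.Sequential(
    nn.Conv1d(1, 32, kernel_size=1),                  # lift the series to 32 channels
    CausalConvBlock(32, dilation=1),
    CausalConvBlock(32, dilation=2),
    CausalConvBlock(32, dilation=4),                  # receptive field grows exponentially
    nn.Conv1d(32, 1, kernel_size=1),
)
```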
Hybrid Architectures
CNN-LSTM models
CNN-GRU models
Attention mechanisms
Advanced Deep Learning
Sequence-to-Sequence Models
Encoder-decoder architecture
Multi-step forecasting
Variable length sequences
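A sketch of a GRU encoder-decoder for multi-step forecasting: the encoder compresses the input window into a state, and the decoder unrolls `horizon` steps ahead autoregressively (layer sizes and the seeding choice are illustrative):

```python
import torch
from torch import nn

class Seq2SeqForecaster(nn.Module):
    def __init__(self, n_features: int = 1, hidden: int = 64, horizon: int = 12):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.GRU(n_features, hidden, batch_first=True)
        self.decoder = nn.GRU(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                          # x: (batch, seq_len, n_features)
        _, state = self.encoder(x)                 # summary of the input window
        step = x[:, -1:, :1]                       # seed with the last observed value
        outputs = []
        for _ in range(self.horizon):              # autoregressive decoding
            out, state = self.decoder(step, state)
            step = self.head(out)                  # (batch, 1, 1)
            outputs.append(step)
        return torch.cat(outputs, dim=1).squeeze(-1)   # (batch, horizon)
```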
Attention Mechanisms
Self-attention
Multi-head attention
Temporal attention
Transformer Models
Transformer architecture
Positional encoding
Time series transformers
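A sketch of a transformer encoder applied to a time series: the raw inputs are projected to the model dimension, sinusoidal positional encodings restore the notion of order that self-attention lacks, and the last position's representation feeds the forecast head:

```python
import math
import torch
from torch import nn

class TimeSeriesTransformer(nn.Module):
    def __init__(self, n_features: int = 1, d_model: int = 64, n_heads: int = 4,
                 n_layers: int = 2, max_len: int = 512):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)
        # Standard sinusoidal positional encoding, precomputed and not trained.
        pos = torch.arange(max_len).unsqueeze(1)
        div = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):                          # x: (batch, seq_len, n_features)
        h = self.input_proj(x) + self.pe[: x.size(1)]
        h = self.encoder(h)
        return self.head(h[:, -1])                 # one-step-ahead forecast
```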
Generative Models
Variational autoencoders
Generative adversarial networks
Normalizing flows