Training Deep Vision Models
Dataset Preparation
  Common Vision Datasets
    MNIST
      Handwritten Digit Recognition
      Dataset Characteristics
      Baseline Performance
    Fashion-MNIST
      Clothing Item Classification
    CIFAR-10
      Natural Image Classification
      Dataset Challenges
    CIFAR-100
      Fine-grained Classification
    ImageNet
      Large-scale Visual Recognition
      Dataset Statistics
      Evaluation Protocol
    COCO
      Object Detection
      Instance Segmentation
      Keypoint Detection
      Captioning Tasks
    Pascal VOC
      Object Detection and Segmentation
    Open Images
      Large-scale Object Detection
    CelebA
      Face Attribute Recognition
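As a concrete starting point, a minimal sketch of loading one of these datasets through torchvision's built-in wrappers; the root path, batch size, and worker count are illustrative placeholders:

```python
import torchvision
from torch.utils.data import DataLoader
from torchvision import transforms

# CIFAR-10: 60,000 32x32 colour images in 10 classes (50k train / 10k test).
train_set = torchvision.datasets.CIFAR10(
    root="./data", train=True, download=True,
    transform=transforms.ToTensor(),
)
train_loader = DataLoader(train_set, batch_size=128, shuffle=True, num_workers=4)

images, labels = next(iter(train_loader))
print(images.shape)  # torch.Size([128, 3, 32, 32])
```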
  Data Loading and Preprocessing
    Image Loading
      File Format Handling
      Memory Management
    Resizing Strategies
      Aspect Ratio Preservation
      Center Cropping
      Padding Methods
    Normalization Techniques
      Pixel Value Scaling
      Channel-wise Normalization
      Dataset Statistics
    Data Type Conversion
      Float vs Integer Representations
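The steps above compose into a standard preprocessing pipeline. A minimal sketch, assuming PyTorch/torchvision and the widely quoted ImageNet channel statistics:

```python
from torchvision import transforms

# Resize the short side (preserving aspect ratio), take a centre crop,
# convert uint8 [0, 255] to float32 [0, 1], then normalise each channel
# with dataset statistics (here: the published ImageNet mean/std).
preprocess = transforms.Compose([
    transforms.Resize(256),        # short side -> 256, aspect ratio kept
    transforms.CenterCrop(224),    # fixed 224x224 network input
    transforms.ToTensor(),         # HWC uint8 -> CHW float32 in [0, 1]
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
```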
Data Augmentation
  Geometric Transformations
    Rotation
      Angle Selection
      Interpolation Methods
    Scaling and Zooming
      Scale Factor Selection
    Translation
      Boundary Handling
    Shearing
    Perspective Transformation
    Elastic Deformation
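A sketch of these geometric transformations with torchvision; the ranges are illustrative and should be tuned to the dataset (digits tolerate far less rotation than natural scenes, for example):

```python
from torchvision import transforms
from torchvision.transforms import InterpolationMode

geometric = transforms.Compose([
    # Rotation with an explicit interpolation choice
    transforms.RandomRotation(degrees=15,
                              interpolation=InterpolationMode.BILINEAR),
    # Translation, scaling/zooming, and shear in one affine transform
    transforms.RandomAffine(degrees=0,
                            translate=(0.1, 0.1),   # shift up to 10%
                            scale=(0.8, 1.2),       # zoom in/out
                            shear=10),              # shear angle in degrees
    # Random perspective distortion
    transforms.RandomPerspective(distortion_scale=0.2, p=0.5),
])
```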
  Photometric Transformations
    Brightness Adjustment
    Contrast Modification
    Saturation Changes
    Hue Shifting
    Gamma Correction
  Noise Addition
    Gaussian Noise
    Salt and Pepper Noise
    Speckle Noise
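A sketch combining photometric jitter with additive Gaussian noise. ColorJitter is torchvision's API; AddGaussianNoise is a hypothetical custom transform written here for illustration:

```python
import torch
from torchvision import transforms

# Each jitter factor is the maximum relative change; hue is expressed as a
# fraction of the colour wheel. Values are illustrative.
photometric = transforms.ColorJitter(
    brightness=0.2, contrast=0.2, saturation=0.2, hue=0.05)

class AddGaussianNoise:
    """Additive Gaussian noise on a [0, 1] float tensor (custom transform)."""
    def __init__(self, std=0.05):
        self.std = std
    def __call__(self, x):
        return (x + torch.randn_like(x) * self.std).clamp(0.0, 1.0)

augment = transforms.Compose([
    photometric,               # operates on PIL images
    transforms.ToTensor(),
    AddGaussianNoise(0.05),    # operates on tensors
])
```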
  Occlusion Techniques
    Random Erasing
    Cutout
    Hide-and-Seek
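torchvision's RandomErasing covers the random-erasing/Cutout idea of blanking a rectangle so the network cannot rely on any single region; a sketch with illustrative parameters:

```python
from torchvision import transforms

# RandomErasing expects a tensor, so it comes after ToTensor().
occlusion = transforms.Compose([
    transforms.ToTensor(),
    transforms.RandomErasing(p=0.5,
                             scale=(0.02, 0.2),   # erased area fraction
                             ratio=(0.3, 3.3),    # aspect ratio range
                             value=0),            # fill with zeros
])
```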
  Advanced Augmentation
    Mixup
      Linear Interpolation
      Label Smoothing
    CutMix
      Spatial Mixing
    AutoAugment
      Policy Search
    RandAugment
      Simplified Policy
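A from-scratch sketch of Mixup in its usual formulation: draw lambda from a Beta(alpha, alpha) distribution and linearly interpolate both the images and their one-hot labels, which is why it produces softened (smoothed) targets:

```python
import torch
import torch.nn.functional as F

def mixup(images, labels, num_classes, alpha=0.2):
    """Blend random pairs of examples and their one-hot labels.
    alpha controls the Beta distribution; 0.2 is a common choice."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(images.size(0))
    mixed_images = lam * images + (1.0 - lam) * images[perm]
    one_hot = F.one_hot(labels, num_classes).float()
    mixed_labels = lam * one_hot + (1.0 - lam) * one_hot[perm]
    return mixed_images, mixed_labels

# Training then uses a soft-label cross-entropy:
# loss = -(mixed_labels * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
```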
Data Splitting Strategies
  Training Set
  Validation Set
  Test Set
  Cross-validation
  Stratified Sampling
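A sketch of a stratified 70/15/15 split, assuming scikit-learn is available; the labels below are placeholders for a real dataset's targets:

```python
from sklearn.model_selection import train_test_split

labels = [i % 10 for i in range(1000)]   # placeholder class labels
indices = list(range(len(labels)))

# 70% train, then split the remaining 30% evenly into validation and test,
# stratifying on the class labels at every step so each subset keeps the
# original class ratios.
train_idx, rest_idx = train_test_split(
    indices, test_size=0.30, stratify=labels, random_state=42)
val_idx, test_idx = train_test_split(
    rest_idx, test_size=0.50,
    stratify=[labels[i] for i in rest_idx], random_state=42)
```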
Loss Functions for Vision Tasks
  Classification Losses
    Binary Cross-Entropy
      Sigmoid Activation
      Class Imbalance Handling
    Categorical Cross-Entropy
      Softmax Activation
      Multi-class Classification
    Sparse Categorical Cross-Entropy
      Integer Label Handling
    Focal Loss
      Hard Example Mining
      Class Imbalance Solutions
    Label Smoothing
      Overconfidence Prevention
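A sketch of focal loss in its common form, which down-weights easy examples by (1 - p_t)^gamma so training focuses on hard, misclassified cases, plus PyTorch's built-in label smoothing; gamma = 2 is a common default:

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0):
    """Focal loss for multi-class classification with integer targets."""
    log_probs = F.log_softmax(logits, dim=1)
    ce = F.nll_loss(log_probs, targets, reduction="none")   # -log p_t
    p_t = torch.exp(-ce)                                    # prob. of true class
    return ((1.0 - p_t) ** gamma * ce).mean()

# Label smoothing is built into PyTorch's cross-entropy:
smoothed_ce = torch.nn.CrossEntropyLoss(label_smoothing=0.1)
```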
  Regression Losses
    Mean Squared Error
      L2 Loss Properties
    Mean Absolute Error
      L1 Loss Properties
      Robustness to Outliers
    Huber Loss
      Smooth L1 Loss
      Combining L1 and L2
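A quick comparison of the three regression losses on a toy example with one outlier; smooth_l1_loss is PyTorch's Huber variant, quadratic below beta and linear above, so it combines L2's smoothness near zero with L1's robustness to outliers:

```python
import torch
import torch.nn.functional as F

pred = torch.tensor([0.5, 2.0, 10.0])
target = torch.tensor([0.0, 2.0, 0.0])   # last pair is an outlier

mse = F.mse_loss(pred, target)                    # L2: outlier dominates
mae = F.l1_loss(pred, target)                     # L1: robust, kinked at 0
huber = F.smooth_l1_loss(pred, target, beta=1.0)  # quadratic < beta, linear above
print(mse.item(), mae.item(), huber.item())
```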
  Segmentation Losses
    Dice Loss
      Overlap Measurement
      Class Imbalance Handling
    Intersection over Union Loss
      Jaccard Index
    Tversky Loss
    Generalized Dice Loss
    Boundary Loss
      Edge-aware Segmentation
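A minimal soft Dice loss for binary segmentation, assuming sigmoid probabilities and binary masks; because it measures overlap rather than per-pixel error, it is far less sensitive to foreground/background imbalance:

```python
import torch

def dice_loss(probs, targets, eps=1e-6):
    """Soft Dice loss. probs: (N, H, W) foreground probabilities after
    sigmoid; targets: (N, H, W) binary masks. Dice = 2|A∩B| / (|A| + |B|)."""
    dims = (1, 2)
    intersection = (probs * targets).sum(dims)
    union = probs.sum(dims) + targets.sum(dims)
    dice = (2.0 * intersection + eps) / (union + eps)
    return 1.0 - dice.mean()
```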
  Detection Losses
    Smooth L1 Loss
      Bounding Box Regression
    Balanced Cross-Entropy
      Positive-Negative Balance
Regularization Techniques
  Weight Regularization
    L1 Regularization
      Sparsity Promotion
      Feature Selection
    L2 Regularization
      Weight Decay
      Smooth Solutions
    Elastic Net
      L1 and L2 Combination
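In PyTorch, L2 regularization is usually applied as the optimizer's weight_decay, while an L1 penalty is added to the loss by hand; a sketch with illustrative strengths:

```python
import torch

model = torch.nn.Linear(128, 10)  # stand-in for a real network

# L2 / weight decay, handled inside the optimizer step:
optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=1e-4)

# L1 (sparsity-promoting) has no built-in switch; add it to the loss:
def l1_penalty(model, strength=1e-5):
    return strength * sum(p.abs().sum() for p in model.parameters())

# loss = task_loss + l1_penalty(model)   # using both = elastic net
```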
  Architectural Regularization
    Dropout Variants
      Standard Dropout
      Spatial Dropout
      DropConnect
    Batch Normalization Effects
      Implicit Regularization
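A sketch contrasting standard and spatial dropout in a small convolutional block; placement and rates are illustrative:

```python
import torch.nn as nn

# Spatial dropout (Dropout2d) zeroes whole feature maps, which suits
# convolutional layers where neighbouring activations are strongly
# correlated; standard dropout zeroes individual units and is more
# common on dense features.
features = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1),
    nn.BatchNorm2d(32),          # also acts as an implicit regularizer
    nn.ReLU(),
    nn.Dropout2d(p=0.1),         # spatial dropout: drops whole channels
)
head = nn.Sequential(
    nn.Flatten(),
    nn.Dropout(p=0.5),           # standard dropout on dense features
    nn.LazyLinear(10),
)
```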
  Training Regularization
    Early Stopping
      Validation Monitoring
      Patience Parameter
    Learning Rate Scheduling
      Regularization Effects
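A minimal early-stopping loop with a patience counter; model, train_loader, val_loader, train_one_epoch, and evaluate are placeholders for project-specific objects and routines:

```python
import torch

# Stop when validation loss has not improved for `patience` epochs.
best_val, patience, wait = float("inf"), 5, 0
for epoch in range(100):
    train_one_epoch(model, train_loader)            # placeholder routine
    val_loss = evaluate(model, val_loader)          # scalar validation loss
    if val_loss < best_val - 1e-4:                  # meaningful improvement
        best_val, wait = val_loss, 0
        torch.save(model.state_dict(), "best.pt")   # keep best checkpoint
    else:
        wait += 1
        if wait >= patience:                        # no improvement: stop
            print(f"early stop at epoch {epoch}")
            break
```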
  Data-based Regularization
    Data Augmentation
    Noise Injection
    Adversarial Training
Transfer Learning and Fine-tuning
  Pre-trained Model Selection
    Model Zoo Resources
    Architecture Considerations
    Domain Similarity Assessment
  Feature Extraction Approach
    Frozen Feature Extractor
    Layer Selection
    Feature Dimensionality
  Fine-tuning Strategies
    Full Network Fine-tuning
    Layer-wise Fine-tuning
    Gradual Unfreezing
    Learning Rate Adjustment
  Domain Adaptation
    Domain Shift Challenges
    Unsupervised Domain Adaptation
    Few-shot Domain Adaptation
  Multi-task Learning
    Shared Representations
    Task-specific Heads
    Loss Balancing
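A sketch of both transfer regimes with a torchvision ResNet-50: first as a frozen feature extractor with a new task head, then fully fine-tuned with a much smaller learning rate on the pretrained body than on the fresh head; the rates and the 10-class head are illustrative:

```python
import torch
import torchvision

# Load an ImageNet-pretrained backbone, freeze it, and replace the head.
model = torchvision.models.resnet50(weights="IMAGENET1K_V2")
for p in model.parameters():
    p.requires_grad = False                             # frozen extractor
model.fc = torch.nn.Linear(model.fc.in_features, 10)    # new task head

# Fine-tuning variant: unfreeze everything, but use discriminative
# learning rates so the pretrained weights move slowly.
for p in model.parameters():
    p.requires_grad = True
optimizer = torch.optim.AdamW([
    {"params": [p for n, p in model.named_parameters()
                if not n.startswith("fc.")], "lr": 1e-5},   # backbone
    {"params": model.fc.parameters(), "lr": 1e-3},          # head
])
```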