Deep Learning and Neural Networks
Convolutional Neural Networks (CNNs)
Motivation for CNNs
Challenges with High-Dimensional Data
Limitations of MLPs for Image Data
Spatial Structure in Images
Translation Invariance
Local Connectivity Principles
Parameter Sharing Benefits
Core Components of CNNs
The Convolutional Layer
Convolution Operation
Filters and Kernels
Filter Size Selection
Number of Filters
Learnable Parameters
Stride and Padding
Stride Definition and Effects
Padding Types
Valid Padding
Same Padding
Causal Padding
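The stride and padding entries above all reduce to one piece of size arithmetic. As a quick sketch (the function name and example sizes are illustrative, not part of the outline):

```python
def conv_output_size(n, k, stride=1, padding=0):
    """Spatial output size of a convolution over an n x n input
    with a k x k filter: floor((n + 2p - k) / s) + 1."""
    return (n + 2 * padding - k) // stride + 1

# Valid padding: no padding, the feature map shrinks.
print(conv_output_size(32, 3))                       # 30
# Same padding at stride 1: p = (k - 1) // 2 preserves the size.
print(conv_output_size(32, 3, padding=1))            # 32
# Stride 2 roughly halves the spatial resolution.
print(conv_output_size(32, 3, stride=2, padding=1))  # 16
```

Causal padding uses the same arithmetic but pads only one side of the input, which matters mainly for 1-D temporal convolutions.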
Feature Maps
Interpretation and Visualization
Depth and Spatial Dimensions
Receptive Fields
Local Connectivity
Effective Receptive Field
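The receptive-field entries above can be made concrete with a small calculation; the layer list below is an assumed example, not taken from the outline:

```python
def receptive_field(layers):
    """Effective receptive field of stacked conv/pool layers.

    `layers` is a list of (kernel_size, stride) pairs ordered from
    input to output. Each layer grows the field by (k - 1) times
    the product of all earlier strides."""
    rf, jump = 1, 1
    for k, s in layers:
        rf += (k - 1) * jump
        jump *= s
    return rf

# Two stacked 3x3 convs see a 5x5 input patch, three see 7x7 --
# the classic argument for replacing large filters with small ones.
print(receptive_field([(3, 1), (3, 1)]))          # 5
print(receptive_field([(3, 1), (3, 1), (3, 1)]))  # 7
```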
The Pooling Layer
Purpose of Pooling
Downsampling Operations
Max Pooling
Operation and Effects
Translation Invariance
Average Pooling
Operation and Effects
Smooth Downsampling
Global Pooling
Global Average Pooling
Global Max Pooling
Adaptive Pooling
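The global pooling variants listed above are one-liners over the spatial axes. A minimal NumPy sketch, assuming a batch of feature maps with made-up sizes:

```python
import numpy as np

# A hypothetical batch of feature maps shaped (N, H, W, C).
feature_maps = np.random.rand(8, 7, 7, 512)

# Global average pooling collapses each H x W map to one number,
# giving a fixed-length vector regardless of input resolution.
gap = feature_maps.mean(axis=(1, 2))   # shape (8, 512)

# Global max pooling keeps only the strongest activation per map.
gmp = feature_maps.max(axis=(1, 2))    # shape (8, 512)

print(gap.shape, gmp.shape)
```

This resolution independence is why global average pooling often replaces flattening before the classifier head.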
Fully Connected Layers
Role in Classification
Flattening Feature Maps
Parameter Count Considerations
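The parameter-count concern flagged above is easy to quantify. Assuming a VGG-style 7x7x512 final feature map feeding a 4096-unit fully connected layer (illustrative sizes):

```python
# Flattening a 7x7x512 feature map into a 4096-unit dense layer.
h, w, c, units = 7, 7, 512, 4096

flattened = h * w * c          # 25,088 inputs after flattening
weights = flattened * units    # one weight per input-output pair
params = weights + units       # plus one bias per unit

print(f"{params:,}")           # 102,764,544
```

A single dense layer here holds over 100 million parameters, which is why later architectures favor global pooling over flattening.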
Activation Functions in CNNs
ReLU in Convolutional Layers
Softmax in Output Layers
CNN Architectures
Classic Architectures
LeNet-5
Structure and Innovations
Historical Significance
AlexNet
Deep Architecture
Use of ReLU and Dropout
GPU Implementation
VGGNet
Deep and Uniform Architecture
Small Filter Sizes
GoogLeNet (Inception)
Inception Modules
Multi-Scale Processing
1x1 Convolutions
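The 1x1 convolutions listed above act as cheap channel-reduction bottlenecks inside Inception modules. A sketch of the savings, with assumed channel counts:

```python
def conv_params(k, c_in, c_out):
    """Weight count of a k x k convolution (biases ignored)."""
    return k * k * c_in * c_out

# Direct 5x5 convolution, 256 -> 256 channels.
direct = conv_params(5, 256, 256)                       # 1,638,400

# Inception-style bottleneck: a 1x1 conv squeezes 256 -> 64
# channels before the 5x5 conv expands back to 256.
bottleneck = conv_params(1, 256, 64) + conv_params(5, 64, 256)

print(direct, bottleneck)   # the bottleneck is roughly 3.8x cheaper
```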
Modern Architectures
ResNet (Residual Networks)
Residual Connections
Deep Architectures
Batch Normalization Integration
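The residual connection outlined above is just an identity shortcut added to a layer's output. A toy NumPy sketch (the single linear-plus-ReLU transform stands in for a real residual branch):

```python
import numpy as np

def residual_block(x, weight):
    """Toy residual block: y = F(x) + x, where F here is a
    single linear map followed by ReLU."""
    fx = np.maximum(weight @ x, 0.0)   # F(x): linear + ReLU
    return fx + x                      # identity shortcut

x = np.ones(4)
w = np.zeros((4, 4))        # even if F collapses to zero...
y = residual_block(x, w)
print(y)                    # ...the shortcut still passes x through
```

Because the shortcut preserves the input (and its gradient) even when the learned branch contributes nothing, very deep stacks of such blocks remain trainable.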
DenseNet
Dense Connections
Feature Reuse
EfficientNet
Compound Scaling
Neural Architecture Search
Advanced CNN Concepts
Dilated Convolutions
Atrous Convolutions
Receptive Field Expansion
Separable Convolutions
Depthwise Separable Convolutions
Parameter Reduction
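The parameter reduction from depthwise separable convolutions follows directly from splitting spatial filtering and channel mixing. A sketch with assumed channel counts:

```python
def standard_conv_params(k, c_in, c_out):
    """Weights in a standard k x k convolution (biases ignored)."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """One k x k filter per input channel (depthwise), then a
    1x1 pointwise convolution to mix channels."""
    return k * k * c_in + c_in * c_out

std = standard_conv_params(3, 128, 128)        # 147,456
sep = depthwise_separable_params(3, 128, 128)  # 17,536

print(f"reduction: {std / sep:.1f}x")
```

For a 3x3 kernel the saving approaches a factor of 9 as the channel count grows, which is the basis of MobileNet-style efficiency.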
Grouped Convolutions
Channel Grouping
Computational Efficiency
Transposed Convolutions
Upsampling Operations
Deconvolution Terminology
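The upsampling behavior outlined above has size arithmetic that inverts the forward convolution. A sketch (sizes are illustrative):

```python
def transposed_conv_output_size(n, k, stride=1, padding=0):
    """Output size of a transposed ('de')convolution:
    (n - 1) * s - 2p + k. This inverts the size formula of a
    forward convolution with the same k, s, and p."""
    return (n - 1) * stride - 2 * padding + k

# A stride-2 transposed conv upsamples 16 -> 32, a common step in
# segmentation decoders and image generators.
print(transposed_conv_output_size(16, 4, stride=2, padding=1))  # 32
```

"Deconvolution" is a historical misnomer: the operation is not a mathematical inverse of convolution, only of its shape arithmetic.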
Computer Vision Applications
Image Classification
Single-Label Classification
Multi-Label Classification
Fine-Grained Classification
Object Detection
Bounding Box Prediction
Two-Stage Detectors
R-CNN Family
Region Proposal Networks
One-Stage Detectors
YOLO
SSD
Semantic Segmentation
Pixel-wise Classification
Fully Convolutional Networks
U-Net Architecture
Instance Segmentation
Combining Detection and Segmentation
Mask R-CNN
Image Generation
Generative Models in Vision
Style Transfer
Transfer Learning with CNNs
Pre-trained Models
ImageNet Pre-training
Model Zoos
Fine-Tuning Strategies
Layer Freezing
Learning Rate Adjustment
Feature Extraction
Fixed Feature Extractor
Bottleneck Features
Domain Adaptation
Applications and Benefits