Building Neural Networks with torch.nn
The nn.Module Class
Understanding nn.Module
Base Class for All Neural Networks
Parameter Management
State Management
Creating Custom Modules
Subclassing nn.Module
__init__() Method Implementation
forward() Method Implementation
Module Hierarchy
Parent and Child Modules
Module Registration
Named Modules and Parameters
Parameter and Buffer Management
Registering Parameters
Registering Buffers
Parameter Initialization
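A minimal sketch tying the nn.Module topics above together: subclassing nn.Module, implementing __init__() and forward(), registering a buffer and an extra parameter, and applying an explicit weight initialization. The class and attribute names (SmallNet, step_count, scale) and the layer sizes are illustrative assumptions, not taken from the text.

import torch
import torch.nn as nn

class SmallNet(nn.Module):
    def __init__(self, in_features=10, hidden=32, out_features=2):
        super().__init__()
        # Child modules are registered automatically when assigned as attributes.
        self.fc1 = nn.Linear(in_features, hidden)
        self.fc2 = nn.Linear(hidden, out_features)
        # A buffer is saved in state_dict but is not a learnable parameter.
        self.register_buffer("step_count", torch.zeros(1))
        # An explicitly registered learnable parameter.
        self.scale = nn.Parameter(torch.ones(1))
        # Custom weight initialization via torch.nn.init.
        nn.init.kaiming_uniform_(self.fc1.weight, nonlinearity="relu")
        nn.init.zeros_(self.fc1.bias)

    def forward(self, x):
        self.step_count += 1
        x = torch.relu(self.fc1(x))
        return self.scale * self.fc2(x)

net = SmallNet()
out = net(torch.randn(4, 10))                      # batch of 4 samples
print([name for name, _ in net.named_parameters()])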
Linear Layers
Fully Connected Layers (nn.Linear)
Input and Output Features
Weight Matrix
Bias Vector
Mathematical Formulation
Linear Layer Variations
Bilinear Layers
Identity Layers
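The following sketch illustrates nn.Linear: in_features and out_features determine the weight matrix shape (out_features, in_features) and the bias shape (out_features,), and the layer computes y = x W^T + b. The sizes used here are arbitrary examples.

import torch
import torch.nn as nn

linear = nn.Linear(in_features=20, out_features=5)   # weight: (5, 20), bias: (5,)
x = torch.randn(3, 20)                                # batch of 3 samples

y = linear(x)                                         # shape (3, 5)
# Manually reproduce the mathematical formulation y = x W^T + b.
manual = x @ linear.weight.T + linear.bias
print(torch.allclose(y, manual))                      # True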
Convolutional Layers
1D Convolution (nn.Conv1d)
Input and Output Channels
Kernel Size
Stride and Padding
Dilation
2D Convolution (nn.Conv2d)
Feature Map Processing
Spatial Dimensions
Parameter Sharing
3D Convolution (nn.Conv3d)
Transposed Convolution
Upsampling Operations
Deconvolution Concept
Depthwise and Separable Convolutions
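A short sketch of the convolution layers above, including a depthwise separable convolution built from a depthwise nn.Conv2d (groups equal to the channel count) followed by a 1x1 pointwise nn.Conv2d. Channel counts and the input size are illustrative assumptions.

import torch
import torch.nn as nn

x = torch.randn(1, 3, 32, 32)   # (batch, channels, height, width)

# Standard 2D convolution: 3 input channels -> 16 feature maps.
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, stride=1, padding=1)
print(conv(x).shape)            # torch.Size([1, 16, 32, 32])

# Depthwise separable convolution: depthwise conv (groups=in_channels)
# followed by a 1x1 pointwise conv, using far fewer parameters.
depthwise = nn.Conv2d(16, 16, kernel_size=3, padding=1, groups=16)
pointwise = nn.Conv2d(16, 32, kernel_size=1)
print(pointwise(depthwise(conv(x))).shape)   # torch.Size([1, 32, 32, 32])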
Pooling Layers
Max Pooling
nn.MaxPool1d
nn.MaxPool2d
nn.MaxPool3d
Average Pooling
nn.AvgPool1d
nn.AvgPool2d
nn.AvgPool3d
Adaptive Pooling
nn.AdaptiveMaxPool2d
nn.AdaptiveAvgPool2d
Global Pooling
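A quick sketch of the pooling variants above; adaptive pooling fixes the output size regardless of the input size, which is how global pooling is commonly expressed (output size 1x1). Shapes are illustrative assumptions.

import torch
import torch.nn as nn

x = torch.randn(1, 16, 32, 32)

max_pool = nn.MaxPool2d(kernel_size=2)        # halves the spatial dimensions
avg_pool = nn.AvgPool2d(kernel_size=2)
print(max_pool(x).shape, avg_pool(x).shape)   # both (1, 16, 16, 16)

# Global average pooling via adaptive pooling to a 1x1 output.
gap = nn.AdaptiveAvgPool2d(output_size=1)
print(gap(x).shape)                           # (1, 16, 1, 1)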
Recurrent Layers
Simple RNN (nn.RNN)
Vanilla RNN Architecture
Hidden State Processing
Long Short-Term Memory (nn.LSTM)
LSTM Cell Architecture
Bidirectional LSTM
Gated Recurrent Unit (nn.GRU)
GRU Cell Architecture
Bidirectional GRU
RNN Input and Output Handling
Sequence Length Handling
Batch First Option
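A minimal sketch of a bidirectional nn.LSTM, showing the batch_first input layout and the shapes of the output and the final hidden and cell states. The dimensions are arbitrary examples.

import torch
import torch.nn as nn

# batch_first=True makes the input shape (batch, seq_len, features).
lstm = nn.LSTM(input_size=8, hidden_size=16, num_layers=1,
               batch_first=True, bidirectional=True)

x = torch.randn(4, 10, 8)        # 4 sequences, 10 time steps, 8 features
output, (h_n, c_n) = lstm(x)
print(output.shape)              # (4, 10, 32): 2 directions * hidden_size
print(h_n.shape)                 # (2, 4, 16): (num_layers * num_directions, batch, hidden)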
Activation Functions
Linear Activations
Identity Function
Non-linear Activations
Sigmoid
Tanh
ReLU
LeakyReLU
PReLU
ELU
SELU
Swish
GELU
Softmax and LogSoftmax
Choosing Activation Functions
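A few of the activations above applied to a small tensor, plus Softmax turning logits into probabilities (LogSoftmax is its logarithm, usually paired with NLLLoss). The input values are illustrative.

import torch
import torch.nn as nn

x = torch.linspace(-3, 3, 7)

print(nn.ReLU()(x))          # zeroes out negative values
print(nn.LeakyReLU(0.1)(x))  # small slope for negative values
print(nn.GELU()(x))
print(nn.Sigmoid()(x))

logits = torch.randn(2, 5)
probs = nn.Softmax(dim=1)(logits)
print(probs.sum(dim=1))      # each row sums to 1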
Normalization Layers
Batch Normalization
nn.BatchNorm1d
nn.BatchNorm2d
nn.BatchNorm3d
Layer Normalization
Instance Normalization
Group Normalization
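A sketch comparing the normalization layers above on the same 4D input; they differ in which dimensions the statistics are computed over, but all preserve the input shape. The sizes and group count are illustrative assumptions.

import torch
import torch.nn as nn

x = torch.randn(8, 16, 32, 32)              # (batch, channels, height, width)

bn = nn.BatchNorm2d(num_features=16)        # per channel, over batch + spatial dims
ln = nn.LayerNorm([16, 32, 32])             # per sample, over the last dims
inorm = nn.InstanceNorm2d(num_features=16)  # per sample, per channel
gn = nn.GroupNorm(num_groups=4, num_channels=16)

for layer in (bn, ln, inorm, gn):
    print(layer(x).shape)                   # shape unchanged: (8, 16, 32, 32)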
Regularization Layers
Dropout
nn.Dropout
nn.Dropout2d
nn.Dropout3d
nn.AlphaDropout
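A short sketch of dropout behaviour: active in training mode (individual elements, or entire channels for nn.Dropout2d, are zeroed and the survivors rescaled), and an identity in eval mode. The probabilities are illustrative.

import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)      # zeroes elements with probability 0.5, rescales the rest
drop2d = nn.Dropout2d(p=0.3)  # zeroes entire feature maps (channels)

x = torch.ones(2, 4)
drop.train()
print(drop(x))                # roughly half the entries are 0, survivors scaled by 1/(1-p)

drop.eval()
print(drop(x))                # identity at evaluation time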
Loss Functions
Regression Losses
Mean Squared Error (MSELoss)
Mean Absolute Error (L1Loss)
Smooth L1 Loss
Huber Loss
Classification Losses
Cross Entropy Loss
Negative Log Likelihood Loss
Binary Cross Entropy Loss
Focal Loss
Custom Loss Functions
Implementing Custom Losses
Combining Multiple Losses
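A sketch of the loss functions above, plus a hypothetical CombinedLoss module showing how a custom loss can be written as an nn.Module and how multiple losses can be weighted together. The class name and the weighting factor alpha are illustrative assumptions.

import torch
import torch.nn as nn

# Regression losses compare continuous predictions to targets.
pred, target = torch.randn(4, 1), torch.randn(4, 1)
print(nn.MSELoss()(pred, target), nn.L1Loss()(pred, target))

# CrossEntropyLoss expects raw logits and integer class indices;
# it combines LogSoftmax and NLLLoss internally.
logits = torch.randn(4, 3)
labels = torch.tensor([0, 2, 1, 1])
print(nn.CrossEntropyLoss()(logits, labels))

# A custom loss is just a module (or function) returning a scalar tensor,
# and several losses can be combined with weights.
class CombinedLoss(nn.Module):
    def __init__(self, alpha=0.7):
        super().__init__()
        self.alpha = alpha
        self.mse = nn.MSELoss()
        self.l1 = nn.L1Loss()

    def forward(self, pred, target):
        return self.alpha * self.mse(pred, target) + (1 - self.alpha) * self.l1(pred, target)

print(CombinedLoss()(pred, target))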