Lab 06: Neural Networks from Scratch (NumPy)
Objective
Background
Input x → [W1, b1] → ReLU → [W2, b2] → ReLU → [W3, b3] → Softmax → Output ŷ
Forward pass: compute ŷ from x
Backward pass: compute ∂Loss/∂W for every weight W
Update: W := W - learning_rate * ∂Loss/∂W
Step 1: Environment Setup
docker run -it --rm zchencow/innozverse-ai:latest bash
import numpy as np
print(f"NumPy: {np.__version__}")
np.random.seed(0)
Step 2: Activation Functions and Their Derivatives
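A minimal sketch of the two activations the architecture uses (ReLU on the hidden layers, softmax on the output) together with the ReLU derivative needed later for backpropagation. The function names are our own choice for this lab:

```python
import numpy as np

def relu(z):
    # Element-wise max(0, z)
    return np.maximum(0.0, z)

def relu_deriv(z):
    # Subgradient of ReLU: 1 where z > 0, else 0
    return (z > 0).astype(z.dtype)

def softmax(z):
    # Subtract the row-wise max before exponentiating for
    # numerical stability; each output row sums to 1.
    shifted = z - z.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=1, keepdims=True)
```

Softmax does not need its own derivative function here: combined with cross-entropy loss, the gradient at the output layer collapses to a simple subtraction (see the backpropagation step).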
Step 3: Weight Initialisation
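One reasonable scheme for ReLU networks is He initialisation (variance 2/n_in), which keeps activation magnitudes stable as depth grows. The layer sizes below are hypothetical placeholders, not values taken from the lab:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    # He initialisation: std = sqrt(2 / n_in) suits ReLU layers.
    W = rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))
    b = np.zeros(n_out)
    return W, b

# Three weight matrices matching the Background architecture:
# input -> hidden 1 -> hidden 2 -> output.
layer_sizes = [784, 128, 64, 10]  # hypothetical sizes for illustration
params = [init_layer(n_in, n_out)
          for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
```

Biases start at zero; initialising every weight to zero instead would make all units in a layer compute identical gradients and never diverge.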
Step 4: Forward Pass
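The forward pass can be sketched as a loop over the (W, b) pairs, applying ReLU to every hidden layer and softmax to the last. Caching the pre-activations and activations here is what makes the backward pass possible later; the structure of `params` and `cache` is our own convention:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def forward(x, params):
    # params: list of (W, b) tuples, one per layer.
    # Cache every pre-activation z and activation a for backprop.
    a_list, z_list = [x], []
    a = x
    for i, (W, b) in enumerate(params):
        z = a @ W + b
        a = softmax(z) if i == len(params) - 1 else relu(z)
        z_list.append(z)
        a_list.append(a)
    return a, {"a": a_list, "z": z_list}
```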
Step 5: Loss Function and Backpropagation
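A sketch of cross-entropy loss and the backward pass under the same `params`/`cache` conventions as the forward-pass sketch. The key simplification: for softmax combined with cross-entropy, the output-layer error is just `probs - one_hot(y)`, averaged over the batch:

```python
import numpy as np

def relu(z): return np.maximum(0.0, z)
def relu_deriv(z): return (z > 0).astype(z.dtype)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, y):
    # y holds integer class labels; clip to avoid log(0).
    n = probs.shape[0]
    return -np.log(np.clip(probs[np.arange(n), y], 1e-12, None)).mean()

def forward(x, params):
    a_list, z_list = [x], []
    a = x
    for i, (W, b) in enumerate(params):
        z = a @ W + b
        a = softmax(z) if i == len(params) - 1 else relu(z)
        z_list.append(z)
        a_list.append(a)
    return a, {"a": a_list, "z": z_list}

def backward(cache, params, y):
    a, zs = cache["a"], cache["z"]
    n = y.shape[0]
    # Softmax + cross-entropy: output error = (probs - one_hot) / n.
    delta = a[-1].copy()
    delta[np.arange(n), y] -= 1.0
    delta /= n
    grads = []
    for i in reversed(range(len(params))):
        W, _ = params[i]
        dW = a[i].T @ delta          # gradient w.r.t. this layer's weights
        db = delta.sum(axis=0)       # gradient w.r.t. this layer's biases
        grads.append((dW, db))
        if i > 0:
            # Propagate the error through W and the ReLU derivative.
            delta = (delta @ W.T) * relu_deriv(zs[i - 1])
    return grads[::-1]
```

A finite-difference check (perturb one weight by ±ε and compare the loss change to the analytic gradient) is a cheap way to validate an implementation like this before training.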
Step 6: Mini-Batch Training and Validation
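A compact end-to-end training loop illustrating the mini-batch pattern: shuffle each epoch, slice batches, run forward/backward on the batch, and apply the update rule from the Background. The data here is a synthetic stand-in (three Gaussian blobs), not the lab's dataset, and the two-hidden-unit-count network is deliberately small:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: three well-separated 2-D Gaussian blobs.
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(100, 2))
               for c in ([0, 0], [3, 0], [0, 3])])
y = np.repeat(np.arange(3), 100)

def relu(z): return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Small 2 -> 16 -> 3 network with He initialisation.
W1 = rng.normal(0, np.sqrt(2 / 2), (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, np.sqrt(2 / 16), (16, 3)); b2 = np.zeros(3)

lr, epochs, batch = 0.1, 50, 32
for epoch in range(epochs):
    order = rng.permutation(len(X))          # reshuffle every epoch
    for start in range(0, len(X), batch):
        idx = order[start:start + batch]
        xb, yb = X[idx], y[idx]
        # Forward pass on the mini-batch.
        z1 = xb @ W1 + b1
        a1 = relu(z1)
        probs = softmax(a1 @ W2 + b2)
        # Backward pass (softmax + cross-entropy shortcut).
        delta2 = probs.copy()
        delta2[np.arange(len(yb)), yb] -= 1.0
        delta2 /= len(yb)
        dW2 = a1.T @ delta2; db2 = delta2.sum(axis=0)
        delta1 = (delta2 @ W2.T) * (z1 > 0)
        dW1 = xb.T @ delta1; db1 = delta1.sum(axis=0)
        # Update: W := W - learning_rate * dLoss/dW.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
```

In the lab proper, the validation set is evaluated once per epoch (forward pass only, no updates) to watch for overfitting.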
Step 7: Adding Dropout Regularisation
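One common formulation is inverted dropout: during training, zero a random fraction of activations and scale the survivors by 1/(1-rate), so that at inference time the layer can be used unchanged. A minimal sketch (the `training` flag and function name are our own convention):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(a, rate, training):
    # Inverted dropout: drop a fraction `rate` of activations and
    # rescale the rest so E[output] == E[input]. At inference
    # (training=False) the activations pass through untouched.
    if not training or rate == 0.0:
        return a
    mask = (rng.random(a.shape) >= rate) / (1.0 - rate)
    return a * mask
```

The same mask must be cached and reapplied to the corresponding delta in the backward pass, so that dropped units receive zero gradient.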
Step 8: Real-World Capstone — Network Intrusion Detection Neural Network
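Whatever dataset the capstone uses, the preprocessing shape is similar: network-flow features are heavy-tailed, so log-scale before z-scoring, then split into train and validation sets. The data below is synthetic stand-in material with an assumed ~10% attack rate, not the lab's actual dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a flow-features matrix (e.g. duration, byte and
# packet counts); synthetic, heavy-tailed like real traffic features.
X = rng.lognormal(mean=0.0, sigma=2.0, size=(1000, 8))
y = (rng.random(1000) < 0.1).astype(int)  # ~10% "attack" labels (assumed)

# Log-scale then z-score: without log1p, a few huge byte counts
# would dominate the gradients.
X = np.log1p(X)
mu, sigma = X.mean(axis=0), X.std(axis=0)
X = (X - mu) / sigma

# 80/20 train/validation split.
idx = rng.permutation(len(X))
split = int(0.8 * len(X))
X_train, y_train = X[idx[:split]], y[idx[:split]]
X_val, y_val = X[idx[split:]], y[idx[split:]]
```

Note the class imbalance: with ~10% positives, report precision and recall on the validation set rather than accuracy alone, since a model that predicts "benign" everywhere already scores ~90% accuracy.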
Summary
Further Reading
Last updated
