Lab 15: AutoML & Neural Architecture Search

Objective

Automate ML pipeline design: hyperparameter optimisation with Bayesian search, neural architecture search (NAS), automated feature engineering, and ensemble construction, applied to building a strong intrusion-detection model without manual tuning.

Time: 50 minutes | Level: Advanced | Docker Image: zchencow/innozverse-ai:latest


Background

Traditional ML workflow:
  Human picks: features → algorithm → hyperparams → repeat manually

AutoML:
  Algorithm selection → hyperparameter optimisation → feature engineering → ensemble
  All automated. Human only provides: data + metric + compute budget.

Key techniques:
  Grid search:      exhaustive, O(n^k) — impractical beyond 3 params
  Random search:    better than grid for high-D (Bergstra & Bengio 2012)
  Bayesian optimisation: surrogate model of objective → smarter exploration
  NAS:              search over model architectures, not just hyperparams
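
The grid-versus-random trade-off is easiest to see on a toy objective where only one hyperparameter really matters (the setting Bergstra & Bengio analyse). A minimal sketch; the objective, grids, and budget below are illustrative, not from the lab:

```python
import itertools
import random

# Toy objective: a validation score that depends strongly on one
# hyperparameter (learning-rate exponent) and barely on the other (depth).
def objective(lr_exp, depth):
    return -(lr_exp + 2.7) ** 2 - 0.001 * depth

# Grid search: k params with n values each costs n**k evaluations,
# but tests only n distinct values of the important parameter.
grid_lr = [-5, -4, -3, -2, -1]
grid_depth = [2, 4, 6, 8, 10]
grid_best = max(objective(lr, d) for lr, d in itertools.product(grid_lr, grid_depth))
grid_cost = len(grid_lr) * len(grid_depth)   # 25 evaluations, 5 distinct lr values

# Random search: same 25-evaluation budget, but 25 distinct lr values,
# so the important dimension is sampled far more densely.
random.seed(0)
rand_best = max(
    objective(random.uniform(-5, -1), random.randint(2, 10))
    for _ in range(grid_cost)
)

print(f"grid   best: {grid_best:.4f}")
print(f"random best: {rand_best:.4f}")
```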

Step 1: Bayesian Hyperparameter Optimisation

📸 Verified Output:
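
The Bayesian loop can be built from first principles: fit a surrogate to the evaluations so far, then evaluate wherever the acquisition function is highest. Below is a minimal 1-D sketch with a NumPy Gaussian-process surrogate and an expected-improvement acquisition; the toy objective, RBF length-scale, candidate grid, and 10-iteration budget are illustrative assumptions (the lab's pipeline might instead use a library such as Optuna or scikit-optimize):

```python
import math
import numpy as np

# Toy objective to minimise: pretend this is validation error of an IDS
# model as a function of one hyperparameter (e.g. log10 learning rate).
def objective(x):
    return (x + 2.5) ** 2 + 0.3 * math.sin(5 * x)

def rbf(a, b, length=0.5):
    # Squared-exponential kernel between two point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Standard GP regression: posterior mean and variance at candidates Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = np.diag(rbf(Xs, Xs) - Ks.T @ Kinv @ Ks)
    return mu, np.maximum(var, 1e-12)

def expected_improvement(mu, var, best):
    # EI for minimisation; normal CDF via math.erf, so no SciPy needed.
    sigma = np.sqrt(var)
    z = (best - mu) / sigma
    cdf = 0.5 * (1 + np.vectorize(math.erf)(z / math.sqrt(2)))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (best - mu) * cdf + sigma * pdf

rng = np.random.default_rng(0)
X = rng.uniform(-5, 0, 3)          # 3 random initial evaluations
y = np.array([objective(x) for x in X])
cand = np.linspace(-5, 0, 200)     # candidate grid for the acquisition

for _ in range(10):                # 10 Bayesian-optimisation iterations
    mu, var = gp_posterior(X, y, cand)
    x_next = cand[np.argmax(expected_improvement(mu, var, y.min()))]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

print(f"best x={X[y.argmin()]:.3f}  best objective={y.min():.4f}")
```

Note the smarter-exploration property: every new evaluation goes where the surrogate predicts either a low mean (exploitation) or high uncertainty (exploration), instead of a fixed grid.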


Step 2: Neural Architecture Search

📸 Verified Output:
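
An evolutionary NAS loop searches over architecture encodings rather than hyperparameter values: mutate a population of candidate networks and keep the fittest. A minimal sketch; the layer-width encoding, proxy fitness, and mutation scheme below are illustrative assumptions, not the lab's actual search space:

```python
import random

random.seed(42)

# An "architecture" is encoded as a list of hidden-layer widths.
WIDTHS = [16, 32, 64, 128]

def random_arch():
    return [random.choice(WIDTHS) for _ in range(random.randint(1, 4))]

def fitness(arch):
    # Proxy fitness: in the real lab this would be validation accuracy of a
    # trained model; here we reward capacity but penalise parameter count.
    params = sum(a * b for a, b in zip([20] + arch, arch + [2]))
    return sum(arch) / 100 - params / 20000

def mutate(arch):
    arch = list(arch)
    op = random.random()
    if op < 0.3 and len(arch) < 4:      # add a layer
        arch.insert(random.randrange(len(arch) + 1), random.choice(WIDTHS))
    elif op < 0.6 and len(arch) > 1:    # remove a layer
        arch.pop(random.randrange(len(arch)))
    else:                               # resize a layer
        arch[random.randrange(len(arch))] = random.choice(WIDTHS)
    return arch

# Evolutionary loop: truncation selection + mutation, O(generations x population).
pop = [random_arch() for _ in range(20)]
for gen in range(15):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:10]                # keep the top half
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

best = max(pop, key=fitness)
print(f"best architecture: {best}  fitness={fitness(best):.4f}")
```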


Step 3: Automated Feature Engineering

📸 Verified Output:
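
Automated feature engineering can be sketched as generate-then-select: apply a library of transforms to every raw column (and column pair), then keep the candidates that score best against the label. The toy data, transform set, and correlation-based scoring below are illustrative assumptions:

```python
import math
import random

random.seed(1)

# Toy tabular data: two raw numeric features per flow, binary label.
# In the lab this would be the intrusion-detection dataset's columns.
rows = [(random.uniform(1, 100), random.uniform(0.1, 10)) for _ in range(200)]
labels = [1 if math.log(a) * b > 8 else 0 for a, b in rows]

# Candidate transforms applied automatically to each feature / pair.
TRANSFORMS = {
    "log(a)":   lambda a, b: math.log(a),
    "log(b)":   lambda a, b: math.log(b),
    "a*b":      lambda a, b: a * b,
    "a/b":      lambda a, b: a / b,
    "log(a)*b": lambda a, b: math.log(a) * b,
}

def abs_corr(xs, ys):
    # Absolute Pearson correlation, used as a cheap relevance score.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    vy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return abs(cov / (vx * vy)) if vx and vy else 0.0

# Generate every candidate feature, score it, keep the top k.
scores = {
    name: abs_corr([f(a, b) for a, b in rows], labels)
    for name, f in TRANSFORMS.items()
}
top = sorted(scores, key=scores.get, reverse=True)[:3]
for name in top:
    print(f"{name:10s} |corr|={scores[name]:.3f}")
```

Real systems (e.g. deep feature synthesis) stack transforms recursively and prune with a model-based score rather than raw correlation, but the generate/score/select shape is the same.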


Steps 4–8: Capstone — Full AutoML Pipeline

📸 Verified Output:
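
The capstone can be sketched as a budget-driven controller: spend part of the evaluation budget exploring model families, the rest refining the best family found, then ensemble the top configurations. The model names, scoring stub, and budget split below are illustrative assumptions, not the lab's actual pipeline:

```python
import random

random.seed(7)

# "evaluate" stands in for a real train/validate run on the IDS dataset;
# each call consumes one unit of the evaluation budget.
def evaluate(model, params):
    base = {"tree": 0.90, "linear": 0.84, "knn": 0.87}[model]
    return base + random.uniform(-0.03, 0.03) - 0.01 * abs(params - 5)

BUDGET = 30
results = []

# Stage 1 (half the budget): random search across all model families.
for _ in range(BUDGET // 2):
    model = random.choice(["tree", "linear", "knn"])
    params = random.uniform(0, 10)
    results.append((evaluate(model, params), model, params))

# Stage 2 (remaining budget): refine around the best family found so far.
_, best_model, best_params = max(results)
for _ in range(BUDGET - BUDGET // 2):
    params = best_params + random.gauss(0, 1)
    results.append((evaluate(best_model, params), best_model, params))

# Stage 3: ensemble the top 3 configurations by averaging their scores
# (a stand-in for averaging their predictions at inference time).
top3 = sorted(results, reverse=True)[:3]
ensemble_score = sum(s for s, _, _ in top3) / 3
print(f"best single: {top3[0][0]:.4f} ({top3[0][1]})")
print(f"ensemble of top 3: {ensemble_score:.4f}")
```

The human inputs match the Background section exactly: data (inside `evaluate`), a metric (the returned score), and a compute budget (`BUDGET`); everything else is decided by the controller.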


Summary

| Technique        | Search Space  | Time Complexity | When to Use          |
|------------------|---------------|-----------------|----------------------|
| Random Search    | Hyperparams   | O(n)            | Quick baseline       |
| Bayesian Opt     | Hyperparams   | O(n·GP)         | Limited budget       |
| Evolutionary NAS | Architectures | O(gen×pop)      | Custom architectures |
| Auto feature eng | Transforms    | O(k)            | Tabular data         |
| Full AutoML      | All of above  | Budget-based    | Production           |
