Learning Resource for AD Simulation

Master Autonomous Driving Simulation

From neural rendering and synthetic data to scalable simulation infrastructure — explore the technologies powering next-generation ADAS/AD systems.

6 Domains · 13 Deep Dives · 3 Project Tracks · 20K+ Lines of Content

Why AD Simulation?

Autonomous driving simulation spans neural rendering, synthetic data, sensor physics, and closed-loop testing — each critical for building safe, scalable AD systems.

Neural Rendering

Reconstruct photorealistic 3D scenes from drive logs using Gaussian Splatting and NeRF for sensor-accurate simulation.

Synthetic Data

Generate unlimited auto-labeled training data at scale, reducing real-data needs by up to 90% while maintaining model performance.

Sensor Physics

Simulate camera, lidar, and radar with physics-based models including ray tracing, beam divergence, and multipath effects.

Closed-Loop Testing

Validate AD systems end-to-end with reactive agents, scenario generation, and realism metrics.

Deep Dive Papers

Comprehensive guides covering theory, implementation, mental models, and interview preparation for each major topic.

Deep Dive · 45 min

Data-Driven Simulation

Core simulator architecture, metrics system, and evaluation framework for data-driven autonomous driving simulation.

Deep Dive · 50 min

WOSAC Challenge

The Waymo Open Sim Agents Challenge (WOSAC): evaluation framework, realism metrics, and winning strategies from 2023-2025.

Deep Dive · 55 min

JAX Scaling RL

How to scale RL training across GPUs/TPUs with JAX primitives (jit, vmap, pmap, scan) and distributed PPO.
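As a taste of what this deep dive covers, the primitives named above can be sketched in a few lines (pmap is omitted here since it requires multiple devices); the toy functions below are illustrative stand-ins, not part of any cited framework:

```python
import jax
import jax.numpy as jnp

# jit: compile a function once with XLA, then reuse the compiled version.
@jax.jit
def step(x):
    return x * 2.0 + 1.0

# vmap: write a per-agent function, vectorize it over a batch of agents.
def agent_reward(pos, goal):
    return -jnp.linalg.norm(pos - goal)

batched_reward = jax.vmap(agent_reward, in_axes=(0, None))

# scan: run a sequential rollout loop inside the compiled graph,
# avoiding Python-level loop overhead.
def rollout_step(carry, _):
    carry = carry + 0.1
    return carry, carry  # (new carry, per-step output)

final, trajectory = jax.lax.scan(rollout_step, 0.0, None, length=5)

positions = jnp.array([[0.0, 0.0], [3.0, 4.0]])
goal = jnp.array([0.0, 0.0])
rewards = batched_reward(positions, goal)  # one reward per agent
```

These three transforms compose freely (e.g. `jax.jit(jax.vmap(...))`), which is the basis for the vectorized, device-parallel RL pipelines discussed in the paper.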

Deep Dive · 50 min

V-Max Framework

Complete RL training pipeline including ScenarioMax, observation design, and reward hierarchy for driving policies.

Deep Dive · 40 min

BehaviorGPT

State-of-the-art sim-agent modeling with transformers and Next-Patch Prediction: the approach that won the 2024 WOSAC.

Deep Dive · 55 min

Sim-to-Real Gap

Bridging virtual and physical worlds: perception, actuation, and behavioral gaps with neural rendering and world models.

Deep Dive · 50 min

Long-Tail Scenarios

Safety-critical testing at scale: adversarial generation, scenario mining, and coverage metrics for AV validation.

Deep Dive · 55 min

Distributed Training

Scaling RL to billions of steps: PureJaxRL, actor-learner architectures, and GPU-accelerated simulation infrastructure.

Deep Dive · 50 min

Neural Rendering for AD

3D Gaussian Splatting, NeRF, NeuRAD, SplatAD, and differentiable rendering for photorealistic sensor simulation.

Deep Dive · 45 min

Synthetic Data for Perception

Data generation pipelines, domain randomization, auto-labeling, and domain gap mitigation for perception training.

Deep Dive · 50 min

Physics-Based Sensor Simulation

Camera, lidar, and radar physics modeling with ray tracing, Vulkan rendering engines, and multi-fidelity approaches.

Deep Dive · 45 min

World Models for AD

GAIA-1, Waymo World Model, DriveDreamer, and generative simulation as an alternative to reconstruction-based approaches.

Deep Dive · 55 min

Applied Intuition Platform

Complete platform analysis: Neural Sim, Synthetic Data, Sensor Sim, SDS autonomy stack, and competitive landscape.

Structured Learning Path

Progress from JAX fundamentals through neural rendering, sensor simulation, and advanced topics with our structured curriculum.

Weeks 1-2

JAX Fundamentals

  • JIT Compilation
  • vmap Vectorization
  • pmap Parallelism
  • scan for Sequences

Weeks 3-4

Data-Driven Simulation

  • Data-Driven Sim
  • WOSAC Metrics
  • Log Playback
  • Agent Interfaces

Weeks 5-6

Neural Rendering

  • 3D Gaussian Splatting
  • NeRF Fundamentals
  • Scene Reconstruction
  • Novel View Synthesis

Weeks 7-8

Sensor Sim & Synthetic Data

  • Physics-Based Models
  • Data Pipelines
  • Domain Gap Mitigation
  • Auto-Labeling

Weeks 9-10

Behavior Modeling

  • BehaviorGPT
  • RL Training
  • V-Max Framework
  • Realism Metrics

Weeks 11-12

Advanced Topics

  • World Models
  • Distributed Training
  • Closed-Loop Evaluation
  • Generative Simulation

Key Insights

Critical lessons from studying autonomous driving simulation infrastructure

Neural Rendering Closes the Gap

Neural rendering from real drive data minimizes the sim-to-real gap; Gaussian Splatting renders at 100+ FPS with photorealistic quality.

Synthetic Data at Scale

Synthetic data can reduce real training-data needs by up to 90% while maintaining downstream model performance.

Sensor Physics Matters

Physics-based sensor simulation requires modeling photon-level interactions; ray tracing enables accurate reproduction of ghost targets and multipath effects.

World Models are Emerging

World models are emerging as generative alternatives to reconstruction — creating unseen scenarios from language prompts for scalable simulation.

Ready to Start Learning?

Dive into the deep dive papers to understand the theory, explore the hands-on code examples, and master the technologies powering next-generation AD simulation.