AlpaSim is an open-source autonomous vehicle simulation platform designed specifically for research and development. It allows you to test end-to-end AV policies in a closed-loop setting by simulating realistic sensor data, vehicle dynamics, and traffic scenarios within a modular and extensible testbed.

## Documentation Index
Fetch the complete documentation index at: https://mintlify.com/NVlabs/alpasim/llms.txt
Use this file to discover all available pages before exploring further.
## Use cases

AlpaSim is suitable for a wide range of autonomous driving research:

- **Algorithm validation**: Test new autonomous driving algorithms in realistic environments
- **Safety analysis**: Evaluate vehicle behavior in edge cases and challenging scenarios
- **Performance benchmarking**: Compare different models and configurations with regression testing
- **Debugging**: Understand and debug complex autonomous driving behaviors
## Core principles

AlpaSim is built on three foundational principles:

### Sensor fidelity

- **Neural rendering integration**: NuRec integration for photorealistic sensor simulation of novel views
- **High-fidelity camera feeds**: Configurable field of view, resolution, and frame rates
- **Realistic conditions**: Accurate sensor noise and environmental conditions
### Research hackability

- **Python-based implementation**: Built for rapid prototyping and experimentation
- **Modular gRPC interface**: Swap out components with custom implementations
- **Extensive configuration**: Rich configuration options and debugging tools
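Because the services communicate over gRPC behind fixed interfaces, a component can be replaced by any implementation that honors the same contract. The sketch below illustrates that swap pattern in plain Python; the names (`Renderer`, `NeuralRenderer`, `make_renderer`) are hypothetical and stand in for AlpaSim's actual service interfaces.

```python
# Illustrative sketch (hypothetical names, not AlpaSim's real API):
# every renderer implementation satisfies one contract, so the runtime
# never needs to know which one it is talking to.
from abc import ABC, abstractmethod


class Renderer(ABC):
    """Contract that every renderer implementation must satisfy."""

    @abstractmethod
    def render(self, camera_pose: tuple) -> str:
        ...


class NeuralRenderer(Renderer):
    """Full-fidelity renderer (stand-in for a neural-rendering backend)."""

    def render(self, camera_pose):
        return f"neural frame at {camera_pose}"


class StubRenderer(Renderer):
    """Cheap stand-in, useful while debugging the rest of the pipeline."""

    def render(self, camera_pose):
        return f"stub frame at {camera_pose}"


# A configuration option selects the implementation; callers only see Renderer.
RENDERERS = {"neural": NeuralRenderer, "stub": StubRenderer}


def make_renderer(name: str) -> Renderer:
    return RENDERERS[name]()
```

For example, `make_renderer("stub").render((0.0, 0.0, 0.0))` drives the pipeline with the cheap implementation while everything downstream stays unchanged.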
### Horizontal scalability

- **Microservices architecture**: Distributed computing across multiple services
- **Individual component scaling**: Optimal load balancing per service
- **Multi-node deployments**: Support for distributed execution
## Architecture overview

AlpaSim consists of multiple networked microservices that work together to create a complete simulation environment:

- **Renderer**: Provides observed video frames using neural rendering
- **Physics simulation**: Constrains actors to the road surface
- **Runtime**: Orchestrates the simulation and coordinates services
- **Controller**: Manages egomotion history
- **Driver**: Implements the driving policy
- **Traffic simulation**: Simulates other vehicles and traffic
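The closed loop formed by these services can be pictured as a repeated tick: the runtime asks the driver for a command given the latest observation, the physics simulation advances the ego, and the renderer produces the next observation. The sketch below is a toy, single-process version of that loop; all names and signatures are hypothetical, not AlpaSim's actual API.

```python
# Hypothetical sketch of one closed-loop tick: driver -> physics -> renderer,
# coordinated by a runtime loop. Not AlpaSim's real interfaces.
from dataclasses import dataclass


@dataclass
class EgoState:
    x: float = 0.0  # longitudinal position, m
    v: float = 0.0  # longitudinal speed, m/s


def driver_policy(obs: str, state: EgoState) -> float:
    """Stand-in driving policy: always request a fixed acceleration."""
    return 1.0  # m/s^2


def physics_step(state: EgoState, accel: float, dt: float) -> EgoState:
    """Integrate simple longitudinal dynamics for one tick."""
    v = state.v + accel * dt
    return EgoState(x=state.x + v * dt, v=v)


def render(state: EgoState) -> str:
    """Stand-in for the renderer: return an observation token."""
    return f"frame@x={state.x:.2f}"


def run(steps: int, dt: float = 0.1) -> EgoState:
    """Runtime role: orchestrate the services for a fixed number of ticks."""
    state = EgoState()
    obs = render(state)
    for _ in range(steps):
        accel = driver_policy(obs, state)       # Driver
        state = physics_step(state, accel, dt)  # Physics simulation
        obs = render(state)                     # Renderer
    return state
```

In the real system each of these calls crosses a gRPC boundary, which is what allows the components to run, scale, and fail independently.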
## Supported driving policies

AlpaSim currently supports the following driver policies:

- **Alpamayo-R1**: NVIDIA Alpamayo, a VLA driving policy with chain-of-causation reasoning
- **VaVAM**: An autoregressive video-action driving policy
- **TransFuser**: Latent TransFuser v6 (LTFv6) policy developed for NAVSIM (provisional)

Additional model support is coming soon; community contributions are welcome.
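Whatever the model family, a driver policy ultimately maps recent observations to a short planned trajectory for the ego vehicle. As a rough illustration of that shape, here is a trivial constant-velocity baseline; the `Waypoint` type and function name are invented for this sketch and do not reflect AlpaSim's real driver interface.

```python
# Hypothetical sketch of a driver policy's output: a short horizon of
# planned waypoints. Names are illustrative, not AlpaSim's real API.
from dataclasses import dataclass


@dataclass
class Waypoint:
    t: float  # seconds into the future
    x: float  # meters ahead of the ego
    y: float  # lateral offset, meters


def constant_velocity_plan(speed: float, horizon_s: float = 2.0,
                           dt: float = 0.5) -> list:
    """Trivial baseline policy: keep heading straight at the current speed."""
    n = int(horizon_s / dt)
    return [Waypoint(t=(i + 1) * dt, x=(i + 1) * dt * speed, y=0.0)
            for i in range(n)]
```

A learned policy such as Alpamayo-R1 replaces this baseline with a model-driven plan, but plugs into the simulation loop the same way.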
## Data and scenes
AlpaSim uses NuRec reconstructions of real-world driving logs as simulation scenes. These provide photorealistic environments for testing autonomous driving systems.

**Sample data**: Publicly available scenes are hosted on Hugging Face and include the 25.07 release dataset with over 900 validated scenes.

## Next steps

- **Quick start**: Run your first simulation in minutes
- **Onboarding**: Set up dependencies and environment
- **System design**: Learn about the technical architecture