Autonomous Driving: The Anatomy of Autonomy


Part II: Technical Architecture and Components

2019 · 5 min read · MacroPolo
Archive Notice: This article was originally published on macropolo.org in 2019. MacroPolo was the Paulson Institute's in-house think tank (2018–2024). This archived version preserves the original research for continued citation and reference.

Building on our overview of the autonomous driving landscape, this article examines the technical architecture that makes self-driving vehicles possible.

Sensing Stack

Autonomous vehicles rely on multiple sensor types working together:

  • LiDAR for precise 3D mapping of surroundings
  • Cameras for visual recognition and reading signs
  • Radar for detecting motion and measuring distance
  • Ultrasonic sensors for close-range detection
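Because each sensor has different strengths and noise characteristics, their readings are typically combined rather than used in isolation. A minimal sketch of one common fusion step, inverse-variance weighting of two range estimates (a simplified Kalman-style update; the sensor names and noise values below are illustrative assumptions, not figures from any production system):

```python
# Hypothetical sketch: fusing distance estimates from two sensors by
# inverse-variance weighting. Each measurement is (value_m, variance);
# lower variance means the sensor is trusted more.

def fuse_estimates(measurements):
    """Return the fused estimate and its variance."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused = sum(v * w for (v, _), w in zip(measurements, weights)) / total
    fused_variance = 1.0 / total
    return fused, fused_variance

# Illustrative numbers: radar reads 25.4 m with high noise,
# LiDAR reads 25.1 m with much tighter variance.
distance, var = fuse_estimates([(25.4, 0.5), (25.1, 0.05)])
```

The fused estimate lands close to the LiDAR reading (about 25.13 m here) because its variance is lower, and the combined variance is smaller than either input's, which is the point of fusing multiple sensors.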

Perception and Planning

The AI system must interpret sensor data to understand the environment and make driving decisions. This involves object detection, tracking, prediction, and path planning.
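The chain from tracking through prediction to planning can be sketched with a toy example. The constant-velocity predictor, the safety gap, and all numbers below are simplifying assumptions chosen for illustration; real systems use far richer motion models and planners:

```python
# Illustrative sketch of the predict-then-plan step described above:
# given a tracked object ahead, predict its future position and adjust
# ego speed if the predicted gap gets too small. All values are
# hypothetical.

from dataclasses import dataclass

@dataclass
class Track:
    obj_id: int
    position: float   # distance ahead along the lane, meters
    velocity: float   # meters per second

def predict(track: Track, horizon_s: float) -> float:
    """Constant-velocity prediction of the object's future position."""
    return track.position + track.velocity * horizon_s

def plan_speed(ego_speed: float, track: Track, horizon_s: float,
               safe_gap: float = 10.0) -> float:
    """Match the lead vehicle's speed if the predicted gap is unsafe."""
    ego_future = ego_speed * horizon_s
    obstacle_future = predict(track, horizon_s)
    gap = obstacle_future - ego_future
    return min(ego_speed, track.velocity) if gap < safe_gap else ego_speed

# A car 30 m ahead moving at 8 m/s; ego at 15 m/s, 3-second horizon.
speed = plan_speed(15.0, Track(obj_id=1, position=30.0, velocity=8.0), 3.0)
```

In this toy case the predicted gap after three seconds shrinks below the safety margin, so the planner drops the ego speed to match the lead vehicle's 8 m/s rather than keep closing the distance.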

China's Approach

Chinese companies like Baidu, Pony.ai, and WeRide are developing their own full-stack autonomous driving solutions, often with unique approaches to sensor fusion and AI.