Deep Neural Networks in Autonomous Driving

By Gil Golov - 2018-03-21

Rear-end crashes reduced by 27% for vehicles with forward collision warning systems – Insurance Institute for Highway Safety, 2016

With Advanced Driver Assistance Systems (ADAS) cascading down to lower-end car models, and with a strong consumer take rate, car makers continue to innovate and integrate more capable autonomous systems into cars, including:

  • Safety features that reduce human error in driving
  • Security systems that increase system-level security to thwart malicious hacking
  • Deep neural networks which use artificial intelligence to detect and recognize objects as well as—if not better than—humans
  • Sensor fusion which combines multiple sensor types to yield better “vision” than humans
  • In-cabin cameras to monitor driver status, including drowsy driver detection, to ensure seamless hand-off between car and driver in autonomous situations
  • V2X technology which further extends ADAS safety by providing vehicle-to-vehicle, vehicle-to-infrastructure communication in support of advanced traffic hazard communication

Deep neural networks (DNNs) play a critical role in realizing the future of autonomous driving. Implementing object detection and classification via DNNs consists of two phases: training and inference. During training, enormous amounts of labeled data are used to train the system and develop an algorithm that can consistently and accurately detect an object. Once the neural network has been trained, it is deployed in the field (inference), where the algorithm is used to accurately detect objects in real time, including cars, pedestrians, street signs, bicyclists and more.
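The two phases can be sketched in miniature with a single artificial neuron trained by gradient descent. Everything here is an illustrative assumption (toy features, toy labels, a one-neuron "network"), not an actual automotive pipeline, but the split between a training loop and a frozen-weights inference step is the same idea at scale.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Phase 1 -- training: labeled examples (features, label), where
# label 1 = "pedestrian" and 0 = "background" (toy labels).
labeled_data = [([0.9, 0.8], 1), ([0.8, 0.9], 1),
                ([0.1, 0.2], 0), ([0.2, 0.1], 0)]

w, b, lr = [0.0, 0.0], 0.0, 1.0
for _ in range(1000):                       # gradient-descent epochs
    for x, y in labeled_data:
        p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        err = p - y                         # gradient of the log loss
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

# Phase 2 -- inference: the frozen weights classify new inputs.
def detect(features):
    return sigmoid(w[0] * features[0] + w[1] * features[1] + b) > 0.5

print(detect([0.85, 0.90]))   # pedestrian-like features
print(detect([0.15, 0.10]))   # background-like features
```

A production network replaces the single neuron with millions of them, but the deployment story is unchanged: the expensive training loop runs offline, and only the learned weights ship in the vehicle.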

For the autonomous driving DNN algorithm to be realized, huge numbers of parameters must be processed to accurately detect surroundings. To understand the magnitude:

  • SuperLotto: Correctly guessing five numbers (1-47) plus one mega number (1-27) means picking the winner out of roughly 41M combinations
  • Complex DNNs: Each 32-bit parameter can take roughly 4 billion (2^32) values, so a network with approximately 100M parameters spans on the order of (4B)^100M configurations
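The arithmetic behind both bullets can be checked directly. The lottery count is an exact combinatorial calculation; the parameter-space figure is so large that the sketch below reports only its length in decimal digits.

```python
from math import comb, log10

# SuperLotto: choose 5 of 47 numbers, times 27 possible mega numbers.
lotto = comb(47, 5) * 27
print(lotto)                 # 41,416,353 -- roughly the 41M in the text

# Complex DNN: 100M parameters, each a 32-bit value, gives
# (2**32) ** 100_000_000 raw configurations. Count its decimal digits
# via logarithms rather than materializing the number itself.
digits = 100_000_000 * 32 * log10(2)
print(round(digits))         # ~963 million digits long
```

For comparison, the lottery odds fit in a signed 32-bit integer, while merely *writing down* the DNN configuration count would take a number almost a billion digits long; training works because gradient descent searches this space rather than enumerating it.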

As we move closer to autonomous driving, automotive memory bandwidth also increases in direct correlation to the complexity of the DNN. Today, the automotive industry has already showcased platforms that require more than 1 terabyte per second (TB/s) of memory bandwidth for the computation associated with the highest level of autonomous driving: level 5, a fully autonomous system. Practical implementations with this level of memory bandwidth can only be realized using GDDR memories, traditionally intended for the graphics market.
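A back-of-the-envelope estimate shows how quickly such a figure accumulates. Every value below except the 100M parameter count is an assumed, illustrative number (concurrent networks, frame rate, weight re-reads per frame), not a measured platform specification.

```python
# Rough level-5 bandwidth estimate; all inputs are illustrative assumptions.
params = 100_000_000     # parameters per network (from the text)
bytes_per_param = 4      # 32-bit weights
networks = 10            # assumed concurrent DNNs in the driving stack
fps = 30                 # assumed camera frame rate
passes = 8               # assumed weight/activation re-reads per frame

bandwidth = params * bytes_per_param * networks * fps * passes
print(bandwidth / 1e12, "TB/s")   # 0.96 TB/s -- near the 1 TB/s cited
```

Even with these modest assumptions the total lands near 1 TB/s, which is why graphics-class memory rather than conventional automotive DRAM becomes the practical option.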

Micron’s introduction of automotive-grade GDDR6 memory is the essential puzzle piece that will enable next-generation autonomous driving today. With over 25 years’ commitment to the automotive market and as a recognized industry leader in GDDR memory, Micron’s GDDR6 products offer high densities, high bandwidth and discrete design to simplify system integration and meet the demands of high-performance automotive applications.


Gil Golov


Gil Golov, senior manager of Automotive System Architecture & Strategic Marketing, is responsible for Micron’s autonomous driving system architecture and solutions. Prior to working for Micron, Gil spent more than 15 years in various R&D roles. He holds a Bachelor of Science in electronics from Tel Aviv University and a Master of Science in microelectronics from Brunel University (in the U.K.). Gil holds 26 patents.