Introduction to Self-Driving
9:00 AM - 9:30 AM
In this section we will give a general introduction to self-driving and review the content of this tutorial.
Hardware and Sensors
9:30 AM - 10:00 AM
Learn about different sensor setups (LiDAR, RADAR, camera), the trade-offs between different kinds of sensors, how to integrate the sensors on a vehicle, and how to design the associated compute unit.
☕️ Short break
10:00 AM - 10:10 AM
3D Perception
10:10 AM - 10:40 AM
In this session we will discuss how to build a robust 3D perception system by exploiting information from different sources with different sensor fusion strategies. We will also discuss the output representations that have been used for perception and introduce the challenges of deploying a perception system in the real world, such as recognizing unknown objects and accounting for system latency.
Motion Forecasting
10:40 AM - 11:40 AM
Learn about how and why the future state of the world is forecasted in autonomous driving. We will look into the challenges of this task, different input and output representations, as well as different architectures to tackle this problem.
Motion Planning and Control
11:40 AM - 12:25 PM
In this session, we will discuss various learnable motion planning pipelines, important aspects of the planning problem, and the main approaches to control.
🍔 Lunch break
1:00 PM - 1:45 PM
Vehicle-to-Vehicle (V2V) Communication
1:45 PM - 2:05 PM
In this session, we will discuss how to make self-driving vehicles even safer and better through intelligent communication between connected vehicles as well as infrastructure. We will review existing approaches to V2V communication, their different tradeoffs, and datasets.
Mapping
2:05 PM - 2:35 PM
In this session you will learn how and why maps are used in autonomous driving. We will cover the different kinds of map representations used by tasks like motion forecasting, motion planning, and simulation, and explain their trade-offs. We will also cover online mapping, together with its benefits and challenges.
Localization
2:35 PM - 3:05 PM
This session will help you understand how self-driving vehicles robustly establish their precise position within HD maps in order to leverage them for safe and efficient autonomous driving. We will survey the broad range of approaches to localization, from place recognition, map matching, and point cloud registration to the nascent field of neural SLAM.
🧋 Afternoon break
3:05 PM - 3:20 PM
Data & Evaluation
3:20 PM - 3:50 PM
In this section, we’ll take a step back from machine learning models and provide a broader overview of the ML development cycle, focusing on the importance of data for training and evaluation. In particular, we’ll cover recent trends in self-driving datasets, techniques for dataset curation, and provide a high-level overview of approaches for evaluating self-driving models.
Simulation
Siva Manivasagam and Kelvin Wong
3:50 PM - 5:20 PM
In this session, we'll explain the different components required to build a comprehensive simulator for autonomy testing and development. We'll explain different approaches and recent trends for building virtual worlds, simulating their dynamics, and modelling the vehicle platform interacting within the simulator.
System Safety Validation and Verification
5:20 PM - 5:40 PM
In this session we introduce the concepts and steps required to turn a self-driving system into a safe self-driving product. This will be a broad overview of the expectations and processes around engineering culture, product development, and the creation of a safety argument for an autonomous system.
Concluding Remarks
5:40 PM - 5:55 PM
In this section we will give concluding remarks about the tutorial.