Publications

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.

End-to-End Learning of Driving Models with Surround-View Cameras and Route Planners

Simon Hecker, Dengxin Dai, and Luc Van Gool
European Conference on Computer Vision
September 2018

Abstract

For human drivers, having rear and side-view mirrors is vital for safe driving. They deliver a more complete view of what is happening around the car. Human drivers also heavily exploit their mental map for navigation. Nonetheless, several methods have been published that learn driving models with only a front-facing camera and without a route planner. This lack of information renders the self-driving task quite intractable. We investigate the problem in a more realistic setting, which consists of a surround-view camera system with eight cameras, a route planner, and a CAN bus reader. In particular, we develop a sensor setup that provides data for a 360-degree view of the area surrounding the vehicle, the driving route to the destination, and low-level driving maneuvers (e.g. steering angle and speed) by human drivers. With such a sensor setup we collect a new driving dataset, covering diverse driving scenarios and varying weather/illumination conditions. Finally, we learn a novel driving model by integrating information from the surround-view cameras and the route planner. Two route planners are exploited: 1) one that represents the planned routes on OpenStreetMap as a stack of GPS coordinates, and 2) one that renders the planned routes on TomTom Go Mobile and records the progression as a video. Our experiments show that: 1) 360-degree surround-view cameras help avoid failures made with a single front-view camera, in particular for city driving and intersection scenarios; and 2) route planners help the driving task significantly, especially for steering angle prediction. Code, data, and more visual results will be made available at http://www.vision.ee.ethz.ch/~heckers/Drive360.
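For illustration, the sketch below shows one way such a model could fuse features from eight surround-view cameras with a route-planner input (a stack of GPS coordinates) to regress steering angle and speed. This is a minimal, hypothetical PyTorch example under assumed dimensions, not the authors' implementation; all module and tensor names are placeholders.

# Hypothetical sketch: fuse per-camera features with a route-planner
# representation (a stack of GPS coordinates) to predict steering and speed.
# Not the paper's code; module names and sizes are illustrative assumptions.
import torch
import torch.nn as nn

class SurroundViewDrivingModel(nn.Module):
    def __init__(self, num_cameras=8, route_len=10, feat_dim=128):
        super().__init__()
        # Shared per-camera encoder (a tiny CNN stands in for a pretrained backbone).
        self.camera_encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        # Route-planner branch: a stack of (lat, lon) waypoints ahead of the car.
        self.route_encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(route_len * 2, feat_dim), nn.ReLU(),
        )
        # Fusion head regresses low-level maneuvers: steering angle and speed.
        self.head = nn.Sequential(
            nn.Linear(num_cameras * feat_dim + feat_dim, 256), nn.ReLU(),
            nn.Linear(256, 2),
        )

    def forward(self, images, route):
        # images: (B, num_cameras, 3, H, W); route: (B, route_len, 2)
        b = images.size(0)
        cam_feats = self.camera_encoder(images.flatten(0, 1))  # (B*C, feat_dim)
        cam_feats = cam_feats.view(b, -1)                      # (B, C*feat_dim)
        route_feats = self.route_encoder(route)                # (B, feat_dim)
        out = self.head(torch.cat([cam_feats, route_feats], dim=1))
        return out[:, 0], out[:, 1]                            # steering, speed

if __name__ == "__main__":
    model = SurroundViewDrivingModel()
    imgs = torch.randn(2, 8, 3, 96, 160)   # batch of surround-view frames
    route = torch.randn(2, 10, 2)          # stack of upcoming GPS coordinates
    steering, speed = model(imgs, route)
    print(steering.shape, speed.shape)     # torch.Size([2]) torch.Size([2])

The video-based route representation from TomTom Go Mobile would replace the coordinate branch with a second image encoder; the fusion step stays the same.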


@InProceedings{eth_biwi_01442,
  author = {Simon Hecker and Dengxin Dai and Luc Van Gool},
  title = {End-to-End Learning of Driving Models with Surround-View Cameras and Route Planners},
  booktitle = {European Conference on Computer Vision},
  year = {2018},
  month = {September}
}