
Integrating Overhead Camera Imagery with Reinforcement Learning to Improve Fuel Economy Through Adaptive Traffic Control...

by Thomas P Karnowski, Ryan A Tokola, Timothy S Oesch, Jeff Price, Tim Gee
Publication Type: Conference Paper
Book Title: Electronic Imaging, Intelligent Robotics and Industrial Applications using Computer Vision 2020
Publication Date:
Page Numbers: 1 to 7
Conference Name: Intelligent Transportation Systems World Congress 2020
Conference Location: Los Angeles, California, United States of America
Conference Sponsor: ITS America
Conference Date: -

Overhead cameras have long been deployed as sensors for traffic control, offering advantages over conventional inductive loop sensors in robustness, ease of installation, and simplicity of maintenance. We explore the utility of such imagery (in particular, from fisheye lens cameras) for real-time intelligent traffic control using reinforcement learning, with the goal of improving fuel economy across an intersection and a grid of intersections. We summarize our initial data collection and our approach to determining whether vehicular fuel consumption can be sensed visually from real-world data. We then describe the deep reinforcement learning topology used to test adaptive control informed by the camera's capabilities. We evaluate the proof of concept in a variety of simulation scenarios using the Simulation of Urban Mobility (SUMO) traffic simulator, covering different estimates of vehicle classification, different traffic distributions, and different traffic densities. Six control policies are compared, ranging from simple fixed timing to heuristic vision-based methods to policies learned via the reinforcement learning approach. Our results demonstrate the feasibility of controlling traffic lights for better fuel efficiency based on visual vehicle estimates from commercial fisheye lens cameras.
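
The abstract describes a SUMO-based evaluation in which a signal-control policy is scored by the fuel consumed at the intersection. The Python sketch below, using SUMO's TraCI API, illustrates one way such a loop could be wired; the config file name, traffic-light ID, phase count, and the fixed-timer placeholder policy are assumptions made for illustration, not the authors' implementation.

```python
"""Minimal SUMO/TraCI sketch: accumulate a fuel-consumption signal while a
placeholder policy switches traffic-light phases. Illustrative only."""
import traci

SUMO_CMD = ["sumo", "-c", "intersection.sumocfg"]  # hypothetical SUMO config
TLS_ID = "center"        # hypothetical traffic-light junction ID
N_PHASES = 4             # assumed number of signal phases
DECISION_INTERVAL = 10   # simulation steps between phase decisions


def total_fuel_rate():
    """Sum instantaneous fuel consumption over all vehicles in the network
    (SUMO reports ml/s or mg/s depending on version)."""
    return sum(traci.vehicle.getFuelConsumption(v)
               for v in traci.vehicle.getIDList())


def choose_phase(step):
    """Placeholder policy: cycle phases on a fixed timer. In the paper this
    decision comes from a learned deep RL policy driven by vehicle estimates
    from the fisheye camera."""
    return (step // DECISION_INTERVAL) % N_PHASES


def run(max_steps=3600):
    traci.start(SUMO_CMD)
    episode_fuel = 0.0
    try:
        for step in range(max_steps):
            if step % DECISION_INTERVAL == 0:
                traci.trafficlight.setPhase(TLS_ID, choose_phase(step))
            traci.simulationStep()
            # The negative of this running total is one natural reward signal
            # for an agent optimizing intersection fuel economy.
            episode_fuel += total_fuel_rate()
    finally:
        traci.close()
    print(f"Fuel-consumption proxy over the episode: {episode_fuel:.1f}")


if __name__ == "__main__":
    run()
```

In a full reinforcement learning setup, the fixed-timer rule would be replaced by a policy network mapping camera-derived vehicle counts and classes to phase choices, with the accumulated fuel total feeding the reward.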