Date Approved

12-2018

Graduate Degree Type

Thesis

Degree Name

Engineering (M.S.E.)

Degree Program

School of Engineering

First Advisor

Dr. Nicholas Baine

Second Advisor

Dr. Bruce Dunne

Third Advisor

Dr. Samhita Rhodes

Academic Year

2018/2019

Abstract

The detection and tracking of objects around an autonomous vehicle is essential for safe operation. This paper presents an algorithm to detect, classify, and track objects. All objects are classified as moving or stationary as well as by type (e.g., vehicle, pedestrian, or other). The proposed approach uses the state-of-the-art deep-learning network YOLO (You Only Look Once), combined with data from a laser scanner, to detect and classify objects and estimate their positions around the car. The Oriented FAST and Rotated BRIEF (ORB) feature descriptor is used to match the same object from one image frame to the next. This information is fused with measurements from a coupled GPS/INS using an Extended Kalman Filter. The resulting solution aids in the localization of both the car itself and the objects within its environment so that it can safely navigate the roads autonomously. The algorithm has been developed and tested using the dataset collected by the Oxford Robotcar. The Robotcar, equipped with cameras, LiDAR, GPS, and INS, collected data while traversing a route through the crowded urban environment of central Oxford.
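As a rough illustration of the frame-to-frame matching step mentioned in the abstract, the sketch below shows ORB keypoints being detected and matched between two consecutive camera frames with OpenCV. The file names, feature count, and matcher settings are illustrative assumptions, not the thesis's actual implementation.

import cv2

# Load two consecutive grayscale camera frames (hypothetical file names).
frame_prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
frame_curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Detect ORB keypoints and compute their binary descriptors in each frame.
orb = cv2.ORB_create(nfeatures=1000)
kp_prev, des_prev = orb.detectAndCompute(frame_prev, None)
kp_curr, des_curr = orb.detectAndCompute(frame_curr, None)

# Match descriptors with a brute-force Hamming matcher; cross-checking keeps
# only mutually-best matches, which helps reject outliers.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_prev, des_curr), key=lambda m: m.distance)

# Each matched keypoint pair associates a point on an object in the previous
# frame with the same point on that object in the current frame.
for m in matches[:20]:
    x0, y0 = kp_prev[m.queryIdx].pt
    x1, y1 = kp_curr[m.trainIdx].pt
    print(f"({x0:.1f}, {y0:.1f}) -> ({x1:.1f}, {y1:.1f})")

In a tracking pipeline such as the one described, matches like these would be restricted to the region inside each detected bounding box so that a detection in one frame can be re-identified in the next before the fused state is updated by the Extended Kalman Filter.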
