I really enjoyed working my way through Udacity's Sensor Fusion Nanodegree Program. Each lesson culminated in a final project. My final project submissions are described below; all are implemented in C++, except for the radar project, which is in MATLAB.
Note: if you are interested in runnable code, please follow the links in the project list below.
Lidar - Obstacle Detection: From Lidar measurements of a traffic scene, filter out points to the left and right of the roadway as well as points representing the road surface itself. Then detect obstacles by nearest-neighbor clustering of the filtered points. The implemented nearest-neighbor algorithm uses a k-d tree data structure to quickly find neighboring points.
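The k-d tree radius search that drives the clustering can be sketched as follows. This is a simplified, illustrative 2D version (the project operates on 3D point clouds, and the node layout here is an assumption, not the project's actual code):

```cpp
#include <array>
#include <cmath>
#include <memory>
#include <vector>

// Minimal 2D k-d tree supporting insertion and radius search
// (hypothetical sketch; names and layout are illustrative).
struct Node {
    std::array<float, 2> pt;
    int id;
    std::unique_ptr<Node> left, right;
    Node(std::array<float, 2> p, int i) : pt(p), id(i) {}
};

struct KdTree {
    std::unique_ptr<Node> root;

    void insert(std::array<float, 2> pt, int id) {
        insertRec(root, 0, pt, id);
    }

    // Return ids of all points within distTol of target.
    std::vector<int> search(std::array<float, 2> target, float distTol) const {
        std::vector<int> ids;
        searchRec(root.get(), 0, target, distTol, ids);
        return ids;
    }

private:
    static void insertRec(std::unique_ptr<Node>& node, int depth,
                          std::array<float, 2> pt, int id) {
        if (!node) { node = std::make_unique<Node>(pt, id); return; }
        int axis = depth % 2;  // alternate split axis per tree level
        if (pt[axis] < node->pt[axis]) insertRec(node->left, depth + 1, pt, id);
        else                           insertRec(node->right, depth + 1, pt, id);
    }

    static void searchRec(const Node* node, int depth,
                          std::array<float, 2> target, float tol,
                          std::vector<int>& ids) {
        if (!node) return;
        // Cheap bounding-box test first, then the exact distance check.
        float dx = node->pt[0] - target[0];
        float dy = node->pt[1] - target[1];
        if (std::abs(dx) <= tol && std::abs(dy) <= tol &&
            std::sqrt(dx * dx + dy * dy) <= tol)
            ids.push_back(node->id);
        int axis = depth % 2;
        // Recurse only into subtrees that can contain points within tol.
        if (target[axis] - tol < node->pt[axis])
            searchRec(node->left.get(), depth + 1, target, tol, ids);
        if (target[axis] + tol > node->pt[axis])
            searchRec(node->right.get(), depth + 1, target, tol, ids);
    }
};
```

Clustering then repeatedly seeds a new cluster from an unvisited point and grows it by calling `search` on each member, which is what makes the k-d tree's pruned traversal pay off over a brute-force scan.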
Camera - 2D Feature Tracking: In preparation for the final project, investigate various 2D keypoint detector / descriptor combinations to determine which pairs perform best.
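One axis of such a comparison is runtime. A minimal timing harness of the kind that can rank detector/descriptor pairs might look like the sketch below; the callables are placeholders (in the actual project they would wrap OpenCV detectors such as FAST, BRISK, ORB, AKAZE, or SIFT, which are not shown here):

```cpp
#include <chrono>
#include <functional>
#include <string>

// Hypothetical harness for timing a detector/descriptor combination.
struct TimedResult {
    std::string name;
    double meanMs;  // mean wall-clock time per run, in milliseconds
};

TimedResult benchmark(const std::string& name,
                      const std::function<void()>& fn, int runs = 10) {
    using clock = std::chrono::steady_clock;
    auto start = clock::now();
    for (int i = 0; i < runs; ++i) fn();  // run the combination repeatedly
    auto end = clock::now();
    double totalMs =
        std::chrono::duration<double, std::milli>(end - start).count();
    return {name, totalMs / runs};
}
```

Runtime alone is not decisive; the project also weighs the number of detected keypoints and the number of correct matches per pair.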
Camera - 3D Object Tracking: Build a collision detection system. Using work from the 2D Feature Tracking project, compute the time-to-collision (TTC) to 3D objects (cars, bicycles, pedestrians, etc.) from camera features. Also compute TTC from Lidar measurements. Project Lidar points into the camera image and associate the projected points with the detected 3D objects. My final report.
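The two TTC estimates can be sketched with the standard constant-velocity formulas. This is a simplified illustration, assuming the closest-point distances and keypoint distance ratios have already been robustly extracted (e.g. outlier-filtered), which the project handles separately:

```cpp
#include <algorithm>
#include <cmath>
#include <limits>
#include <vector>

// Lidar TTC under a constant-velocity model: minXPrev/minXCurr are the
// closest-point distances to the preceding vehicle in two consecutive frames.
double lidarTTC(double minXPrev, double minXCurr, double frameRateHz) {
    double dT = 1.0 / frameRateHz;                 // time between frames
    double relSpeed = (minXPrev - minXCurr) / dT;  // closing speed (m/s)
    if (relSpeed <= 0.0)                           // not closing: no collision
        return std::numeric_limits<double>::infinity();
    return minXCurr / relSpeed;                    // time until gap closes
}

// Camera TTC from ratios of keypoint distances between two frames:
// TTC = -dT / (1 - medianDistRatio). The median makes the estimate
// robust against individual mismatched keypoints.
double cameraTTC(std::vector<double> distRatios, double frameRateHz) {
    std::sort(distRatios.begin(), distRatios.end());
    size_t n = distRatios.size();
    double med = (n % 2) ? distRatios[n / 2]
                         : 0.5 * (distRatios[n / 2 - 1] + distRatios[n / 2]);
    if (std::fabs(1.0 - med) < 1e-9)               // no apparent scale change
        return std::numeric_limits<double>::infinity();
    double dT = 1.0 / frameRateHz;
    return -dT / (1.0 - med);
}
```

When the preceding vehicle approaches, distances between its keypoints grow from frame to frame, so the median ratio exceeds 1 and the camera formula yields a positive, finite TTC.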
Radar: Radar target generation and detection. CFAR processing of the range-Doppler map. My final report.
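The core of CFAR is estimating the local noise floor from training cells around each cell under test. The project applies 2D cell-averaging CFAR to the range-Doppler map in MATLAB; the 1D C++ sketch below shows the same idea under assumed parameters (`trainCells`, `guardCells`, `offsetDb` are illustrative names, and values are taken to be linear power, not dB):

```cpp
#include <cmath>
#include <cstdlib>
#include <vector>

// 1D cell-averaging CFAR sketch: flag cells whose power exceeds the
// locally estimated noise floor by a fixed offset.
std::vector<int> caCfar1d(const std::vector<double>& signal,
                          int trainCells, int guardCells, double offsetDb) {
    std::vector<int> detections(signal.size(), 0);
    int T = trainCells, G = guardCells;
    for (size_t cut = T + G; cut + T + G < signal.size(); ++cut) {
        double noise = 0.0;
        int count = 0;
        // Average the training cells on both sides of the cell under
        // test (CUT), skipping the guard cells and the CUT itself.
        for (int i = -(T + G); i <= T + G; ++i) {
            if (std::abs(i) <= G) continue;
            noise += signal[cut + i];
            ++count;
        }
        // Scale the noise estimate by the offset (converted from dB).
        double threshold = (noise / count) * std::pow(10.0, offsetDb / 10.0);
        if (signal[cut] > threshold) detections[cut] = 1;
    }
    return detections;
}
```

Because the threshold adapts to the surrounding noise estimate, the false-alarm rate stays roughly constant even as the noise floor varies across the map, which is what the "constant false alarm rate" name refers to.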
Kalman Filters - Tracking with Unscented Kalman Filters: Track vehicles on a multi-lane road using a constant turn rate and velocity magnitude (CTRV) model. Fuse Lidar and radar measurements in an unscented Kalman filter.
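The deterministic part of the CTRV prediction step can be sketched as below. This is a simplified illustration with process noise omitted; in the UKF this propagation is applied to each sigma point, and the state layout `[px, py, v, yaw, yawd]` is the conventional one assumed here:

```cpp
#include <array>
#include <cmath>

// CTRV prediction: propagate [px, py, v, yaw, yawd] forward by dt seconds.
// Speed v and yaw rate yawd are constant under this model.
std::array<double, 5> ctrvPredict(const std::array<double, 5>& x, double dt) {
    double px = x[0], py = x[1], v = x[2], yaw = x[3], yawd = x[4];
    double pxNew, pyNew;
    if (std::fabs(yawd) > 1e-6) {
        // Turning: the vehicle follows a circular arc.
        pxNew = px + v / yawd * (std::sin(yaw + yawd * dt) - std::sin(yaw));
        pyNew = py + v / yawd * (std::cos(yaw) - std::cos(yaw + yawd * dt));
    } else {
        // Driving straight: special-case to avoid division by zero.
        pxNew = px + v * std::cos(yaw) * dt;
        pyNew = py + v * std::sin(yaw) * dt;
    }
    return {pxNew, pyNew, v, yaw + yawd * dt, yawd};
}
```

The near-zero yaw-rate branch matters in practice: on straight road segments the arc formulas would otherwise divide by a vanishing `yawd`.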