Autonomous Driving Simulator: Unity and OpenCV

An image of 3 low-poly style cars on a track.

The Background

In Person

The University of Delaware Department of Mechanical Engineering frequently offers an elective in Autonomous Driving, taught by Dr. Adam Wickenheiser. I took the course in my junior year, when it was offered in person. We used Duckiebots and the MIT Duckietown platform for our coursework. Duckiebots are miniature car platforms controlled by a Raspberry Pi. Two DC motors (one each for the left and right wheels) allow for differential steering, and an onboard Raspberry Pi Camera captures the video feed used to control the vehicle. The robots were typically tasked with driving around a foam-tile road with taped-off lanes while avoiding other vehicles.
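For context, differential steering means the car turns by spinning its two wheels at different speeds, with no steering linkage at all. A minimal sketch of the idea in Python (the function and wheel-base value are my own illustration, not Duckietown code):

```python
def differential_drive(speed, turn_rate, wheel_base=0.1):
    """Map a forward speed (m/s) and turn rate (rad/s) to left/right
    wheel speeds for a two-motor differential drive.

    wheel_base is the distance between the wheels in meters;
    0.1 m is an illustrative value, not a Duckiebot spec.
    """
    left = speed - turn_rate * wheel_base / 2.0   # inner wheel slows in a turn
    right = speed + turn_rate * wheel_base / 2.0  # outer wheel speeds up
    return left, right
```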

A red, two-wheeled robot car with black and yellow wheels
A Duckiebot with a Raspberry Pi, camera, and Adafruit Motor HAT.
Gray foam tiles arranged in a ring with white and yellow tape lane markers
An example Duckietown road. Image taken from duckietown.com

Remote Learning

In the Spring of 2021, the course was offered again, this time fully remote. Simple 2D simulations were created so students could drive a rectangular car around a track and learn the basics of PID controllers for autonomous driving. While helpful, this deprived the students of the full experience of controlling an autonomous car. It wasn't feasible to ship each student their own Duckiebot and road tiles, but building a simulator was a viable alternative.
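For readers unfamiliar with them, a PID controller computes a command from the current error, its accumulation over time, and its rate of change. A generic textbook sketch (not the course's actual exercise code):

```python
class PID:
    """Textbook PID controller: out = Kp*e + Ki*sum(e*dt) + Kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# e.g., steering = pid.update(lane_offset, dt), called once per frame
```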


The Solution

Real world physics?

What better way to simulate a driving experience than with a physics-based game engine? I turned to Unity. While AAA quality was certainly off the table (given the expedited 3-week go-live date), putting some low-poly cars on a track was entirely possible. Unity's built-in physics did the heavy lifting, as one might imagine. The built-in rigid body colliders, especially those made specifically for wheels, were finicky but workable.

In the image below, you can see the Unity Editor view in the top left, as well as the "Game View" in the bottom left. The "Game View" uses a third-person following camera, so the students could see how their car moved on the road. Not shown is the third view: a camera positioned on the hood of the car. This camera is analogous to the camera on the front of the Duckiebots, and it is the one used to control the car.

A screenshot of the Unity game engine editor with the project loaded.
A screenshot of the Unity editor with the simulator loaded.

Controlling the Car

The course required Python, so the students were already familiar with it. This led to an interesting challenge: since Unity can't directly interface with an external Python script, how would the students write their controllers? The answer was temporary files. While the simulator was running, the "hood" camera on the car captured an image each frame and wrote the data out to a temporary file on the computer's disk. That data was read in by a Python wrapper class I wrote and passed to the students' code just as if they were using a Duckiebot, so the students saw minimal disruption to their normal workflow. Below is a code snippet that loads data from the temporary file and passes it to the students' control code.
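(The original snippet isn't reproduced here; the following is a simplified sketch of the same mechanism, with hypothetical file names and an assumed raw-RGB frame format.)

```python
import time
import numpy as np

FRAME_FILE = "sim_frame.bytes"    # hypothetical path; Unity writes a frame here
COMMAND_FILE = "sim_command.txt"  # hypothetical path; Python writes commands back
WIDTH, HEIGHT = 640, 480          # illustrative camera resolution

def run(control_func):
    """Poll the temp file Unity writes to, hand each frame to the
    student's control_func, and write the commands back for Unity."""
    while True:
        try:
            # Unity dumps the hood camera's raw RGB bytes to disk each frame
            raw = np.fromfile(FRAME_FILE, dtype=np.uint8)
        except FileNotFoundError:
            time.sleep(0.01)   # simulator hasn't produced a frame yet
            continue
        if raw.size != WIDTH * HEIGHT * 3:
            continue           # partial write; wait for a complete frame
        frame = raw.reshape((HEIGHT, WIDTH, 3))
        throttle, steering = control_func(frame)
        with open(COMMAND_FILE, "w") as f:
            f.write(f"{throttle},{steering}")
        time.sleep(1 / 30)     # roughly match the simulator frame rate
```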

Most of the code here sets up the link between the temporary file and Python, along with some safety checks (ensuring we don't try to control the same car multiple times, verifying the correct data format is followed, and so on). The simulator allows for two control systems.

On lines 26 and 33 of the original script, there is a call to a control_func. This is the piece of code the students would write, returning the desired throttle and steering of the car based on the input information.
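Concretely, the students only had to fill in a function of roughly this shape (the exact signature and value ranges below are my assumption for illustration):

```python
def control_func(frame):
    """Student-written controller.

    frame: an HxWx3 image array from the car's hood camera.
    Returns (throttle, steering), assumed here to lie in [-1, 1].
    """
    throttle = 0.3   # drive forward at a modest, constant speed
    steering = 0.0   # go straight; a real controller derives this from the frame
    return throttle, steering
```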

A screen recording of the raw and processed simulator frames
Side by side comparison of the raw and filtered frames, using OpenCV in Python.

The video above shows the navigation camera frame sent from Unity to Python (left) and the post-processed frame, filtered for the lane lines using OpenCV in Python (right). In this clip, the car is being controlled manually with keyboard inputs. The students' assignment was first to process the raw frame on the left to keep only the important information (the lane lines), producing the image on the right. Next, they would process the filtered frame to estimate the car's position in the lane and steer it toward the center while moving forward.
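A controller along those lines might threshold the frame for the white and yellow lane colors, then steer in proportion to the lane pixels' horizontal offset from center. A rough sketch (the HSV bounds and gains are placeholders to tune, not the course solution):

```python
import cv2

def control_func(frame):
    """Filter for lane lines, estimate lateral offset, steer toward center."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_RGB2HSV)
    # Keep white and yellow pixels (placeholder HSV bounds)
    white = cv2.inRange(hsv, (0, 0, 200), (180, 40, 255))
    yellow = cv2.inRange(hsv, (20, 80, 80), (35, 255, 255))
    mask = cv2.bitwise_or(white, yellow)

    # Use the centroid of the lane pixels as a crude lane-position estimate
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return 0.1, 0.0  # no lane visible: creep forward, don't turn
    cx = m["m10"] / m["m00"]
    offset = (cx - frame.shape[1] / 2) / (frame.shape[1] / 2)  # -1 (left) .. 1 (right)

    steering = 0.8 * offset  # proportional steering; the gain is a guess
    throttle = 0.3           # constant forward speed
    return throttle, steering
```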


Conclusions

Overall, the simulator was a great success. Students were able to perform a variety of autonomous driving routines.

With a return to in-person learning, this project is closed for now. I would like to revisit it at some point and finish adding some of the features I have not yet gotten to, so keep an eye out for updates and a final version!

Check out the whole project on GitHub! (Coming Soon™!)