How To Debug An Autonomous Vehicle

March 08, 2016

With six months to go until Sparkfun AVC 2016, I've started work on my entry again (a fairly simple entry based around an Arduino Mega). I've learned a lot from last year's failure and I'm on a mission to make improvements that will make it easier to debug issues and get my entry performing consistently. In my day job I spend plenty of time debugging code, but debugging an autonomous vehicle is much harder, and physical crashes are harder to recover from than software crashes.

The first step was to buy a GoPro HERO camera (just the basic version, costing a little over $100). This weekend we took G-Force for a test run with the camera attached and got some good footage.

During each run, code running on the Arduino records a bunch of telemetry (compass bearing, GPS co-ordinates, and ultrasonic sensor readings) to an SD card. This is very useful, but there is so much data that it is difficult to review, and it is hard to correlate with the footage from the GoPro. I really need some visual indicators on the vehicle so that when I review the GoPro footage, I can see what the vehicle is doing and why. For instance, when the vehicle makes a sharp turn, is it avoiding an obstacle or reacting to new GPS data?
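For anyone curious, the logging code is roughly this shape. This is a simplified sketch rather than my actual code: the chip-select pin and the sensor-reading helpers are placeholders standing in for the real compass, GPS, and Octasonic code.

```cpp
#include <SPI.h>
#include <SD.h>

const int SD_CS_PIN = 53;  // chip-select pin for the SD card (placeholder)

File logFile;

// Stand-in sensor reads -- the real code talks to the compass, the GPS
// module, and the Octasonic breakout; these stubs just return dummy values.
float readCompassBearing()            { return 0.0; }
bool  readGPS(float &lat, float &lon) { lat = 0; lon = 0; return false; }
int   readUltrasonicCm(int sensor)    { (void)sensor; return 100; }

void setup() {
  SD.begin(SD_CS_PIN);
  logFile = SD.open("telem.csv", FILE_WRITE);
  logFile.println("millis,bearing,lat,lon,us0,us1,us2,us3,us4");
}

void loop() {
  float lat = 0, lon = 0;
  float bearing = readCompassBearing();
  readGPS(lat, lon);

  // One CSV row per loop iteration, timestamped so it can be
  // lined up against video afterwards.
  logFile.print(millis());  logFile.print(',');
  logFile.print(bearing);   logFile.print(',');
  logFile.print(lat, 6);    logFile.print(',');
  logFile.print(lon, 6);
  for (int i = 0; i < 5; i++) {
    logFile.print(',');
    logFile.print(readUltrasonicCm(i));
  }
  logFile.println();
  logFile.flush();  // make sure data survives a crash (literal or otherwise)
}
```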

I had an Adafruit NeoPixel stick lying around, so I hooked it up last night. It has 8 LEDs: I'm using 5 of them to show readings from the Octasonic HC-SR04 breakout that monitors the ultrasonic sensors (green, amber, or red depending on distance), one to show GPS status (red for no fix, then toggling between green and blue as new co-ordinates arrive), and the remaining two to show obstacle avoidance (left and right). Here's a short video showing a test of the LEDs.
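The colour logic behind those LEDs is straightforward. Here's roughly what it looks like using the Adafruit NeoPixel library; the data pin, the distance thresholds, and the ultrasonic helper are illustrative placeholders, not my exact values.

```cpp
#include <Adafruit_NeoPixel.h>

const int NEOPIXEL_PIN = 6;   // data pin for the stick (placeholder)
const int NUM_PIXELS   = 8;

Adafruit_NeoPixel strip(NUM_PIXELS, NEOPIXEL_PIN, NEO_GRB + NEO_KHZ800);

// Stand-in for reading one of the five ultrasonic sensors via the
// Octasonic breakout; the real code uses the breakout to do the reading.
int readUltrasonicCm(int sensor) { (void)sensor; return 100; }

// Map a distance reading to green/amber/red (thresholds are illustrative).
uint32_t distanceColour(int cm) {
  if (cm > 100) return strip.Color(0, 64, 0);   // green: path is clear
  if (cm > 50)  return strip.Color(64, 32, 0);  // amber: getting close
  return strip.Color(64, 0, 0);                 // red: too close
}

void setup() {
  strip.begin();
  strip.show();  // start with all LEDs off
}

void loop() {
  // LEDs 0-4: one per ultrasonic sensor
  for (int i = 0; i < 5; i++) {
    strip.setPixelColor(i, distanceColour(readUltrasonicCm(i)));
  }

  // LED 5: GPS status (red for no fix, toggling green/blue on new fixes)
  // LEDs 6-7: obstacle avoidance left/right, driven by the steering logic
  // ...omitted here; they are just more setPixelColor() calls.

  strip.show();
  delay(100);
}
```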

This weekend I'll take G-Force out for another test run and capture some footage showing the status LEDs in action.