Sparkfun AVC 2016

We were unable to compete at Sparkfun AVC at all this year due to issues with our compass. Things were working pretty well during practice the day before, although the compass was off by about 10 degrees on the run to the first corner. Here’s a video from practice that shows obstacle avoidance kicking in when the vehicle drifted too close to the hay bales.

So, we attempted to calibrate the compass that evening, and when I came back to the site on Saturday morning I found that the compass was off by around 100 degrees. I don’t yet know what we did wrong. I did attempt to get another compass working on the day, but it used a different interface (I2C) and only returned raw x, y, z values, and I’m just not familiar enough with the math for turning those numbers into a heading. After trying various algorithms I found online and getting bad readings, I gave up and switched to being a spectator.
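For the record, the basic math isn’t that complicated: take the horizontal components of the magnetic field and feed them to atan2. Here’s a minimal sketch in Rust, assuming the sensor is level, the axes follow the usual convention, and hard-iron calibration offsets have already been subtracted (none of which was a safe assumption on the day):

```rust
// Minimal heading calculation from raw magnetometer readings.
// Assumes the sensor is level and calibration offsets have been removed;
// the sign convention depends on how the sensor is mounted.
fn heading_degrees(x: f64, y: f64) -> f64 {
    // atan2 gives the angle of the horizontal field vector; normalise to 0..360
    let heading = y.atan2(x).to_degrees();
    (heading + 360.0) % 360.0
}

fn main() {
    // Illustrative raw values only
    println!("heading = {:.1} degrees", heading_degrees(20.0, -5.0));
}
```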

img_20160915_124050409_hdr

To summarize my performance in different areas:

  • The instrumented video really worked well to show me what the issues were and was a huge improvement over looking through log files.
  • The use of the LCD display was very helpful to see what was going on during testing.
  • The ultrasonic sensors were very unreliable at detecting hay bales and there was huge variance in different batches of sensors that I had. I’m definitely going to switch to LIDAR next time.
  • The GPS and navigation logic worked really well as far as I could tell.
  • The compass was a disaster, giving totally incorrect readings after we calibrated it.

Apart from being better prepared ahead of time next year, a big lesson learned is to have spare parts on hand in case a component fails. I also need to practice on an enclosed course next time, as this is very different from navigating waypoints in a large open parking lot.

This was a pretty intense four weeks building a new autonomous vehicle with Rust and the Raspberry Pi and I learned a lot about both. The project also gave me an excuse to use Onshape more for designing various mounts that I then 3D-printed.

I’m looking forward to doing much more with Rust and the Raspberry Pi in the future.


AVC update 8/29

After two pretty intense weekends of learning more about Rust, I now have the new autonomous vehicle basically working. We took it for the very first test run last night and here’s the video as captured and instrumented by the Rust code:

This was pretty good for a first test. The instrumented video is paying off already: I can see from the data that the code is oversteering when turning, resulting in the dizzying left/right motion. This would have been much harder to spot if I were just looking through log files.

One of the main goals of this project is to help accelerate my Rust learning curve, and that is certainly happening. I learned how to implement threads, using an Arc (an atomic reference-counting wrapper for shared state) to share data between the navigation thread and the video thread. I also had to implement a simple web interface using Hyper and Iron so that I can start/stop the vehicle from the touchscreen display. I also experimented with channels, and generally I think those would be preferred over using Arc and Mutex, but they didn’t quite seem to fit my use case.
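As a rough sketch of the shared-state approach (the `NavState` struct and its fields are made up for this example, not my actual code):

```rust
use std::sync::{Arc, Mutex};
use std::thread;
use std::time::Duration;

// Hypothetical state shared between the navigation and video threads.
#[derive(Default)]
struct NavState {
    bearing: f64,
    speed: f64,
}

fn main() {
    let state = Arc::new(Mutex::new(NavState::default()));

    // Navigation thread: updates the shared state as new sensor data arrives.
    let nav_state = Arc::clone(&state);
    thread::spawn(move || {
        for i in 0..5 {
            let mut s = nav_state.lock().unwrap();
            s.bearing = 49.0 + i as f64;
            s.speed = 3.2;
            drop(s); // release the lock before sleeping
            thread::sleep(Duration::from_millis(100));
        }
    });

    // Video thread (just the main thread here): reads the latest state so it
    // can be overlaid onto each captured frame.
    for _ in 0..5 {
        {
            let s = state.lock().unwrap();
            println!("overlay: bearing={:.1} speed={:.1}", s.bearing, s.speed);
        }
        thread::sleep(Duration::from_millis(100));
    }
}
```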

Apart from fine-tuning the software and performing more testing, the only significant challenge ahead is adding some ultrasonic sensors and an SPI-based board for monitoring them, and hooking that board up to the Raspberry Pi through a logic-level converter (the board runs at 5V and the Pi at 3.3V).


I’m porting my AVC entry to Raspberry Pi + Rust!

I very recently started using the Rust programming language professionally, and although I am still working my way through the learning curve, I feel that I am proficient enough to “get it done” even if it isn’t the most idiomatic Rust code. I’m enjoying the language immensely and have been spending evenings and weekends challenging myself to solve various problems in Rust. I’ve even created a dedicated blog to write about my experiences learning Rust: Keep Calm and Learn Rust.

After attending the Boulder/Denver Rust Meetup recently, I agreed to give a talk on Rust at their next meetup on 9/21, perhaps related to IoT or Raspberry Pi.

This past Saturday (8/20) I started playing around with Rust on the Raspberry Pi and started researching how to interact with webcams as well as serial and SPI sensors. Within a couple of hours I was hooked and decided to commit to upgrading my Sparkfun AVC entry from an Arduino platform to a Raspberry Pi platform with as much logic as possible implemented in Rust.

Yes, that’s right … with a perfectly fine working entry, and with exactly four weeks until the event, I decided to completely change the hardware and software architecture. Have I bitten off more than I can chew? Maybe … we will see.

Moving to the Pi opens up so many opportunities. I have already prototyped processing a live video stream using Rust and OpenCV and overlaying text onto the captured frames, so I can include instrumentation data (GPS, compass bearing, etc) in the video. This will really solve the problem I have had of debugging the vehicle after each run to see what really happened.
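The overlay itself is straightforward. The sketch below shows the idea using the `opencv` crate’s Rust bindings for illustration; the exact bindings, overlay values, and output handling in my code differ:

```rust
use opencv::{core, imgproc, prelude::*, videoio};

fn main() -> opencv::Result<()> {
    // Open the default webcam.
    let mut cam = videoio::VideoCapture::new(0, videoio::CAP_ANY)?;
    let mut frame = core::Mat::default();

    while cam.read(&mut frame)? {
        // Overlay instrumentation text (illustrative values) onto the frame
        // before it is written out to the recorded video.
        imgproc::put_text(
            &mut frame,
            "bearing: 49.0  gps: 40.0906,-105.1850",
            core::Point::new(10, 30),
            imgproc::FONT_HERSHEY_SIMPLEX,
            0.8,
            core::Scalar::new(0.0, 255.0, 0.0, 0.0),
            2,
            imgproc::LINE_8,
            false,
        )?;
        // ... pass `frame` to a VideoWriter here ...
    }
    Ok(())
}
```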

Moving to Rust is ideal because I can call C code (such as OpenCV) with zero overhead. Rust also ensures that my code will be robust thanks to its memory safety features.

I am now 4 days into the process and have been able to read GPS and compass data successfully using an FTDI Basic breakout to connect the sensors to the Raspberry Pi’s USB ports, and as I already mentioned, I have been able to add real-time instrumentation data to a video stream.
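Reading the GPS boils down to reading NMEA sentences from the serial device that the FTDI breakout presents. A rough sketch, using the `serialport` crate for illustration (the device path, baud rate, and crate choice are assumptions, not necessarily what my code uses):

```rust
use std::io::{BufRead, BufReader};
use std::time::Duration;

fn main() {
    // Assumed device path and baud rate for the FTDI/GPS link.
    let port = serialport::new("/dev/ttyUSB0", 9600)
        .timeout(Duration::from_millis(1000))
        .open()
        .expect("failed to open GPS serial port");

    let reader = BufReader::new(port);
    for line in reader.lines().flatten() {
        // $GPRMC sentences carry position, speed, and course over ground.
        let fields: Vec<&str> = line.split(',').collect();
        if line.starts_with("$GPRMC") && fields.len() > 8 && fields[2] == "A" {
            println!("lat={} lon={} course={}", fields[3], fields[5], fields[8]);
        }
    }
}
```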

There are challenges ahead though. I need to figure out SPI communication with the board I have that monitors the ultrasonic sensors (it runs at 5V rather than the Pi’s 3.3V). I’ll also have to implement a serial protocol in Rust to drive the motor controller board, since there is no Rust source code available for it yet.
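For the SPI side, the plan is to use the Linux spidev interface from Rust. Here’s a sketch using the `spidev` crate; the request/response protocol shown (send a sensor index, read back a distance in centimeters) is an assumption for illustration and may not match the real board’s protocol:

```rust
use spidev::{SpiModeFlags, Spidev, SpidevOptions, SpidevTransfer};
use std::io;

// Assumed protocol: send a sensor index, read back a distance in cm.
fn read_distance_cm(spi: &mut Spidev, sensor: u8) -> io::Result<u8> {
    let tx = [sensor];
    let mut rx = [0u8];
    let mut transfer = SpidevTransfer::read_write(&tx, &mut rx);
    spi.transfer(&mut transfer)?;
    Ok(rx[0])
}

fn main() -> io::Result<()> {
    let mut spi = Spidev::open("/dev/spidev0.0")?;
    spi.configure(
        &SpidevOptions::new()
            .bits_per_word(8)
            .max_speed_hz(100_000)
            .mode(SpiModeFlags::SPI_MODE_0)
            .build(),
    )?;
    for sensor in 0..8 {
        println!("sensor {}: {} cm", sensor, read_distance_cm(&mut spi, sensor)?);
    }
    Ok(())
}
```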

You can follow along with my progress in several ways – you can subscribe to this blog, you can follow me on Twitter at @andygrove73, and you can watch my GitHub repository. This is all open source and I hope I can inspire others to try Rust out by sharing this code.

3D Printed Voice Changer Case

In the weeks leading up to the Denver Maker Faire, I decided to design a case to hold an Arduino and the voice changer shield that I had designed. I wanted to make it easy for people to have a go at talking like a Dalek. It turned out to be a huge success!

The voice changer was connected to powered speakers inside the Dalek, and also connected to the LEDs in the dome. This did a pretty convincing job of making it look like the Dalek was talking.

Here are a couple of photos of the final case design.

IMG_20160607_174853683_HDR

IMG_20160607_175513147

I’ve uploaded the design to Thingiverse here for any previous voice changer customers who want to print their own case.

Preparing for Denver Mini Maker Faire 2016

I was contacted about five weeks ago to see if I would be interested in taking my Dalek along to the Denver Mini Maker Faire with my local maker space, The Gizmo Dojo. I hadn’t taken the Dalek to an event since 2014 and it was in a bit of a state of disrepair so I decided this would be a good opportunity to make some repairs and improvements. It was also good timing, since I have now learned enough about 3D design and printing to be able to make some custom parts that I was unable to manufacture before.

Here are the before and after photos showing the improvements made over the past five weeks. Hopefully it is obvious which is which.

dalek_before_after

I originally built the Dalek in 2012 and it was a long project. I eventually got tired of working on it, took some shortcuts with the shoulder section, and was never too happy with the results. To make it look a little more authentic I have now added some aluminum fly screen under the slats and drilled holes to add bolts that hold the slats in place. This makes the slats stand off from the body, making them more noticeable.

IMG_20160522_121812828_HDR

I made a new eye, plunger arm, and gun. All use PVC plumbing along with 3D printed parts that were then spray painted.

IMG_20160515_152210678_HDR

IMG_20160510_102433257_HDR

If you want to come along to the faire, it is being held this coming weekend (June 11th/12th) at the Denver Museum of Nature and Science. The Gizmo Dojo booth will be in the Southeast Atrium.

How To Debug An Autonomous Vehicle

With six months to go until Sparkfun AVC 2016, I’ve started work on my entry again (a fairly simple entry based around an Arduino Mega). I’ve learned a lot from last year’s failure and I’m on a mission to make improvements that will make it easier to debug issues and get my entry performing consistently. In my day job I spend plenty of time debugging code, but debugging an autonomous vehicle is much harder, and physical crashes are harder to recover from than software crashes.

The first step was to buy a GoPro HERO camera (just the basic version, costing a little over $100). This weekend we took G-Force for a test run with the camera attached and got some good footage.

During each run, code running on the Arduino records a bunch of telemetry (compass bearing, GPS co-ordinates, and ultrasonic sensor data) to an SD card. This is very useful, but it is difficult to review because there is so much data, and it is hard to correlate the data with the footage from the GoPro. I really need some visual indicators on the vehicle so that when I review the GoPro footage, I can see what is happening. For instance, when the vehicle makes a sharp turn, is it for obstacle avoidance or because the GPS data changed?

I had an Adafruit NeoPixel stick lying around so I hooked that up last night. It has 8 LEDs: five of them indicate readings from the Octasonic HC-SR04 breakout that is monitoring the ultrasonic sensors (green, amber, and red indicate distance ranges), one indicates GPS status (red for no fix, then toggling between green and blue as new co-ordinates are received), and the remaining two indicate obstacle avoidance (left and right). Here’s a short video showing a test of the LEDs.
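The distance-to-colour mapping for the five sensor LEDs is nothing fancy; it amounts to something like the Rust sketch below (the thresholds are illustrative, and the real logic lives in the Arduino code):

```rust
// Illustrative distance-to-colour mapping for the five sensor LEDs.
// Thresholds are made up; the real values live in the Arduino code.
#[derive(Debug)]
enum LedColour {
    Green, // clear
    Amber, // getting close
    Red,   // obstacle close
}

fn colour_for_distance(cm: u16) -> LedColour {
    match cm {
        0..=50 => LedColour::Red,
        51..=150 => LedColour::Amber,
        _ => LedColour::Green,
    }
}

fn main() {
    for cm in [30u16, 100, 300].iter() {
        println!("{} cm -> {:?}", cm, colour_for_distance(*cm));
    }
}
```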

This weekend I’ll take G-Force out for another test run and capture some footage showing the status LEDs in action.

SparkFun AVC 2016 has been announced

SparkFun AVC 2016 has been announced! It’s in September this year, which should be much more bearable than doing this in the summer. It also gives me longer to prepare! They are teasing some new and exciting changes this year too …

I’ve already started on some changes to our entry but I’m happy to have longer to prepare.

For more information: https://www.sparkfun.com/news/2039

HC-SR04 Breakout Board for Arduino

This weekend I assembled the first version of a new HC-SR04 breakout board that I have been working on lately. The goal of this project is to let an Arduino Uno (or any other 5V microcontroller) monitor up to 8 ultrasonic sensors over SPI, so the master only needs one dedicated chip-select pin on top of the shared SPI bus. The other advantage is that by offloading the sensor monitoring, the master device can perform other tasks. My motivation for designing this is to use it in my Sparkfun AVC entry this year.

hcsr04_breakout_v1

The microcontroller is running pure AVR code (as opposed to Arduino code) and is running at 16 MHz, which I’m sure is overkill, but there is no reason why the board couldn’t also be run at lower speeds with some tweaks to the software and changing the fuse bits.

I made one mistake in the schematic that literally took me hours to figure out – I missed the connection between the SS pin on the header and the SS pin on the microcontroller, so SS was floating, resulting in intermittent communication failures. Once I figured out the root cause I had to solder a wire onto the back of the PCB to fix it. There were also some things about the board layout that I didn’t really like, so I’ve reworked the design and am now waiting for fresh boards from OSH Park.

What better way to test this than create an electronic piano?

I’m offering a few of these breakout boards for sale in my Tindie store since I have surplus boards.

Post-mortem of SparkFun AVC 2015 Performance

Half a year after our terrible performance at SparkFun’s AVC competition, I’ve analyzed the data collected during one of the runs and have figured out the mistakes I made in the navigation software. I had previously tried looking at the data, but needed to be able to visualize it to make any sense of it all. My first step was to generate a KML file from the data collected during the run, showing the planned waypoints as well as the recorded GPS locations. I then imported the KML file into Google maps so I could see the route taken by the robot.

Selection_010

The blue stars indicate the planned waypoints. The red pins show the actual route of the vehicle. This seems accurate and shows that the vehicle took a sharp right turn after the first corner and seemed to be heading for the wrong waypoint. The placemarks on the map are annotated with notes, including the compass bearing and target bearing, and show that the navigation software had calculated a bearing of 49 degrees. The robot wasn’t even accurately keeping to that bearing, but that’s another story.

Next, I found an online calculator for the bearing between two latitude/longitude locations and entered one of the recorded GPS co-ordinates along with the target waypoint. This calculated an angle of 41 degrees instead, and mapped out the route on Google Maps too, which confirmed that this was the correct route.

Selection_009

Clearly, I had an error in my math. Because I was running on an Arduino microcontroller with limited computing power, I had chosen to implement a simple algorithm based on trigonometry, using the deltas of latitude and longitude between the two points to form a right-angled triangle and then using the atan() function to calculate the angle. This was a huge mistake: it didn’t take into account that at a latitude of 40 degrees, one degree of latitude is 68.99 miles whereas one degree of longitude is only 53.06 miles. After modifying the simple trig math to scale the latitude and longitude deltas accordingly, the calculations became correct to within a reasonable tolerance.
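The fix amounts to scaling the longitude delta by the cosine of the latitude before applying the trig, so that both deltas are in comparable units. A minimal sketch of the corrected calculation (written in Rust here rather than the Arduino code, with illustrative co-ordinates):

```rust
// Bearing from point 1 to point 2 using the simple "flat earth" trig,
// with the longitude delta scaled by cos(latitude) so that both deltas
// are in comparable units. Returns degrees clockwise from north.
fn simple_bearing(lat1: f64, lon1: f64, lat2: f64, lon2: f64) -> f64 {
    let d_lat = lat2 - lat1;
    let d_lon = (lon2 - lon1) * lat1.to_radians().cos();
    (d_lon.atan2(d_lat).to_degrees() + 360.0) % 360.0
}

fn main() {
    // Illustrative co-ordinates only
    let (lat1, lon1) = (40.0906, -105.1850);
    let (lat2, lon2) = (40.0910, -105.1843);
    println!("bearing = {:.1} degrees", simple_bearing(lat1, lon1, lat2, lon2));
}
```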

UPDATE: I have now run simulations of one million random pairs of latitude/longitude co-ordinates within the bounds of SparkFun’s parking lot to compare the results of the modified simple algorithm versus the accurate and more expensive algorithm and the difference between the two algorithms never exceeds 0.1 degrees.
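The more expensive algorithm in that comparison is the standard great-circle initial-bearing formula. Here is a self-contained sketch of the comparison for a single pair of (illustrative) points; the simulation simply repeats this for a million random pairs and tracks the maximum difference:

```rust
// Compare the scaled "flat earth" bearing with the standard great-circle
// initial-bearing formula for a single pair of points.
fn simple_bearing(lat1: f64, lon1: f64, lat2: f64, lon2: f64) -> f64 {
    let d_lon = (lon2 - lon1) * lat1.to_radians().cos();
    (d_lon.atan2(lat2 - lat1).to_degrees() + 360.0) % 360.0
}

fn great_circle_bearing(lat1: f64, lon1: f64, lat2: f64, lon2: f64) -> f64 {
    let (p1, p2) = (lat1.to_radians(), lat2.to_radians());
    let d_lon = (lon2 - lon1).to_radians();
    let y = d_lon.sin() * p2.cos();
    let x = p1.cos() * p2.sin() - p1.sin() * p2.cos() * d_lon.cos();
    (y.atan2(x).to_degrees() + 360.0) % 360.0
}

fn main() {
    // Illustrative co-ordinates only
    let (lat1, lon1, lat2, lon2) = (40.0906, -105.1850, 40.0910, -105.1843);
    println!(
        "simple = {:.3}, great-circle = {:.3}",
        simple_bearing(lat1, lon1, lat2, lon2),
        great_circle_bearing(lat1, lon1, lat2, lon2)
    );
}
```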

This wasn’t the only issue affecting our performance, but was one of the main factors. The compass wasn’t calibrated either and wasn’t giving accurate readings, but I am hoping that this is now resolved. I’ll be taking G-Force out for some new test runs soon to see if these changes help.