I’m porting my AVC entry to Raspberry Pi + Rust!

I very recently started using the Rust programming language professionally, and although I am still working my way through the learning curve, I feel that I am proficient enough to “get it done” even if my code isn’t the most idiomatic Rust. I’m enjoying the language immensely and have been spending evenings and weekends challenging myself to solve various problems in Rust. I’ve even created a dedicated blog to write about my experiences learning Rust: Keep Calm and Learn Rust.

After attending the Boulder/Denver Rust Meetup recently, I agreed to give a talk on Rust at their next meetup on 9/21, perhaps related to IoT or Raspberry Pi.

This past Saturday (8/20) I started playing around with Rust on the Raspberry Pi and started researching how to interact with webcams as well as serial and SPI sensors. Within a couple of hours I was hooked and decided to commit to upgrading my Sparkfun AVC entry from an Arduino platform to a Raspberry Pi platform with as much logic as possible implemented in Rust.

Yes, that’s right … with a perfectly fine working entry, and with exactly four weeks until the event, I decided to completely change the hardware and software architecture. Have I bitten off more than I can chew? Maybe … we will see.

Moving to the Pi opens up so many opportunities. I have already prototyped processing a live video stream using Rust and OpenCV and overlaying text onto the captured frames, so I can include instrumentation data (GPS, compass bearing, etc) in the video. This should finally solve the problem I have had of debugging the vehicle after each run to see what actually happened.

Moving to Rust is ideal because I can call into C libraries (such as OpenCV) with zero overhead, and Rust’s memory safety features help ensure that my code is robust.

I am now 4 days into the process and have been able to read GPS and compass data successfully using an FTDI Basic breakout to connect the sensors to the Raspberry Pi’s USB ports, and as I already mentioned, I have been able to add real-time instrumentation data to a video stream.
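
The GPS module streams NMEA sentences over the serial link, so reading it boils down to parsing text. Here is a minimal sketch of pulling latitude and longitude out of a $GPGGA sentence (field layout per the NMEA 0183 convention; the function names are my own, not from my actual code):

```rust
/// Parse latitude and longitude (decimal degrees) from a $GPGGA sentence.
/// Returns None if the sentence is not a GPGGA or fields are missing.
fn parse_gpgga(sentence: &str) -> Option<(f64, f64)> {
    let fields: Vec<&str> = sentence.trim().split(',').collect();
    if fields.first() != Some(&"$GPGGA") {
        return None;
    }
    // Field 2/3 = latitude + N/S hemisphere, field 4/5 = longitude + E/W
    let lat = ddmm_to_degrees(fields.get(2)?, fields.get(3)? == &"S")?;
    let lon = ddmm_to_degrees(fields.get(4)?, fields.get(5)? == &"W")?;
    Some((lat, lon))
}

/// NMEA encodes angles as [d]ddmm.mmmm; convert to signed decimal degrees.
fn ddmm_to_degrees(raw: &str, negative: bool) -> Option<f64> {
    let value: f64 = raw.parse().ok()?;
    let degrees = (value / 100.0).trunc();
    let minutes = value - degrees * 100.0;
    let result = degrees + minutes / 60.0;
    Some(if negative { -result } else { result })
}

fn main() {
    let sentence = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47";
    if let Some((lat, lon)) = parse_gpgga(sentence) {
        println!("lat={:.4} lon={:.4}", lat, lon);
    }
}
```

A real implementation would also verify the NMEA checksum after the `*`, but the parsing above is the essence of it.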

There are challenges ahead though. I need to figure out SPI communications with the board I built to monitor the ultrasonic sensors (which runs at 5V rather than the Pi’s 3.3V). I’ll also have to implement a serial protocol in Rust to drive the motor controller board, since there is no Rust source code available for it yet.

You can follow along with my progress in several ways: subscribe to this blog, follow me on Twitter at @andygrove73, or watch my GitHub repository. This is all open source, and I hope that by sharing the code I can inspire others to try Rust out.

3D Printed Voice Changer Case

In the weeks leading up to the Denver Maker Faire, I decided to design a case to hold an Arduino and the voice changer shield that I had designed. I wanted to make it easy for people to have a go at talking like a Dalek. It turned out to be a huge success!

The voice changer was connected to powered speakers inside the Dalek, and also connected to the LEDs in the dome. This did a pretty convincing job of making it look like the Dalek was talking.

Here are a couple of photos of the final case design.

[Photos: the finished voice changer case]

I’ve uploaded the design to Thingiverse here for any previous voice changer customers who want to print their own case.

Preparing for Denver Mini Maker Faire 2016

I was contacted about five weeks ago to see if I would be interested in taking my Dalek along to the Denver Mini Maker Faire with my local maker space, The Gizmo Dojo. I hadn’t taken the Dalek to an event since 2014 and it was in a bit of a state of disrepair so I decided this would be a good opportunity to make some repairs and improvements. It was also good timing, since I have now learned enough about 3D design and printing to be able to make some custom parts that I was unable to manufacture before.

Here are the before and after photos showing the improvements made over the past five weeks. Hopefully it is obvious which is which.

[Photo: the Dalek, before and after]

I originally built the Dalek in 2012 and it was a long project. I eventually got tired of working on it, took some shortcuts with the shoulder section, and was never too happy with the results. To make it look a little more authentic, I have now added some aluminum fly screen under the slats and drilled holes to add bolts that hold the slats in place. This makes the slats stand out from the body, making them more noticeable.

[Photo: new shoulder slats with aluminum mesh]

I made a new eye, plunger arm, and gun. All use PVC plumbing along with 3D printed parts that were then spray painted.

[Photos: the new eye, plunger arm, and gun]

If you want to come along to the faire, it is being held this coming weekend (June 11th/12th) at the Denver Museum of Nature and Science. The Gizmo Dojo booth will be in the Southeast Atrium.

How To Debug An Autonomous Vehicle

With six months to go until SparkFun AVC 2016, I’ve started work on my entry again (a fairly simple entry based around an Arduino Mega). I learned a lot from last year’s failure and I’m on a mission to make improvements that make it easier for me to debug issues and get my entry performing consistently. In my day job I spend plenty of time debugging code, but debugging an autonomous vehicle is much harder, and physical crashes are harder to recover from than software crashes.

The first step was to buy a GoPro HERO camera (just the basic version, costing a little over $100). This weekend we took G-Force for a test run with the camera attached and got some good footage.

During each run, code running on the Arduino records a bunch of telemetry (compass bearing, GPS coordinates, and ultrasonic sensor data) to an SD card. This is very useful, but it is also difficult to review since there is so much data, and it is hard to correlate the data with the footage from the GoPro. I really need some visual indicators on the vehicle so that when I review GoPro footage I can see what is happening. For instance, when the vehicle makes a sharp turn, is it for obstacle avoidance or because the GPS data changed?

I had an Adafruit NeoPixel stick lying around, so I hooked that up last night. It has 8 LEDs, so I am using 5 of them to indicate readings from the Octasonic HC-SR04 breakout that monitors the ultrasonic sensors (using green, amber, and red to indicate distances). One of the LEDs indicates GPS information (red for no fix, then toggling between green and blue as new coordinates are received). The remaining two LEDs are used to indicate obstacle avoidance (left and right). Here’s a short video showing a test of the LEDs.
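
The green/amber/red mapping is just simple thresholding on each distance reading. A sketch of the idea in Rust (the thresholds and RGB values here are illustrative placeholders, not the exact ones on the vehicle):

```rust
/// Map an ultrasonic distance reading (in cm) to an RGB color for a NeoPixel.
/// Thresholds are illustrative: red = obstacle close, amber = caution, green = clear.
fn distance_to_color(distance_cm: u16) -> (u8, u8, u8) {
    match distance_cm {
        0..=30 => (255, 0, 0),     // red: obstacle close
        31..=100 => (255, 128, 0), // amber: caution
        _ => (0, 255, 0),          // green: clear
    }
}

fn main() {
    for d in [10, 50, 200] {
        println!("{} cm -> {:?}", d, distance_to_color(d));
    }
}
```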

This weekend I’ll take G-Force out for another test run and capture some footage showing the status LEDs in action.

SparkFun AVC 2016 has been announced

SparkFun AVC 2016 has been announced! It’s in September this year, which should be much more bearable than doing this in the summer. It also gives me longer to prepare! They are teasing some new and exciting changes this year too …

I’ve already started on some changes to our entry but I’m happy to have longer to prepare.

For more information: https://www.sparkfun.com/news/2039

HC-SR04 Breakout Board for Arduino

This weekend I assembled the first version of a new HC-SR04 breakout board that I have been working on lately. The goal of this project is to let an Arduino Uno (or any other 5V microcontroller) monitor up to 8 ultrasonic sensors over SPI, consuming only one dedicated pin (the slave select), since the other SPI lines can be shared with other devices. The other advantage is that by offloading the sensor monitoring, the master device can perform other tasks. My motivation for designing this is to use it in my SparkFun AVC entry this year.
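
Conceptually the protocol is a simple request/response exchange: the master names a sensor, the slave replies with that sensor’s last reading. The byte layout below is invented for illustration and is not the actual framing on my board:

```rust
/// Hypothetical one-byte SPI exchange: the master sends a sensor index (0-7)
/// and the slave replies with that sensor's last distance reading in cm.
fn encode_request(sensor: u8) -> u8 {
    sensor & 0x07 // only 8 sensors, so mask to the low 3 bits
}

/// Decode the slave's reply; 0xFF is reserved here to mean "no echo received".
fn decode_response(byte: u8) -> Option<u16> {
    if byte == 0xFF { None } else { Some(byte as u16) }
}

fn main() {
    let req = encode_request(5);
    println!("request byte: 0x{:02X}", req);
    println!("distance: {:?} cm", decode_response(0x2A));
}
```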

[Photo: HC-SR04 breakout board, version 1]

The microcontroller is running pure AVR code (as opposed to Arduino code) at 16 MHz, which I’m sure is overkill, but there is no reason the board couldn’t run at lower speeds with some tweaks to the software and the fuse bits.

I made one mistake in the schematic that took me hours to figure out – I missed the connection between the SS pin on the header and the SS pin on the microcontroller, so SS was floating, resulting in intermittent communication failures. Once I figured out the root cause, I had to solder a wire onto the back of the PCB to fix it. There were also some things about the board layout that I didn’t really like, so I’ve reworked the design and am now waiting for fresh boards from OSH Park.

What better way to test this than to create an electronic piano?

I’m offering a few of these breakout boards for sale in my Tindie store since I have surplus boards.

Post-mortem of SparkFun AVC 2015 Performance

Half a year after our terrible performance at SparkFun’s AVC competition, I’ve analyzed the data collected during one of the runs and have figured out the mistakes I made in the navigation software. I had previously tried looking at the data, but needed to be able to visualize it to make any sense of it all. My first step was to generate a KML file from the data collected during the run, showing the planned waypoints as well as the recorded GPS locations. I then imported the KML file into Google maps so I could see the route taken by the robot.
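
Generating the KML is mostly string templating: one Placemark per recorded fix, wrapped in a Document. A minimal sketch (fields trimmed down; the helper names are my own, not from the actual tooling I used):

```rust
/// Build a minimal KML Placemark for one recorded GPS fix.
/// Note that KML uses lon,lat ordering inside <coordinates>.
fn placemark(name: &str, lat: f64, lon: f64) -> String {
    format!(
        "<Placemark><name>{}</name><Point><coordinates>{},{}</coordinates></Point></Placemark>",
        name, lon, lat
    )
}

/// Wrap a list of placemarks in a complete KML document.
fn kml_document(placemarks: &[String]) -> String {
    format!(
        "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<kml xmlns=\"http://www.opengis.net/kml/2.2\"><Document>{}</Document></kml>",
        placemarks.concat()
    )
}

fn main() {
    let marks = vec![placemark("waypoint 1", 40.065, -105.21)];
    println!("{}", kml_document(&marks));
}
```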

[Screenshot: planned waypoints and recorded GPS route in Google Maps]

The blue stars indicate the planned waypoints. The red pins show the actual route of the vehicle. The data looks accurate and shows that the vehicle took a sharp right turn after the first corner and seemed to be heading for the wrong waypoint. The placemarks on the map are annotated with notes, including the compass bearing and target bearing, and show that the navigation software had calculated a bearing of 49 degrees. The robot wasn’t even accurately keeping to that bearing, but that’s another story.

Next, I found an online calculator for computing the bearing between two latitude/longitude locations and entered one of the recorded GPS coordinates along with the target waypoint. It calculated an angle of 41 degrees instead, and mapped the route out on Google Maps too, confirming that this was the correct route.

[Screenshot: online bearing calculator result]

Clearly, I had an error in my math. Because I was running on an Arduino microcontroller with limited computing power, I had chosen to implement a simple algorithm based on trigonometry, using the delta of latitude and longitude between the two points to form a right-angled triangle and then using the atan() function to calculate the angle. This was a huge mistake: it didn’t take into account that at a latitude of 40 degrees, one degree of latitude is 68.99 miles, whereas one degree of longitude is 53.06 miles. After modifying the simple trig math to scale the latitude and longitude deltas accordingly, the calculations became correct to within a reasonable tolerance.
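
To make the fix concrete, here is the same flat-earth bearing math in Rust (my original code was Arduino C, but the math is identical): scale the east-west delta by the cosine of the latitude before taking the arctangent, so that both legs of the triangle are in comparable units.

```rust
/// Naive bearing: treats a degree of latitude and a degree of longitude as
/// equal-length, which is the bug that sent the robot toward the wrong waypoint.
fn bearing_naive(lat1: f64, lon1: f64, lat2: f64, lon2: f64) -> f64 {
    (lon2 - lon1).atan2(lat2 - lat1).to_degrees().rem_euclid(360.0)
}

/// Corrected flat-earth bearing: scale the longitude delta by cos(latitude)
/// so east-west distance is in the same units as north-south distance.
fn bearing_scaled(lat1: f64, lon1: f64, lat2: f64, lon2: f64) -> f64 {
    let mean_lat = ((lat1 + lat2) / 2.0).to_radians();
    ((lon2 - lon1) * mean_lat.cos())
        .atan2(lat2 - lat1)
        .to_degrees()
        .rem_euclid(360.0)
}

fn main() {
    // Two points near SparkFun's latitude (~40 degrees N); coordinates are illustrative.
    let (lat1, lon1) = (40.0650, -105.2100);
    let (lat2, lon2) = (40.0655, -105.2095);
    println!("naive:  {:.1} degrees", bearing_naive(lat1, lon1, lat2, lon2));
    println!("scaled: {:.1} degrees", bearing_scaled(lat1, lon1, lat2, lon2));
}
```

For a target to the northeast, the naive version always overestimates the bearing at this latitude, which matches the 49-versus-41-degree discrepancy I saw in the logs.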

UPDATE: I have now run simulations of one million random pairs of latitude/longitude co-ordinates within the bounds of SparkFun’s parking lot to compare the results of the modified simple algorithm versus the accurate and more expensive algorithm and the difference between the two algorithms never exceeds 0.1 degrees.
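
The same check can be reproduced deterministically by sweeping a grid of point pairs across a small area around 40°N and comparing the scaled flat-earth bearing against the standard great-circle initial-bearing formula (this sketch uses a grid in place of my original random sampling, and the parking-lot coordinates are approximate):

```rust
/// Flat-earth bearing with the longitude delta scaled by cos(latitude).
fn bearing_scaled(lat1: f64, lon1: f64, lat2: f64, lon2: f64) -> f64 {
    let mean_lat = ((lat1 + lat2) / 2.0).to_radians();
    ((lon2 - lon1) * mean_lat.cos())
        .atan2(lat2 - lat1)
        .to_degrees()
        .rem_euclid(360.0)
}

/// Standard great-circle initial bearing between two lat/lon points.
fn bearing_great_circle(lat1: f64, lon1: f64, lat2: f64, lon2: f64) -> f64 {
    let (p1, p2) = (lat1.to_radians(), lat2.to_radians());
    let dl = (lon2 - lon1).to_radians();
    let y = dl.sin() * p2.cos();
    let x = p1.cos() * p2.sin() - p1.sin() * p2.cos() * dl.cos();
    y.atan2(x).to_degrees().rem_euclid(360.0)
}

fn main() {
    let (lat0, lon0) = (40.064, -105.210); // roughly the area of the course
    let mut max_diff = 0.0f64;
    // Sweep targets up to ~200m north and east of the start point.
    for i in 0..20 {
        for j in 0..20 {
            if i == 0 && j == 0 {
                continue; // skip the degenerate zero-distance case
            }
            let lat2 = lat0 + 0.0001 * i as f64;
            let lon2 = lon0 + 0.0001 * j as f64;
            let a = bearing_scaled(lat0, lon0, lat2, lon2);
            let b = bearing_great_circle(lat0, lon0, lat2, lon2);
            let d = (a - b).abs().min(360.0 - (a - b).abs());
            max_diff = max_diff.max(d);
        }
    }
    println!("max difference: {:.4} degrees", max_diff);
    assert!(max_diff < 0.1); // over short distances the cheap formula is fine
}
```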

This wasn’t the only issue affecting our performance, but was one of the main factors. The compass wasn’t calibrated either and wasn’t giving accurate readings, but I am hoping that this is now resolved. I’ll be taking G-Force out for some new test runs soon to see if these changes help.

Calibrating an HMC6352 magnetometer

I’ve started tinkering again with our autonomous vehicle that failed so miserably at SparkFun AVC 2015. My focus so far has been on the compass, which I suspect was one of the bigger contributors to our poor performance. Some tests confirmed that the compass was very inaccurate, so this weekend I took the time to learn how to calibrate it, and it now seems to be very accurate. I didn’t run through this process when we competed in 2014 and we did pretty well then, so maybe we just got lucky.

After studying the datasheet for the HMC6352, I forked the Sparkfun Arduino library and added an example sketch for performing calibration. The library is available here. The calibration process requires the compass to be rotated on a flat surface at a steady pace. I placed the vehicle and my laptop on a lazy susan so I could rotate it with both hands during the calibration process.

New web site for the voice changer shield

Over the Christmas break, I set up a dedicated web site as a central place for information about my voice changer shield kit. I have also converted the software into an Arduino library with example sketches, making installation simpler. As a result, the ring modulator example sketch is now much easier to understand, since all of the low-level SPI code has been moved into the library.
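
The heart of the ring modulator effect is simply multiplying the incoming audio by a carrier sine wave; a carrier around 30 Hz is the commonly quoted value for the classic Dalek sound. A sketch of the math in Rust (the shield itself does this in Arduino C with fixed-point samples; the sample rate and helper names here are illustrative):

```rust
use std::f64::consts::PI;

/// Ring modulation: multiply each audio sample by a carrier sine wave.
/// A carrier around 30 Hz gives the classic Dalek effect.
fn ring_modulate(samples: &[f64], sample_rate: f64, carrier_hz: f64) -> Vec<f64> {
    samples
        .iter()
        .enumerate()
        .map(|(i, &s)| {
            let t = i as f64 / sample_rate;
            s * (2.0 * PI * carrier_hz * t).sin()
        })
        .collect()
}

fn main() {
    // A 440 Hz input tone, one-tenth of a second at 8 kHz.
    let input: Vec<f64> = (0..800)
        .map(|i| (2.0 * PI * 440.0 * i as f64 / 8000.0).sin())
        .collect();
    let output = ring_modulate(&input, 8000.0, 30.0);
    println!("processed {} samples", output.len());
}
```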

For more information, please see the new Ultimate Voice Changer web site.