The Hackaday Effect

So this is what happens when your fun hobby project gets featured on Hackaday. I've had my five minutes of fame and now I've gone back to obscurity. Interestingly, the traffic spike to my YouTube videos only netted me a couple of bucks in revenue, so I'm still several years away from receiving my first $100 payout. I'd better not quit the day job just yet!

On the plus side, all of this attention did result in me selling more than a dozen of my Octasonic boards, which was pretty neat. Best of all, some of them are going to be used in a public art project next month, and that's much more satisfying than making money from this.

I suppose the moral here is that making a fun demo to share with the world is a great way to sell your electronics inventions.


Making music with Ultrasonic Sensors, MIDI, and a Raspberry Pi

I have been working on a fun project to make an exhibit for my local makerspace to take to events. I had a ton of spare ultrasonic sensors lying around so I decided to experiment with turning them into a musical instrument using a Raspberry Pi running a software synth.

This project has now been featured on Hackaday! There is also a very detailed Instructable if you want to replicate this project.

The code is very simple. The Raspberry Pi polls each sensor (with the help of an Octasonic breakout board that I designed about a year ago) and translates the distance into a MIDI instruction ("note on" or "note off"). These MIDI instructions are then piped into the stdin of a fluidsynth process, which converts them into music.
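The translation step can be sketched in Rust. The note range, distance thresholds, and MIDI channel below are illustrative guesses, not the values from the actual project; the command string is the form fluidsynth's interactive shell accepts on stdin.

```rust
const MAX_DISTANCE_CM: u32 = 50; // beyond this, treat as "no hand over the sensor"

/// Map a sensor index and a distance reading to a MIDI note number,
/// or None when nothing is in range. The mapping here is made up:
/// one base note per sensor, shifted by a 10 cm distance band.
fn distance_to_note(sensor: u8, distance_cm: u32) -> Option<u8> {
    if distance_cm == 0 || distance_cm > MAX_DISTANCE_CM {
        return None;
    }
    let base = 60 + (sensor as u32) * 2; // start at middle C
    let offset = distance_cm / 10;       // five bands over 50 cm
    Some((base + offset) as u8)
}

/// Render a "note on" command in the syntax fluidsynth reads from stdin.
fn note_on_command(note: u8) -> String {
    format!("noteon 0 {} 100", note)
}

fn main() {
    // A hand 23 cm above sensor 2 produces one note:
    if let Some(note) = distance_to_note(2, 23) {
        println!("{}", note_on_command(note));
    }
}
```

In the real loop, each line would be written to the child process's stdin (e.g. via `std::process::Command` with piped stdin), followed by a matching `noteoff` when the hand moves away.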

This is my first exposure to MIDI and I love how simple it is to hack music. I’ll post an update once we have this project finished and mounted on a frame.


Rust is for Robots!

andygrove, September 23, 2016

Here are the slides from my talk earlier this week at the Denver/Boulder Rust Meetup. I had a ton of fun putting this project together over a four-week period, and I would encourage others to try their hand at embedded projects with the Raspberry Pi and the Rust programming language. The source code is open source and available here.

Sparkfun AVC 2016

andygrove, September 18, 2016

We were unable to compete at Sparkfun AVC at all this year due to issues with our compass. Things were working pretty well during practice the day before, although the compass was off by about 10 degrees on the run to the first corner. Here's a video from practice that demonstrates obstacle avoidance kicking in when the vehicle was drifting too close to the hay bales.

So we attempted to calibrate the compass that evening, and when I came back to the site on Saturday morning I found that the compass was off by around 100 degrees. I still don't know what we did wrong. I did attempt to get another compass working on the day, but it used a different interface (I2C) and only returned raw x, y, z values, and I'm not familiar enough with the math for turning those numbers into a heading. After trying various algorithms I found online and getting bad readings, I gave up and switched to being a spectator.
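For the record, the math I was missing is fairly small, at least for the simple case. This sketch assumes the magnetometer is mounted flat and already calibrated (no tilt compensation, no hard/soft-iron correction), and the axis convention varies by board, so treat it as a starting point rather than what I should have run that day:

```rust
/// Heading in degrees [0, 360) from the horizontal components of the
/// raw magnetometer reading, assuming a level, calibrated sensor.
fn heading_degrees(x: f64, y: f64) -> f64 {
    // atan2 gives the angle of the horizontal field vector;
    // normalize the result from (-180, 180] into [0, 360).
    let mut heading = y.atan2(x).to_degrees();
    if heading < 0.0 {
        heading += 360.0;
    }
    heading
}

fn main() {
    // Field along +x reads as 0 degrees in this axis convention.
    println!("{}", heading_degrees(1.0, 0.0));
    println!("{}", heading_degrees(0.0, 1.0));
}
```

The raw z value only matters once the sensor tilts, which is when this simple formula starts giving the kind of bad readings I saw.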


To summarize my performance in different areas:

  • The instrumented video really worked well to show me what the issues were and was a huge improvement over looking through log files.
  • The use of the LCD display was very helpful to see what was going on during testing.
  • The ultrasonic sensors were very unreliable at detecting hay bales, and there was huge variance between the different batches of sensors I had. I'm definitely going to switch to LIDAR next time.
  • The GPS and navigation logic worked really well as far as I could tell.
  • The compass was a disaster, giving totally incorrect readings after we calibrated it.

Apart from being better prepared ahead of time next year, a big lesson learned is to have spare parts on hand in case a component fails. I also need to practice on an enclosed course next time, as that is much different to navigating waypoints in a large open parking lot.

This was a pretty intense four weeks building a new autonomous vehicle with Rust and the Raspberry Pi, and I learned a lot about both. The project also gave me an excuse to use Onshape more for designing various mounts that I then 3D-printed.

I’m looking forward to doing much more with Rust and the Raspberry Pi in the future.



AVC update 8/29

andygrove, August 29, 2016

After two pretty intense weekends of learning more about Rust, I now have the new autonomous vehicle basically working. We took it for the very first test run last night and here’s the video as captured and instrumented by the Rust code:

This was pretty good for a first test. The instrumented video is paying off already: I can see from the data that the code is oversteering when turning, resulting in the dizzying left/right motion. This would have been much harder to spot if I were just looking through log files.

One of the main goals of this project is to help accelerate my Rust learning curve, and that is certainly happening. I learned how to implement threads, using an Arc (an atomically reference-counted wrapper for shared state) to share data between the navigation thread and the video thread. I also implemented a simple web interface using Hyper and Iron so that I can start/stop the vehicle from the touchscreen display. I experimented with channels too, and generally I think those would be preferred over Arc and Mutex, but they didn't quite seem to fit my use case.

Apart from fine-tuning the software and performing more testing, the only significant challenge ahead is to add some ultrasonic sensors and an SPI-based board for monitoring them, hooking that up to the Raspberry Pi via a logic-level converter (the board is 5V, and the Pi is 3.3V).



I’m porting my AVC entry to Raspberry Pi + Rust!

I very recently started using the Rust programming language professionally, and although I am still working my way through the learning curve, I feel I am proficient enough to “get it done” even if my code isn’t the most idiomatic Rust. I’m enjoying the language immensely and have been spending evenings and weekends challenging myself to solve various problems in Rust. I’ve even created a dedicated blog to write about my experiences learning Rust: Keep Calm and Learn Rust.

After attending the Boulder/Denver Rust Meetup recently, I agreed to give a talk on Rust at their next meetup on 9/21, perhaps related to IoT or Raspberry Pi.

This past Saturday (8/20) I started playing around with Rust on the Raspberry Pi and started researching how to interact with webcams as well as serial and SPI sensors. Within a couple of hours I was hooked and decided to commit to upgrading my Sparkfun AVC entry from an Arduino platform to a Raspberry Pi platform with as much logic as possible implemented in Rust.

Yes, that’s right … with a perfectly fine working entry, and with exactly four weeks until the event, I decided to completely change the hardware and software architecture. Have I bitten off more than I can chew? Maybe … we will see.

Moving to the Pi opens up so many opportunities. I have already prototyped processing a live video stream using Rust and OpenCV and overlaying text onto the captured frames, so I can include instrumentation data (GPS, compass bearing, etc) in the video. This will really solve the problem I have had of debugging the vehicle after each run to see what really happened.

Moving to Rust is ideal because I can call C code (such as OpenCV) with zero overhead. Rust also ensures that my code will be robust thanks to its memory safety features.

I am now 4 days into the process and have been able to read GPS and compass data successfully using an FTDI Basic breakout to connect the sensors to the Raspberry Pi’s USB ports, and as I already mentioned, I have been able to add real-time instrumentation data to a video stream.

There are challenges ahead, though. I need to figure out SPI communications for the board I have that monitors the ultrasonic sensors (and it runs at 5V rather than the Pi’s 3.3V). I’ll also have to implement a serial protocol in Rust to drive the motor controller board, since there is no Rust source code available for it yet.

You can follow along with my progress in several ways: subscribe to this blog, follow me on Twitter at @andygrove73, or watch my GitHub repository. This is all open source, and I hope I can inspire others to try Rust by sharing this code.

3D Printed Voice Changer Case

In the weeks leading up to the Denver Maker Faire, I decided to design a case to hold an Arduino and the voice changer shield that I had designed. I wanted to make it easy for people to have a go at talking like a Dalek. It turned out to be a huge success!

The voice changer was connected to powered speakers inside the Dalek, and also connected to the LEDs in the dome. This did a pretty convincing job of making it look like the Dalek was talking.

Here are a couple of photos of the final case design.



I’ve uploaded the design to Thingiverse here for any previous voice changer customers who want to print their own case.

Preparing for Denver Mini Maker Faire 2016

I was contacted about five weeks ago to see if I would be interested in taking my Dalek along to the Denver Mini Maker Faire with my local maker space, The Gizmo Dojo. I hadn’t taken the Dalek to an event since 2014 and it was in a bit of a state of disrepair so I decided this would be a good opportunity to make some repairs and improvements. It was also good timing, since I have now learned enough about 3D design and printing to be able to make some custom parts that I was unable to manufacture before.

Here are the before and after photos showing the improvements made over the past five weeks. Hopefully it is obvious which is which.


I originally built the Dalek in 2012 and it was a long project. I eventually got tired of working on it, took some shortcuts with the shoulder section, and was never too happy with the results. To make it look a little more authentic, I have now added some aluminum fly screen under the slats and drilled holes to add bolts that hold the slats in place. This makes the slats stand out from the body and makes them more noticeable.


I made a new eye, plunger arm, and gun. All use PVC plumbing along with 3D printed parts that were then spray painted.



If you want to come along to the faire, it is being held this coming weekend (June 11th/12th) at the Denver Museum of Nature and Science. The Gizmo Dojo booth will be in the Southeast Atrium.

How To Debug An Autonomous Vehicle

With six months to go until Sparkfun AVC 2016, I’ve started work on my entry again (a fairly simple entry based around an Arduino Mega). I’ve learned a lot from last year’s failure and I’m on a mission to make improvements that make it easier for me to debug issues and get my entry performing consistently. In my day job I spend plenty of time debugging code, but debugging an autonomous vehicle is much harder, and physical crashes are harder to recover from than software crashes.

The first step was to buy a GoPro HERO camera (just the basic version, costing a little over $100). This weekend we took G-Force for a test run with the camera attached and got some good footage.

During each run, code running on the Arduino records a bunch of telemetry (compass bearing, GPS coordinates, and ultrasonic sensor data) to an SD card. This is very useful, but it is also difficult to review since there is so much data, and it is hard to correlate with the footage from the GoPro. I really need some visual indicators on the vehicle so that when I review GoPro footage I can see what is happening. For instance, when the vehicle makes a sharp turn, is it for obstacle avoidance or because the GPS data changed?

I had an Adafruit NeoPixel stick lying around, so I hooked that up last night. It has 8 LEDs, and I am using 5 of them to indicate readings from the Octasonic HC-SR04 breakout that is monitoring the ultrasonic sensors (using green, amber, and red to indicate distances). One LED indicates GPS status (red for no fix, then toggling between green and blue as new coordinates are received). The remaining two LEDs indicate obstacle avoidance (left and right). Here’s a short video showing a test of the LEDs.
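The distance-to-color mapping for the sensor LEDs boils down to a few thresholds. The cutoffs below are illustrative guesses, not the values running on G-Force (which is Arduino code anyway); this is just the logic sketched in Rust:

```rust
// Color shown on one NeoPixel for one ultrasonic sensor reading.
#[derive(Debug, PartialEq)]
enum Color {
    Green, // clear ahead
    Amber, // getting close
    Red,   // obstacle imminent
    Off,   // no echo received
}

/// Map a distance reading (None = no echo) to an indicator color.
/// Thresholds are assumptions for illustration.
fn distance_to_color(distance_cm: Option<u32>) -> Color {
    match distance_cm {
        None => Color::Off,
        Some(d) if d > 100 => Color::Green,
        Some(d) if d > 50 => Color::Amber,
        Some(_) => Color::Red,
    }
}

fn main() {
    println!("{:?}", distance_to_color(Some(150)));
}
```

Keeping the mapping in one small pure function like this makes it easy to tweak the thresholds after reviewing footage.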

This weekend I’ll take G-Force out for another test run and capture some footage showing the status LEDs in action.