After two pretty intense weekends of learning more about Rust, I now have the new autonomous vehicle basically working. We took it for the very first test run last night and here’s the video as captured and instrumented by the Rust code:
This was pretty good for a first test. The instrumented video is paying off already: I can see from the data that the code is oversteering when turning, which causes the dizzying left/right motion. This would have been much harder to spot if I were just reading through log files.
One of the main goals of this project is to accelerate my Rust learning curve, and that is certainly happening. I learned how to spawn threads and use an Arc (an atomically reference-counted wrapper for shared state), combined with a Mutex, to share data between the navigation thread and the video thread. I also implemented a simple web interface using Hyper and Iron so that I can start/stop the vehicle from the touchscreen display. I experimented with channels too, and in general I think those would be preferred over Arc and Mutex, but they didn't quite seem to fit my use case.
Apart from fine-tuning the software and doing more testing, the only significant challenge ahead is to add some ultrasonic sensors and an SPI-based board for monitoring them, and to hook that up to the Raspberry Pi through a logic-level converter (the board is 5V and the Pi is 3.3V).