Archives: Robots

New Project: Autonomous Racing Rotorcraft

For the last 6 years I have ever so slowly been learning to fly medium-sized RC helicopters outdoors, in an attempt to become a good enough pilot that I could roboticize one. About a year ago I upgraded to a Blade 450, which let me practice outdoors in a much wider variety of weather conditions and provided a feasible amount of payload. My proficiency is now good enough that I’ve started work on the control and navigation design, with the medium-term goal of entering it in autonomous racing competitions such as Sparkfun’s AVC.

At the top end, this helicopter can lift around 100g, which, while tight, should be completely doable with 2012 componentry. A rough outline of the components I have in mind is:

  • Computer - IGEP COM Module: A gumstix clone out of Spain; they are slightly smaller, slightly more powerful, and can nominally be powered and operated over USB with no baseboard required. (Although I’ve so far found that heat issues pretty much require that a baseboard or other heatsink of some kind be used.)
  • GPS - Sparkfun GS407: We had reasonably good luck with the u-blox 6 chipset on Savage Solder. It doesn’t have great precision, and has no way to bypass the internal EKF, but it at least lets you configure things somewhat.
  • IMU - Currently I’m planning on using the same chips as in the Pololu MinIMU-9 v2 just because I’ve worked with it on Savage Solder, the drift performance is reasonable, and the combined electronics only add a couple of grams.
  • Altimeter - My hope is to use a structured-light solution: a small camera-phone camera module looks down, coupled with a laser pointer offset by some fixed baseline. The laser pointer will alternate on and off from one frame to the next to increase the signal-to-noise ratio. The position of the pointer in the camera’s field of view should then give a relatively accurate altimeter at ranges between 1 and 10m. Higher than that, I am figuring GPS will be good enough to keep it far from the ground.
  • Servo Control and Failsafe - To develop the autonomous system with a minimum of crashes, it will need the ability to be controlled both by a human and by the computer, with an emergency fallback to human control. Also, for simulation modeling purposes, I want to be able to record the human inputs made during manual flights. To avoid requiring hard real-time performance from the Linux-based IGEP board, I will use a separate AVR controller to read, pass through, or command each of the servo channels. The Blade 450 only has a 6-channel transmitter (throttle, pitch, roll, collective, tail, and gyro gain). In case of an upgrade later on, I’ll assume that there are 8 transmitter channels that might need to be manipulated.
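The altimeter idea above boils down to a one-line triangulation: with the laser aimed parallel to the camera’s optical axis, the dot’s pixel offset from the image center shrinks as altitude grows. A minimal sketch, where the focal length and baseline are placeholder assumptions rather than measured values:

```python
def altitude_from_dot(pixel_offset, focal_length_px=700.0, baseline_m=0.10):
    """Estimate altitude by triangulating the laser dot.

    pixel_offset: distance (pixels) of the laser dot from the image
    center, measured along the camera/laser baseline direction.
    """
    if pixel_offset <= 0:
        raise ValueError("laser dot not detected or degenerate geometry")
    # Similar triangles: offset_px / focal_px = baseline_m / altitude_m
    return focal_length_px * baseline_m / pixel_offset
```

With these illustrative numbers (700px focal length, 10cm baseline), a 70px offset corresponds to 1m and a 7px offset to 10m, which matches the intended 1-10m working range; beyond that the dot’s motion per meter becomes too small to resolve, which is why GPS takes over.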

I have a preliminary set of milestones laid out. While each may end up taking significant effort, the list will at least let me track progress. Of course, they don’t all need to be done in order:

  • Demo Servo Control - Demonstrate that the AVR can monitor, passthrough, and command 8 servo channels while simultaneously reading full 100Hz updates from the I2C IMU, and feeding everything back over the USB link without losing accuracy. I have a working implementation of this, which I’ll cover in a later post.
  • Camera and Lens Selected - Prototype and get a camera working with the IGEP COM Module and verify that its optical properties will be sufficient for altimetry and possibly horizontal motion compensation during landing.
  • Block Diagram of Daughterboard - Draw up all the major components required for the daughterboard.
  • Schematic of Daughterboard - Actually capture all the connections and minor components necessary for the daughterboard.
  • Layout of Daughterboard - Get the first rev of the board drawn up and manufactured.
  • Board Bringup - Get the board working, re-print as required.
  • Flight Data Recording and Simulator - Using the flight hardware, record data from flights and use it to build a simulation model. Currently, CRRCsim looks like a suitable dynamics and visualization platform. Its rotorcraft support isn’t great, but with some work I believe I can get it close enough.
  • Navigation Filter - Using the simulator and flight tests, create a navigation solution which can accurately localize the helicopter in 6D.
  • Autonomous Forward Flight - Fly simple autonomous trajectories in fast forward flight.
  • Prove out Altimeter - With manual takeoff and landing, get the altimeter to a satisfactory level of performance.
  • Autonomous Takeoff and Landing - Using the altimetry when close to the ground, make controlled landings and takeoffs.
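For the servo control milestone, the per-channel decision the AVR has to make is conceptually simple: pass the pilot’s sticks through, or substitute the computer’s commands, and fall back to the pilot whenever the computer link looks stale. A sketch of that mux logic in Python for clarity (the mode-switch and stale-link conventions are assumptions about the eventual design, not working firmware):

```python
NUM_CHANNELS = 8  # assume the 8-channel upgrade case

def mux_outputs(rc_inputs, computer_cmds, autonomous_requested,
                computer_link_ok):
    """Select the servo output for every channel.

    rc_inputs:            8 pulse widths (us) read from the transmitter
    computer_cmds:        8 pulse widths (us) from the flight computer
    autonomous_requested: pilot's mode-switch position
    computer_link_ok:     False once computer commands go stale
    """
    # Failsafe rule: the computer only drives the servos when the pilot
    # has requested it AND its commands are fresh; anything else reverts
    # to straight passthrough of the human inputs.
    use_computer = autonomous_requested and computer_link_ok
    source = computer_cmds if use_computer else rc_inputs
    return list(source[:NUM_CHANNELS])
```

Because the fallback condition is evaluated every frame, a dropped USB link or a flipped mode switch hands control back to the pilot on the very next servo update.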

Wow, that is a long road ahead.

Savage Solder - Robogames 2012

Very belatedly reported: Mikhail and I entered Savage Solder in the Robogames 2012 RoboMagellan competition, placing first. The goal was largely to see what we could do in a short period of time. With only about 2 months from start to finish, we put together a machine that won handily, despite having a few significant bugs and barely being tuned for performance.

The strategy was relatively straightforward: start with a capable platform; we used the HPI Savage Flux with a laptop, webcam, GPS, and IMU strapped on top. A Teensy USB was coded to read RC servo inputs, write servo outputs, talk to the IMU over I2C, and read the bump sensor, while connecting to the laptop over USB. A simple Unscented Kalman Filter (UKF) maintained a global position estimate using GPS in UTM coordinates. The car followed trajectories using pure pursuit for steering and a PID controller around velocity. The trajectories were laid out ahead of time with a simple lua script that pointed them towards each of the cones in a series using Dubins curves. A separate target tracker maintained UKFs for one or more visible cones, with candidates produced by a simple visual filter. Once a cone was certain enough, and in the right location, the trajectory tracker switched to making trajectories aiming at the cone and slowing down, until a forward-facing bump sensor tripped. At that point, the car moved on to the next element in its sequence.
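The pure pursuit steering above reduces to a few lines: take a lookahead point on the trajectory in the car’s own frame, fit the circular arc through the car that passes through it, and convert the arc’s curvature to a steering angle with a bicycle model. A minimal sketch, with a hypothetical wheelbase standing in for the real car’s geometry and tuning:

```python
import math

def pure_pursuit_steering(goal_x, goal_y, wheelbase=0.46):
    """Steering angle (radians) toward a lookahead point.

    (goal_x, goal_y) is the lookahead point in the car's frame:
    x forward, y to the left.  wheelbase is an assumed value in meters.
    """
    dist_sq = goal_x ** 2 + goal_y ** 2
    if dist_sq == 0.0:
        return 0.0
    # Pure pursuit: curvature of the arc tangent to the car's heading
    # that passes through the goal point.
    curvature = 2.0 * goal_y / dist_sq
    # Bicycle model: map path curvature to a front-wheel angle.
    return math.atan(wheelbase * curvature)
```

A goal point dead ahead yields zero steering, and points off to the left or right yield symmetric angles, which is what makes the controller so forgiving to tune.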

The biggest thing we left out of the design entirely was obstacle detection and avoidance. With the u-blox GPS we used, we were able to achieve positioning performance that would usually constrain the car to within ±1.5m of a desired path, which, together with careful manual planning, was mostly enough to keep it from running into things. It helped that the landscape for current RoboMagellan events is relatively forgiving, with few dynamic obstacles other than people.

The two biggest failings in the design were: 1) We ran out of time to tune the velocity and acceleration control, so the car never reached its intended speed. In the end, with the controller we had, we tuned the constants a bit to achieve a top speed around 5-7mph, limited mostly by how quickly the controller could slow the car down. The underlying platform is capable of 40mph+ with rapid (~4 m/s^2) acceleration. 2) The mechanism for homing on cones. The target tracking filter operated in the global coordinate system, and while our u-blox was pretty good, it would often drift rapidly by several meters in one direction or another, which could confuse the filter and cause the car to miss the cone. Higher-level logic caused it to retry, but every retry added a significant time penalty, as changing directions is slow on an RC car platform.

In preparing for future competitions, we’re working to get Savage Solder performing up to our original design, and to have it avoid at least some dynamic obstacles, possibly using a visual tracker similar to the cone tracker, so that it is capable of competing in events such as Sparkfun’s AVC.