As part of preparing Savage Solder for events this spring, I’ve been overhauling the computer and sensing platform to have higher reliability and lower weight. One of the changes was a new LCD and input interface.
I wrote up the previous LCD here before; it was the width of the car, fairly heavy, and not resistant to the elements. For this version, I wanted to use a smaller screen and remove the possibility of moisture getting into the case, while still maintaining roughly the same USB interface.
PCB and Assembly
The display I ended up using was the 2.2″ TFT LCD from Adafruit. It is color, has sufficient pixels to display everything we wanted, and has a SPI interface. The other half of the solution is the input device, for which I decided to use an off the shelf MPR121 capacitive touch sensor controller, with a breakout board, also from Adafruit.
As before, these were glued together with a custom board holding an ATMega32U4 and a few other miscellaneous components. It is mezzanine mounted to the LCD and touch board, and has an external JST connector for USB. Also custom printed was the touch sensor itself. It is nothing more than a PCB with solder-masked copper where each of the touch sensitive areas should be, pulled out to a connector on the back.
LCD controller
Touch panel
The entire assembly was designed to be mounted to the lid of an off the shelf enclosure with a clear polycarbonate lid (which is where the rest of the electronics is going). The LCD and touch board each have four mounting holes, which are attached with a bolt, nut and lockwasher assembly. Sealing is accomplished with neoprene o-rings on the outer surface of the lid.
Assembled LCD from front
Assembled LCD from back
Software
The MPR121 has a lot of options, many of which influence the performance of the device. The enclosure I am using is an off the shelf one from BUD (PN-1324-C amazon link). It has a polycarbonate lid approximately 1/8″ thick, which is kind of at the extreme of what you can get capacitive touch sensors to work through. I was inspired by Adafruit’s MPR121 library, but rewrote it to not require Arduino. I had to tweak a number of the MPR121 constants as well to work under the thick polycarbonate:
MPR121 Register               Adafruit   Savage Solder
CONFIG1                       0x10       0x20 (32uA charge current)
Thresholds (touch, release)   12, 6      4, 2
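For reference, here is a minimal sketch of how that configuration might look in non-Arduino AVR code. The register addresses come from the MPR121 datasheet; the i2c_write_reg() helper and the run-mode handling are placeholders for whatever the real firmware does, not its actual interface.

```cpp
#include <stdint.h>

constexpr uint8_t MPR121_ADDR = 0x5A;     // default I2C address
constexpr uint8_t REG_TOUCH_TH0 = 0x41;   // touch threshold, electrode 0
constexpr uint8_t REG_RELEASE_TH0 = 0x42; // release threshold, electrode 0
constexpr uint8_t REG_CONFIG1 = 0x5C;     // AFE configuration (charge current)
constexpr uint8_t REG_ECR = 0x5E;         // electrode configuration / run mode

// Placeholder: assumed to perform a single register write over TWI.
void i2c_write_reg(uint8_t addr, uint8_t reg, uint8_t value);

void mpr121_configure(uint8_t num_electrodes) {
  // Most registers can only be changed in stop mode.
  i2c_write_reg(MPR121_ADDR, REG_ECR, 0x00);

  // 32uA charge current to push the field through the ~1/8" polycarbonate
  // lid (the Adafruit default is 0x10, i.e. 16uA).
  i2c_write_reg(MPR121_ADDR, REG_CONFIG1, 0x20);

  // Much lower thresholds than the Adafruit defaults of 12/6, since the
  // thick lid attenuates the capacitance change of a finger.
  for (uint8_t e = 0; e < num_electrodes; e++) {
    i2c_write_reg(MPR121_ADDR, REG_TOUCH_TH0 + 2 * e, 4);
    i2c_write_reg(MPR121_ADDR, REG_RELEASE_TH0 + 2 * e, 2);
  }

  // Re-enter run mode with the given number of electrodes enabled.
  i2c_write_reg(MPR121_ADDR, REG_ECR, num_electrodes);
}
```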
For rendering text on the display, I also based my code on the Adafruit ILI9340 library. Since this LCD controller has to render text one pixel at a time, it was much slower than the HD44780 text-based controller we were using previously. To get remotely acceptable performance, I had to make the AVR keep a framebuffer around, and only re-draw characters which had actually changed.
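The redraw logic amounts to keeping a shadow copy of the character grid and only touching cells that differ. A minimal sketch, with illustrative grid dimensions and a placeholder for the actual glyph-drawing routine:

```cpp
#include <stdint.h>

constexpr uint8_t kRows = 10;  // illustrative text grid size
constexpr uint8_t kCols = 26;

static char shadow[kRows][kCols];   // characters currently on the glass
static char pending[kRows][kCols];  // characters we want on the glass

// Placeholder: draws one character cell over SPI via the ILI9340.
void draw_char_ili9340(uint8_t row, uint8_t col, char c);

// Pushes only the cells that changed since the last flush, so a mostly
// static display costs almost no SPI traffic or pixel fills.
void flush_display() {
  for (uint8_t r = 0; r < kRows; r++) {
    for (uint8_t c = 0; c < kCols; c++) {
      if (pending[r][c] != shadow[r][c]) {
        draw_char_ili9340(r, c, pending[r][c]);
        shadow[r][c] = pending[r][c];
      }
    }
  }
}
```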
AVC 2014 has come and gone! Savage Solder placed 2nd in the doping class this year, with two of three successful runs. The first run ended after about 5 feet when the replacement ESC seemed to overheat after being left on for too long before the starting line. The second run was flawless, hitting the hoop and the ramp. The third run was nearly perfect, only missing the hoop.
Congrats to all the other teams, there were a lot of successful and inventive entries this year!
I’ve found two good videos of the third run so far, the first from the official livestream at time 12:00
One of the improvements we made for this year’s Savage Solder entry into the Sparkfun AVC (and maybe other competitions, I’m looking at you Vecna Robot Race), is an updated baseboard. Our old baseboard was a Teensy 2.0+ soldered onto a hand-built protoboard and then wired up to an external IMU mounted on the front of the chassis. For fun, I decided to replace that board with a more integrated unit which installs more cleanly into the car chassis and is based on the firmware I’m developing for the ARR.
Hardware
The board hardware includes:
3 servo inputs: The Savage Flux RX receiver has 3 outputs, for throttle, brake, and an auxiliary channel.
4 servo outputs: We are only using 2 channels currently, but it was trivial to wire up 2 more.
Pins and mounting for stacked MiniIMUv2: Rather than have the IMU need a long-run cable and dedicated mounting, it is now integrated into the baseboard. The I2C bus can be run at 400kHz in this configuration, and thus the IMU can be sampled at a higher rate if necessary.
USB device port for connection to host PC: We use a small form factor Intel motherboard as a host computer; the USB connection, while not particularly reliable, is sufficient for the competitions Savage Solder enters.
Sensored motor feedback for odometry: This provides the primary odometry for the car.
General purpose digital inputs for emergency switch, and bumper switches: Our emergency override switch and robomagellan bump switches can all be treated as simple logic inputs.
Battery voltage sense: One ADC channel was wired to a connector to measure the drive battery voltage.
oshpark.com rendering
This was another oshpark.com job, and the layout was relatively straightforward. The biggest wrinkle was that the first revision was non-functional due to a broken AVR Eagle package. Who would figure that the no-name library I downloaded from some shady site wouldn't have a ground pad? The crazier thing was that it kind of worked, even though all the vias it placed under the part were shorted out by the pad.
Baseboard fully populated
The board was mechanically designed to fit into an otherwise unused space forward of the steering servos on our Savage Flux chassis. There are two screws there that attach the chassis to some suspension components on the underside. I just replaced those two screws with slightly longer ones, and added a spacer underneath.
Baseboard installed in Savage Solder
Firmware
As I mentioned before, the firmware was based on the one I am creating for the ARR. The basic architecture is the same: a master computer communicates with the baseboard over USB, and the baseboard provides high rate servo control, I2C communication, and other miscellaneous functionality. I've written up the basic servo functionality and engineering test fixture of the firmware previously. Versus our previous controller, the servo control is much improved, both in the precision and accuracy of input and output, and in the number of available channels. While the firmware supports 8 input and 8 output channels, the Savage Solder baseboard only has 3 inputs and 4 outputs routed to connectors.
I added a few new pieces of functionality to make the firmware usable for Savage Solder, namely:
Configurable Emergency Passthrough / Override: The controller can be configured with two very short functional scripts which can be used to evaluate various servo and GPIO conditions. One determines whether all servos should be placed in the pass through configuration, and the other determines if computer control should be allowed. These allow both for the Savage Solder emergency safety scheme and for whatever scheme I use on the ARR in the future.
Encoder operation: Savage Solder uses the sense leads from our sensored brushless motor as an encoder. The firmware measures these to report total distance traveled. Versus the old firmware, it now counts in true quadrature, giving 4 times the resolution (see the sketch after this list).
GPIO operation: The board exposes control and sensing of a few general purpose input and output pins. They are used to control the LEDs on the board, sense the onboard emergency override button, and sense the bump switches we use for robomagellan style competitions.
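For illustration, here is a minimal sketch of 4x quadrature counting from two sense lines. The pin-reading helper is a placeholder, and the real firmware's interrupt wiring will differ; the lookup table itself is the standard construction.

```cpp
#include <stdint.h>

// Placeholder: returns the two sense lines as bits 0 and 1.
uint8_t read_sense_pins();

// Indexed by (prev_state << 2) | state; maps every 2-bit state
// transition to -1, 0, or +1 counts. Invalid (double-step) transitions
// count as 0.
static const int8_t kQuadTable[16] = {
   0, +1, -1,  0,
  -1,  0,  0, +1,
  +1,  0,  0, -1,
   0, -1, +1,  0,
};

static volatile int32_t g_odometer;
static uint8_t g_prev_state;

// Called from a pin-change interrupt (or polled fast enough to never
// miss a transition).
void quadrature_update() {
  const uint8_t state = read_sense_pins() & 0x03;
  g_odometer += kQuadTable[(g_prev_state << 2) | state];
  g_prev_state = state;
}
```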
Testing
The baseboard has been installed in the car since late July 2013. Yes, this post is very belated! In that time, it has been used for all of our testing and appears to be working quite well. Or rather, it hasn’t been the source of any problems yet!
While it caused us some grief during the actual competition, Savage Solder for the most part successfully used a vision based system to detect and track the bonus hoop necessary for extra points in this year’s Sparkfun AVC. The premise of our entry into the competition was a relatively cheap car, with nothing but cheap sensors. We compensated for our low accuracy GPS with a camera which allowed us to avoid barrels, and find the hoop and ramp at speed. Our barrel and hoop detection and tracking is basically unchanged from the cone detection we use in robomagellan style events. You can find a series describing it in part 1, part 2, and part 3. In this post, I’ll describe specifically our hoop detection algorithm, and give some results showing how well it performs on real world data.
Original Image from Webcam
As with the cone detection system earlier, we use a standard USB webcam with a pipeline that consists of part OpenCV primitives and part custom detection routines. The output of the algorithm is an estimated range and bearing (and their estimated errors) to any hoops that might be present in the current frame.
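The write-up doesn't detail the range and bearing computation itself, but with a single camera it presumably follows from the apparent size and position of the detected circle. A hedged sketch using a pinhole camera model, with stand-in values for the hoop radius and camera intrinsics:

```cpp
#include <cmath>

struct RangeBearing {
  double range_m;
  double bearing_rad;
};

constexpr double kHoopRadiusM = 0.75;      // assumed physical hoop radius
constexpr double kFocalLengthPx = 700.0;   // assumed camera focal length
constexpr double kImageCenterXPx = 320.0;  // assumed principal point

RangeBearing EstimateHoop(double center_x_px, double radius_px) {
  RangeBearing result;
  // Similar triangles: the pixel radius shrinks linearly with distance.
  result.range_m = kFocalLengthPx * kHoopRadiusM / radius_px;
  // Horizontal offset from the optical axis gives the bearing.
  result.bearing_rad =
      std::atan2(center_x_px - kImageCenterXPx, kFocalLengthPx);
  return result;
}
```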
In the example below, I’ll show how the process works on the sample image to the right, taken from Savage Solder during the AVC 2013.
Details
Edge Detection
Step A: First, we apply a Canny edge detection filter. We just use the default OpenCV “Canny” function. The thresholds are selected as part of our overall tuning process, although an initial guess was made based on what looked reasonable in some sample data sets.
Step B: As with the cone/barrel/ramp detection, we compute the distance transform. Here, since the objects of interest are thin lines, the distance transform lets us quickly compute how far away individual points are from that line. This is done using the OpenCV “distanceTransform” method.
Distance Transform
Step C: Using the original edge detected image, the OpenCV “findContours” method is used to identify all the contiguous lines. Contours which have fewer than a configurable number of points are dropped from the list of contours.
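Steps A through C map onto OpenCV calls fairly directly. A compact sketch, with placeholder thresholds standing in for the tuned values:

```cpp
#include <algorithm>
#include <vector>
#include <opencv2/imgproc.hpp>

void Preprocess(const cv::Mat& gray,
                cv::Mat* dist,
                std::vector<std::vector<cv::Point>>* contours) {
  // Step A: Canny edge detection; thresholds are placeholders for the
  // values picked during tuning.
  cv::Mat edges;
  cv::Canny(gray, edges, 50.0, 150.0);

  // Step B: distanceTransform measures distance to the nearest *zero*
  // pixel, so invert the edge image first; dist(x, y) then holds the
  // distance from (x, y) to the nearest edge pixel.
  cv::Mat inverted;
  cv::bitwise_not(edges, inverted);
  cv::distanceTransform(inverted, *dist, cv::DIST_L2, 3);

  // Step C: pull out contiguous edge chains and drop the short ones.
  cv::findContours(edges, *contours, cv::RETR_LIST, cv::CHAIN_APPROX_NONE);
  contours->erase(
      std::remove_if(contours->begin(), contours->end(),
                     [](const std::vector<cv::Point>& c) {
                       return c.size() < 30;  // placeholder minimum
                     }),
      contours->end());
}
```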
Step D: Now comes the first key part of the algorithm. At this point, we pick a random contour from our list of valid ones, then pick three random points on that contour. A circle circumscribing those three points is drawn to create a "proto-hoop". A number of preliminary checks are run; if any of them fail, the proto-hoop is discarded and another sample is selected:
A Single Sampled Circle
Are all 3 points on the upper half of the circle?
Is the upper half of the circle entirely in the visual frame?
Is the center of the circle above a configurable minimum height?
Is the radius larger than a configurable minimum size?
If all the checks pass, then the circle is passed on to the next step.
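The circle through three points is the standard circumcircle construction. A sketch (nearly collinear samples are rejected, since no finite circle passes through them):

```cpp
#include <cmath>
#include <opencv2/core.hpp>

struct Circle {
  double cx, cy, r;
};

bool Circumcircle(cv::Point2d a, cv::Point2d b, cv::Point2d c, Circle* out) {
  // Twice the signed area of the triangle; zero means collinear points.
  const double d =
      2.0 * (a.x * (b.y - c.y) + b.x * (c.y - a.y) + c.x * (a.y - b.y));
  if (std::abs(d) < 1e-9) return false;

  const double a2 = a.x * a.x + a.y * a.y;
  const double b2 = b.x * b.x + b.y * b.y;
  const double c2 = c.x * c.x + c.y * c.y;
  out->cx = (a2 * (b.y - c.y) + b2 * (c.y - a.y) + c2 * (a.y - b.y)) / d;
  out->cy = (a2 * (c.x - b.x) + b2 * (a.x - c.x) + c2 * (b.x - a.x)) / d;
  out->r = std::hypot(a.x - out->cx, a.y - out->cy);
  return true;
}
```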
Step E: At this point, we have a proto-hoop drawn in the image, but it might be three points that touched a square building, or a cloud, or a few people practicing their cheerleading. To distinguish these cases, we evaluate a support metric to determine how likely it is actually a hoop. To do so, we step through each pixel in the upper half of the circle in a clockwise manner. For each pixel, we compute two separate metrics which are later summed together. For clarity, the metric is defined such that smaller is better.
The square of the distance from that point on the circle to the nearest edge pixel. This prefers circles which are close to edges for most of their path.
The square root of the distance minus the square root of the previous point’s distance. This portion of the metric penalizes circles which are near very jagged edges. This is intended to penalize shapes like trees which are often circular-ish, but not very smooth.
If the support metric is too large for a given circle, then that circle is dropped and another sample is chosen.
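A sketch of the support evaluation, assuming the distance-transform image from step B. Note one assumption: the jaggedness term is taken here as the absolute difference of square roots, since a plain signed difference would telescope to almost nothing when summed around the circle.

```cpp
#include <algorithm>
#include <cmath>
#include <opencv2/core.hpp>

// Same fields as the circumcircle sketch above.
struct Circle {
  double cx, cy, r;
};

// "dist" is the CV_32F distance-transform image from step B; smaller
// return values indicate better support.
double SupportMetric(const cv::Mat& dist, const Circle& c) {
  double metric = 0.0;
  double prev_sqrt_d = -1.0;
  // Walk the upper half of the circle (y grows downward in images, so
  // angles pi..2*pi are the upper half) in roughly one-pixel steps.
  const int steps = std::max(8, static_cast<int>(M_PI * c.r));
  for (int i = 0; i <= steps; i++) {
    const double angle = M_PI + M_PI * i / steps;
    const int x = static_cast<int>(c.cx + c.r * std::cos(angle));
    const int y = static_cast<int>(c.cy + c.r * std::sin(angle));
    if (x < 0 || y < 0 || x >= dist.cols || y >= dist.rows) continue;
    const double d = dist.at<float>(y, x);
    // Term 1: squared distance to the nearest edge pixel.
    metric += d * d;
    // Term 2: jaggedness penalty between consecutive samples.
    if (prev_sqrt_d >= 0.0) {
      metric += std::abs(std::sqrt(d) - prev_sqrt_d);
    }
    prev_sqrt_d = std::sqrt(d);
  }
  return metric;
}
```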
Step F: Go back to step D and repeat a couple of hundred times.
Step G: Finally, before reporting potential hoops, a last pass is taken. The proto-hoops are examined in order from best support to worst. For each one, a local optimization function is run to find a nearby circle center and radius which improves the overall support. The first proto-hoop is added directly to the output list; subsequent hoops are compared against previously reported ones, and circles that substantially overlap are discarded as likely false positives.
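Omitting the local optimization, the duplicate-removal pass might look like the following sketch. The overlap test (center distance versus the summed radii) is an illustrative stand-in for whatever test the real code used:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct ScoredCircle {
  double cx, cy, r;
  double support;  // smaller is better
};

std::vector<ScoredCircle> SelectHoops(std::vector<ScoredCircle> candidates) {
  // Best (lowest) support first.
  std::sort(candidates.begin(), candidates.end(),
            [](const ScoredCircle& a, const ScoredCircle& b) {
              return a.support < b.support;
            });
  std::vector<ScoredCircle> accepted;
  for (const auto& c : candidates) {
    bool overlaps = false;
    for (const auto& kept : accepted) {
      const double d = std::hypot(c.cx - kept.cx, c.cy - kept.cy);
      if (d < 0.5 * (c.r + kept.r)) {  // placeholder overlap threshold
        overlaps = true;
        break;
      }
    }
    if (!overlaps) accepted.push_back(c);
  }
  return accepted;
}
```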
Final Result: Two Detections
At this final stage, there are now a set of curated hoop detections for the given frame! The results are passed up to the tracker module, which correlates targets from frame to frame, discarding ones that don’t behave like a hoop should over time.
Results and Future Work
We only got the detector working with reasonable quality 2 or 3 weeks before the competition, so our corpus of images was not that large. However, in the dataset we do have, it is able to reliably detect the hoop out to about 6 meters of range with a manageable level of false positives. The table below shows the performance we had measured using data we took at our local testing sites.
Range (m)   Detections   False Detections
2-4         100% (18)    5.3% (1)
4-6         76% (16)     61% (25)
6-8         23% (16)     73% (14)
While the false positive level might seem high, the subsequent tracking is able to cope as the false positives tend to be randomly scattered about, and thus don’t form stable tracks from frame to frame. The detection rate within 6 meters is high enough that real hoops are detected 3 out of 4 times, which lets them be tracked just fine.
This approach seemed to work relatively well, although our corpus was small compared to what we had for barrels and ramps. For future AVC competitions, we will likely use the same or a similar technique, but trained on a larger dataset to avoid systematic failures like hoop-shaped clouds, trees, or buildings.
We ran Savage Solder in two competitions this spring, Robomagellan in Robogames 2013, and the Sparkfun Autonomous Vehicle Competition 2013. In neither contest did we fare particularly well, especially given the level of preparation we put in. I’ll describe our results here, along with a little analysis from each event.
Robogames 2013 Robomagellan
Robogames 2013
In brief, the Robomagellan event requires that your robot touch a target orange traffic cone. It’s made harder because the cone can be in an arbitrary outdoor environment, with challenging terrain, changing features, people, GPS obstructions and the like. Your robot can achieve time deductions by touching other “bonus” cones. You get 3 runs, and your score is the best of the three.
Savage Solder only made two runs this year. A summary of the failure modes:
First Run: After touching the 25% bonus cone, our u-blox GPS had a slew event, reporting a position 20m away from where the car actually was. This resulted in the early termination of our run when the car hit a trash can. Our Robomagellan code has no obstacle avoidance, and it thought the trash can was the final cone, so that ended our run.
Second Run: After the competition had started, the event center's groundskeepers decided to place some picnic tables on the competition area. They put one right in the middle of our surveyed path about 30 seconds before we had to start. At that time, changing the plan required surveying a series of waypoints with our robomagellan code, which we could not do on such short notice. The car got unlucky: it clipped the side of the table hard enough to make it think it had hit a cone, then reversed. When it reversed, it got hung up on the picnic table, burning out our ESC. We had no spare ESCs, so that ended our Robomagellan event.
Sparkfun AVC 2013
Sparkfun AVC 2013
The Sparkfun Autonomous Vehicle Competition, held for the last couple of years, requires the robot to drive around a loop course. Bonus points are awarded for passing through a medium sized metal hoop and jumping over a ramp. This year, the overall score was the sum total of three runs.
After our disappointing Robomagellan performance, we doubled down for Sparkfun AVC. We replaced our ESC and got spares... of nearly everything. We built a ramp and a hoop, and obtained barrels. For the two weekends before the event, the car was autonomously completing our practice course 100% of the time, hitting both the hoop and the ramp about 90%-95% of the time over maybe a hundred runs.
How did this translate into our competition performance? We completed one run, failed after the second corner on the second run, and failed after the first corner on the third run. End result: not so hot.
First Run: This run was flawless. We set our speed at 11mph, on the theory that we had completed most of our testing at that speed, and that we didn't know how fast, or how successful at hitting the props, the other teams would be.
Second Run: Here, after seeing Netburner's fast run, we increased our maximum speed to 15mph to shave some time off. The speed didn't end up being a problem, aside from Netburner harmlessly rear-ending us going into the second turn. The clouds, however, were a problem. When homing in on the hoop, our hoop detector got a glimpse of some very hoop-shaped clouds off in the distance and went for them. This ended badly: the car clotheslined itself, then ran into a couple of hay bales and the curb. This was the first time we had seen a false positive on clouds in all our testing.
Third Run: I went back to the image data from our first and second runs, and updated our hoop detection constants to keep the same detection rate but weed out all the clouds. We went with a speed of 15mph again. However, right before we started, I noticed on our display that the GPS was not reporting data regularly. In fact, it was barely reporting data at all. As with Robomagellan and the picnic tables, this problem appeared right before the starting gun, so there was nothing we could do but start the car and watch. And watch we did, as it collided with the fence after the first turn. The root cause: the u-blox doesn't seem to be able to emit NMEA messages longer than 364 characters when run at 115200, and thus the u-blox's PUBX,3 NMEA message gets truncated once the GPS has more than about 19 satellites in its tracking filter. Our GPS interface software requires all the messages it uses to be valid (with checksums) before emitting a fix upstream, so it wasn't reporting anything most of the time. Looking back at the data from all our Boston testing, hundreds of hours worth, this had only happened for a very brief time, maybe 10 seconds total, as we don't have the same clear view of the sky as in Boulder.
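The checksum gate is why the truncation silenced the GPS entirely: a truncated sentence loses its trailing "*XX" checksum field and fails validation. A minimal sketch of standard NMEA checksum checking (the checksum is the XOR of every character between '$' and '*'):

```cpp
#include <cstdio>
#include <cstring>

bool NmeaChecksumValid(const char* sentence) {
  if (sentence[0] != '$') return false;
  unsigned char sum = 0;
  const char* p = sentence + 1;
  // XOR everything between the '$' and the '*'.
  for (; *p && *p != '*'; p++) sum ^= static_cast<unsigned char>(*p);
  // A truncated sentence is missing the "*XX" tail and fails here.
  if (*p != '*' || std::strlen(p) < 3) return false;
  unsigned int reported = 0;
  if (std::sscanf(p + 1, "%2x", &reported) != 1) return false;
  return reported == sum;
}
```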
Fixes
u-blox GPS slew: We have seemingly resolved this by using the RMS of the reported range residuals as a floor on the measurement noise of the position input to our Kalman filter (see the sketch after this list). Since that change, u-blox slews, while they still happen, don't significantly impact the trajectory of the car.
Last minute picnic tables: For AVC 2013 and beyond, we’ve switched from surveying points with a handheld GPS to drawing interpolated curves on satellite maps which need maybe a one time registration. This allows us to change paths quickly as necessary.
Hoop shaped cloud: This one we fixed at the event by tuning constants. We didn’t get our hoop detector working until about 3 weeks before the event, so our corpus of hoop images was relatively small. If we run Sparkfun AVC again, I’ll also create a larger corpus of hoop images including some fluffy cloud days.
u-blox limited NMEA length: Here we are switching to the u-blox binary protocol, which both has shorter messages and is presumably better tested by u-blox.
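As a rough illustration of the residual-based noise floor described in the first fix above: the position measurement noise handed to the filter is never allowed below the RMS of the receiver's reported range residuals, so a slewing fix with large residuals is trusted less. The function name and the simple max() are assumptions for the sketch, not the car's actual filter code.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Returns the standard deviation to use for the GPS position
// measurement: the nominal value, floored by the RMS of the
// receiver-reported range residuals.
double PositionMeasurementStddev(const std::vector<double>& range_residuals_m,
                                 double nominal_stddev_m) {
  double sum_sq = 0.0;
  for (double r : range_residuals_m) sum_sq += r * r;
  const double rms = range_residuals_m.empty()
                         ? 0.0
                         : std::sqrt(sum_sq / range_residuals_m.size());
  return std::max(nominal_stddev_m, rms);
}
```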
Conclusions
Across the two spring competitions, we had 1 successful run out of 5. Of the 4 failed runs, 3 were from root causes we had never seen before, or had seen far too infrequently to diagnose. The 4th (the u-blox slewing that caused our trash can hit) had been happening only every 4 or 5 robomagellan runs, so it was a known, but limited, risk.
Obviously, we are fixing all of the individual failure modes that we observed. I am not sure what the larger lesson is, other than that there are many, many failure modes and the only way to find them all is exhaustive testing. Maybe using more expensive components would have helped: the cheap u-blox and our ESC with no overcurrent protection cost us 3 runs total (4 if you count our missed third Robomagellan run).
I suppose the ultimate conclusion is that building robots remains hard.
Savage Solder successfully participated in the superbly run Sparkfun AVC 2013 last weekend. The car ended up not running nearly as well as we had hoped, but we still had one flawless run which put us in third place in the doping class.
Sparkfun hasn’t posted a recap yet, but there are two videos online which mostly capture our one good run.
I will write up a post-mortem covering AVC 2013 and RoboGames Robomagellan 2013 in the coming weeks.
Today was the first day Savage Solder was firing on all cylinders in our Sparkfun AVC 2013 practice. We ran a pretty large number of circuits, and all systems seemed to be working pretty well, missing the hoop and ramp only a couple of times. Here is a video of a flawless run through the course at 11mph.
We got in our first day of autonomous practice this weekend for the Sparkfun AVC 2013. Some simple issues gave us problems in the morning, but by the end of the day we had resolved or worked around enough of them to achieve pretty good runs on a practice course. I've got video of our best run below, but first a couple of caveats:
We have a debugging creep at the start of the run to verify heading.
Our practice hoop wasn’t on the course. We don’t yet have a visual detector for the hoop, or our “hoop deflector” mounted on the car.
There is a debugging pause after it slaloms through the barrels, before it moves on to the ramp.
Today we took Savage Solder out to take some calibration image data of the various Sparkfun AVC obstacles and to experiment with manual navigation through an AVC style course. Unfortunately, we had some electronics failures with our replacement ESC that will take some repair. Before that though, we managed to collect a fair amount of useful data, and did a few jumps off the ramp. While not impossible, it looks to be pretty challenging. For small cars, the end of the ramp is actually pretty high. For large cars, the ramp is surprisingly narrow and was not trivial to hit when driving manually. This video is of the slowest run we dared do; higher speeds made for better landings.
Along with many others, we’re hoping to enter our car into the Sparkfun AVC 2013 this year! In support of our effort, I put together some datasets and 3D assets that might be useful to other participating teams, especially if they do any work in simulation.
Georeferenced Data
First, I used the most recent measurements from April 9th on the AVC website to create a kml file showing where each of the obstacles is, along with the course boundaries. I’ve included a snapshot from Google Earth below to show what it contains.
Sparkfun AVC 2013 Course Layout
Next, I extracted some freely available USGS ortho imagery and created a cropped version covering just the course. I essentially used the techniques described in my post on LIDAR imagery, except that the ortho imagery came from the USGS National Map Viewer. I created both a geotiff that has georeferencing information, and an opaque .png which is projected identically, but has no embedded information. On that last image, I did a little gimp hackery to edit out cars which would be in the path of the actual course.
I also made a set of somewhat (OK very) crude blender models of each of the different obstacles present on the course to approximately correct dimensions. This includes the barrels, the hoop, and the ramp.
avc2013-barrel.blend
avc2013-hoop.blend
avc2013-ramp.blend
Drivethrough Video
Finally, just for fun, I recorded a screencast of one of the first drivethroughs of Savage Solder on the simulated Sparkfun AVC Course. This was with just the bare minimum of changes from what we’ll use at RoboMagellan in Robogames in two weeks, so for now it was navigating purely by GPS and IMU. It drives through the hoop about one time in three, which is enough to catch a good video! As in my previous simulation videos, the yellow line is where the absolute surveyed GPS path is. The green line is where the car thinks the GPS path is at any moment, which will differ due to GPS error and heading error.