Archives: 2012-12

Autonomous Racing Rotorcraft: Initial Camera Exploration: System Image

In my previous post I described how I wired up an OV9650-based camera module to my IGEP COM Module. The ultimate goal, of course, is a low cost, reliable altimeter for my autonomous racing helicopter. When we left off, I had the camera breadboarded up and connected to the IGEP COM Module, and wanted to drive it over the I2C interface to verify it was working. While the IGEP COM Module’s camera I2C interface is easily exposed to linux user space, the OV9650 doesn’t respond to I2C commands without a clock being present. Turning on that clock turned out to be quite a challenge.

First, I explored existing user space tools which might be able to poke directly into /dev/mem and activate the clock. Promisingly, I found omapconf, a user space tool which claims to provide complete control over the clock systems of TI’s OMAP chips. Unfortunately, it only operates on the OMAP4 and the unreleased OMAP5, not the OMAP3 that is in the IGEP COM Module. The source code wasn’t particularly helpful for quick inspection either. There is a lot of complexity in manipulating the processor’s clock registers, and implementing it from the datasheet didn’t seem like a particularly fruitful use of time. Thus discouraged, I moved on to the next possibility: kernel mode operations.

The linux kernel for the IGEP COM Module already had routines designed to manipulate the clock registers, so why not use those? Well, to start with, building a custom kernel (and a suitable system image) for this board was hardly well documented. After much consternation, and then much patience, I ended up piecing together a build setup from various IGEP wiki pages that allowed me to mostly reproduce the image that came on my board. It consisted of the following steps:

  • Yocto: A tool for creating embedded linux distributions. It is based on OpenEmbedded recipe files, and includes reference layers for building a distribution.

    git clone -b denzil git://git.yoctoproject.org/poky
    
  • ISEE Layer: ISEE publishes a yocto layer for their boards which includes recipes for a few packages that don’t yet exist in yocto.

    git clone -b denzil git://git.isee.biz/pub/scm/meta-isee.git
    

Once the relevant git trees were cloned, I still had to generate a build/ directory using the “oe-init-build-env” script, and modify files in that directory to configure the build. The How to get the Poky Linux distribution wiki page on the ISEE website had some hints there as well.

With this configured, I was able to get a system image that closely matched the one that came with my board. In addition, I added a local layer to hold my modifications, then proceeded to switch to the as-yet-unreleased ISEE 3.6 kernel, which has support for the ISEE camera board. My supposition is that the OV9650 camera I’m working with is close enough to the mt9v034 on the CAM BIRD that I will run into many fewer problems using their kernel. For my local development, I cloned the ISEE kernel repository, and pointed my local layer’s linux kernel recipe at that local git repository. This allows me to build local images with a custom patched kernel. Now I am truly able to move on to the next step: driving the clock!

ARR: Initial Camera Exploration: Wiring

What I expected to be the hardest hardware portion of the ARR has not disappointed – the camera. To recap, I am in the process of designing and building an autonomous helicopter. As part of that, I wanted to use a vision system combined with a laser pointer to provide low cost altimetry during takeoff and landing. The IGEP COM Module has a dedicated camera port with support for both parallel and serial camera interfaces, and there are a large number of cheap cell phone camera modules out there which should be sufficient. How hard can it be?

I started with a TCM8230MD from SparkFun, for which I simultaneously ordered a breakout board from some random online retailer. Unfortunately, the breakout board never came. I tried using some 26 gauge wire I had around and a magnifying glass to solder up a by-hand breakout board, to no avail. Finally, I broke down and found http://sigalrm.blogspot.com/2011/03/tcm8230md-breakout.html, which posted eagle files for a breakout board. I dutifully submitted them to batchpcb and waited.

TechToys OV9650 Module

While I was waiting for batchpcb, I simultaneously ordered an OV9650 based board from techtoys, who also sold a breakout (pictured right). While the breakout was on 1mm spacing instead of 0.1in spacing, it was still a lot easier to experiment with. The Hong Kong post has gotten remarkably good, because this development board arrived long before the batchpcb breakout did, so I started integrating it. This board operates at 3.3V, while the TI DM3730 in the IGEP COM Pro uses 1.8V for all of its external IO. To mediate between them for logic level signals, I used a bank of SN74LVC245ANs from digikey, along with 2 discrete N-FETs to shift the I2C bidirectional signals as described in this application note from NXP.

The next challenging step was getting the camera port on the IGEP COM Pro into a usable form. It has a 27 pin 0.3mm FFC style connector, which if you haven’t seen one before is tiny! To make this work, I got a 4 inch passthrough cable and mating connector from digikey, and this 0.3mm FFC breakout from proto-advantage.com. I soldered the FFC connector down to the breakout using my hot air rework station, but should have used an iron and solder wick, as the plastic on the connector melted and bubbled up a bit during soldering. Fortunately, it still worked, so I didn’t need to use my spare.

20121222-arr-ov9650-breadboard.jpg

At this point, I had the camera and IGEP COM Module nominally wired up in a way that should be usable. The final breadboard is pictured above. To make progress from here, I wanted to try a smoke test: communicate with the camera over its I2C interface to verify that it is powered and working. However, I quickly discovered that it wouldn’t do anything unless it was supplied with a valid clock. This turned into my next odyssey – trying to enable the clock output on the TI DM3730 in the IGEP COM Module. That is involved enough to be the subject of a later post.

ARR: Platform Controller Testing

This is a followup to my last post discussing the initial platform controller feasibility experiments for the ARR, my autonomous helicopter.

We left off last time with a couple of candidate designs for the platform controller and some rough requirements that it needed to meet. What I wanted to do was devise a test procedure that could ensure that it met all of the functional and performance requirements, preferably over a long period of time.

Basic Functionality Testing

First, the controller has a moderate amount of relatively performance-insensitive functionality that could potentially regress. This includes things like reading GPIO inputs, configuring which inputs to stream, and communicating over the I2C bus. For these, I created a pyunit test suite communicating with the device over its USB-serial interface with pyserial. The suite’s setup function forces the platform controller to reset, so each case starts from a known state. From there, I created maybe a dozen tests which exercise each piece of functionality and verify that it works as expected.

Input Servo Pattern Generation

Next, the platform controller needs to be able to sample 8 servo channels with sub-microsecond precision. To verify this, I needed to generate 8 channels of data which were both a) easily verifiable as correct, and b) likely to turn up most of the likely causes of performance problems. My eventual solution was a very simple delay-loop AVR program, running on a separate microcontroller, which emits 8 channels of servo data with pseudorandom values from a known linear feedback shift register (LFSR) equation. I used a 16 bit LFSR, emitting only the upper 11 bits as the pulse width. Each channel was exactly one LFSR step behind the previous one in time, so that over a given 8 pulse cycle, you will have seen 8 successive values of the LFSR. This made it pretty easy for the receiver to figure out what the remaining 5 bits should have been, even with errors of several LSB (least significant bits) on several of the channels. For example, a few cycles of this pseudorandom data look like:

#        Channel pulse width in hex.
# Cycle   1   2   3   4   5   6   7   8
   1     370 1B8 0DC 46E 237 11B 48D 246
   2     1B8 0DC 46E 237 11B 48D 246 123
   3     0DC 46E 237 11B 48D 246 123 491

This approach has a couple of benefits:

  • Each 50Hz cycle is independently verifiable. All you need to do is scan through all possible “hidden” 5 bits and see if any result in a close match across all channels. You can also see just how much error there is on each channel.
  • Every channel changes drastically at each cycle. Some problems could be masked by slowly time varying inputs. Since at each cycle, the input changes in a very random way, this isn’t a problem.
  • The emitter can run open loop. No test harness communication is needed with the servo control emitter whatsoever. It just emits the random 8 channels of servo data indefinitely.
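The generator and checker can be sketched in host C. Note that the post doesn’t say which LFSR polynomial was used, so the taps below (0xB400, a standard maximal-length 16-bit Galois polynomial) are an assumption, and this checker does an exact match rather than tolerating a few LSB of error:

```c
#include <stdint.h>

/* One step of a 16-bit Galois LFSR.  The taps (0xB400) are an
 * assumption; the post does not give the actual polynomial. */
static uint16_t lfsr_step(uint16_t s) {
    return (uint16_t)((s >> 1) ^ ((s & 1u) ? 0xB400u : 0u));
}

/* Verify one 8-channel cycle: brute-force the 5 hidden low bits of
 * the first channel's LFSR state, then check that stepping the LFSR
 * reproduces the upper 11 bits seen on the other 7 channels. */
static int verify_cycle(const uint16_t ch[8], uint16_t *hidden) {
    for (uint16_t c = 0; c < 32; c++) {
        uint16_t s = (uint16_t)((ch[0] << 5) | c);
        int ok = 1;
        for (int i = 1; i < 8 && ok; i++) {
            s = lfsr_step(s);
            ok = ((uint16_t)(s >> 5) == ch[i]);
        }
        if (ok) {
            if (hidden) *hidden = c;
            return 1;
        }
    }
    return 0;
}
```

The emitter side is the same lfsr_step, pulsing out s >> 5 as each successive channel’s width.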

I wired up this separate AVR to the platform controller on a breadboard, and first added a few simple tests to the pyunit test suite that checked just a couple of cycles to verify basic consistency. Once that was working, an endurance test script streamed the input data (along with a representative sample of other data, to simulate a load comparable to the final system), checking each frame for consistency and counting LSB errors.

Servo Output Performance

The ARR platform controller has the ability to either pass through servo inputs, or take commands over the USB-serial link. Both of these needed to be verified in a comprehensive way. Fortunately, after the input verification step above, I already had a microcontroller which could sample 8 servo channels with high accuracy and report them back over a serial link! So I solved this by just dropping another ARR platform controller onto my breadboard. The pyunit test suite was extended to communicate with both devices, and monitored the outputs of the device under test using the second system. The final test harness block diagram is below:

20121217-helicopter-platform-test-block.png

Overall System Performance

Using this complete system, the endurance test script verifies not only input performance over time, but output performance over time as well. In one sample run, I got the following results:

 Frames captured: 150,000 (about 50 minutes)
 Input Results:
   Invalid Frames:       0      - 0.00%
   Values w/ 1 LSB off:  5,601  - 0.47%
   Skipped Frames:       0      - 0.00%
 Output Results:
   Invalid Frames:       0      - 0.00%
   Values w/ 1 LSB off:  58,000 - 4.82%
   Skipped Frames:       0      - 0.00%
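As a quick sanity check on how I read those percentages: each frame contributes one value per channel, so the denominator is 8× the frame count, and 150,000 frames at 50Hz is indeed 3,000 seconds, about 50 minutes. (This framing is mine; the counts above look rounded.)

```c
/* Error rate as a percentage of sampled values, with 8 channel
 * values contributed per frame. */
static double error_pct(double bad_values, double frames) {
    return 100.0 * bad_values / (frames * 8.0);
}
```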

These results were taken while the platform controller was passing input values through to the output and streaming a full complement of I2C data at 100Hz. They are about as expected: sampling a single servo value results in a 1 LSB error about 0.47% of the time, and passing that channel through to the output and sampling it a second time results in a 1 LSB error about 4.82% of the time. Given that 1 LSB of error will not even be noticeable in this application, these results are just fine. Finally, for posterity, I’ve included an annotated picture of the completed test harness below:

20121217-arr-platform-test.jpg

3D Printed Cookie Cutters

For my nieces this holiday season, in addition to actual cookies, I printed up some customized cookie cutters on the Artisan’s Asylum 3D printer (a Stratasys uPrint SE Plus).

Inkscape

20121213-emma-inkscape

The toolchain I used could be applied to a number of 3D projects. First, I either found an image roughly resembling what I had in mind using google images, or drew up a sketch on a piece of paper. Then, I transcribed that image into an inkscape vector drawing, similar to the elf one on the right. The inkscape drawing contained a closed shape for the outer dimensions of the part, one for the inner dimensions, as well as closed shapes for any surface features that I wanted. I used the “Linked Offset” feature to force the inner wall boundary to be a precise distance away from the outer wall boundary. Colors were chosen arbitrarily, as the next step ignores the fill colors entirely.

Freecad

20121213-lilah-freecad.png

Next, I fired up freecad, which can import SVG elements as geometry primitives in the 3D view. Unfortunately, and what was to become the biggest annoyance of this project, its import of SVG paths isn’t particularly robust. Notably, for some elements it doesn’t close them properly, and for others it doesn’t even turn them into curves, instead importing them as sets of points. This was done in a hurry, and while I didn’t have enough time to actually fix the problems in freecad, I did dig around in the source enough to figure out that it was not correctly handling paths which end at the exact same position as their first point. One problem was that freecad would only count a path as closed if the “z” element was used to close it off. Another was that paths with kinks would just not close, with no indication why, even if the kinks were too small to be visible. So, my workaround was to manually edit the .svg files in emacs after inkscape saved them, fiddling with them until freecad imported them as closed surfaces. For the paths that still didn’t work, I looked extra close in inkscape for any kinked paths. In this project, those largely resulted from inkscape’s linked offset paths being glitchy around regions of high curvature.

With those surfaces imported, I then proceeded to do a series of extrusions, differences, and unions to get the parts that I was looking for. In some cases, when I ran into limitations of freecad’s boolean operation engine, I had to go back to inkscape to tweak the artwork. This was largely around objects which were intended to share a border, which didn’t work out so well.

netfabb

After getting the solid models into good shape in freecad, I exported an .STL file for each model. I pulled this .STL file into netfabb studio basic to verify the volume and to do mesh repair. Sometimes freecad will export .STL files that netfabb doesn’t complain about, but I figure it doesn’t hurt to let it fix up any problems it finds.

uPrint 3D Printer

The final step is printing. This is largely uneventful: feed in the STL files, configure the print job, hit print, and come back in a couple of hours. Usually, when printing a part, you can count on the first iteration to have some problems, and this case was no exception. I had designed the cookie cutter wall thickness to be 1.25mm wide, figuring that would be 5 passes of the uPrint. However, the uPrint ended up not actually filling the inside of the wall in many places, resulting in two very thin walls separated by a small void. Given more time I would have iterated on the design, but in this case I was out, so I moved onward with what I had!

Baking!

So, the final test, using them to cut cookies… Well… They mostly worked. The separated outer walls caused a lot of cookie material to get wedged up inside. I also realized at this point why most cookie cutters have an exposed central area. Without one, extracting the cookies is quite challenging. I painstakingly used chopsticks and a knife, which worked adequately, if with great effort. Certainly, if I were to make a second revision, I would fix both the separated wall problem, and make the cookies easier to eject afterwards.

Below is a picture of the final 3 parts before being gummed up with a season’s worth of cookie making.

20121213-cookie-cutter-3dprint.jpg

ARR: Platform Controller

The first aspect of my autonomous helicopter project I tackled was to validate the feasibility of an AVR microcontroller based low level platform controller (more specifically, the AT90USB1286). The platform controller needs to provide the real time interface between the radio receiver, the servos, the primary flight computer, as well as a couple of additional sensors. More specifically, the platform controller has a couple of responsibilities and/or requirements:

  • Receive up to 8 channels of servo commands from the radio receiver.
  • Emit up to 8 channels of servo commands to the various servos and motors on the helicopter.
  • Optionally allow the receiver commands to pass through to the output, or alternately put the outputs under computer control.
  • Communicate with a variety of I2C devices, notably the inertial sensors and a barometer.
  • Expose all of these sensors to the flight computer over a full speed USB connection.

A block diagram showing the interconnections in the current design is below:

20121209-helicopter-platform-control-block.png

The servo command manipulation is particularly challenging. As a refresher, most RC servos are controlled with a pulse width modulated signal, where a pulse of between 1ms and 2ms is emitted at 50Hz. 1.5ms commands the center position, and the two extremes are at 1ms and 2ms. The Spektrum receiver on my Blade 450 claims 2048 steps (11 bits) of resolution. That indicates that each least significant bit (LSB) of command corresponds to (2ms - 1ms) / 2048 = 488ns. Given that most AVRs have a maximum clock rate of 20MHz, that works out to just under 10 AVR clock cycles per LSB. In general, sampling an input with that level of accuracy would require a hardware capture register, but no AVR has 8 separate inputs which can trigger captures.
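A quick check of that arithmetic (constants only, nothing hardware-specific):

```c
/* Servo command resolution: a 1 ms span divided into 2048 steps,
 * measured against a 20 MHz (50 ns/cycle) AVR clock. */
static double ns_per_lsb(void)     { return 1.0e6 / 2048.0; }      /* ~488 ns */
static double cycles_per_lsb(void) { return ns_per_lsb() / 50.0; } /* ~9.8    */
```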

Input Approach 1: Analog Comparator and Timer1 Capture

I explored two mechanisms for getting around that shortcoming. First, while the AVR does not have 8 separate inputs which can simultaneously be connected to a capture register, it can be configured to let the analog comparator trigger captures. The analog comparator, in turn, can be driven by the analog multiplexer, which on the AT90USB conveniently has 8 inputs. The downside is that only one input can be selected at a time. This isn’t necessarily a showstopper, as most RC receivers drive the servo commands one after another in time: channel 1 is pulsed, then channel 2, channel 3, and so on. Then when the next 50Hz cycle begins, the sequence starts afresh.

My experiments with this approach were mixed. The accuracy was very good: since the capture was done by the AVR hardware, I could sample at the full 20MHz when each pulse started and stopped. There were downsides though. For one, the receiver servo outputs had to be connected in exactly the right order. If they weren’t, the wrong pin would be selected while it was being pulsed. While the pin ordering will be fixed in this application, the firmware may be useful in a more general sense later, and the need to whip out a scope to find out which order the pulses are emitted in could be a little burdensome. Second, and more critically, all of the processing for this approach was done in the main loop, including switching which input pin was being sampled. This was done to minimize the amount of time with interrupts disabled, in order to support accurate servo pulse outputs. Because of this, occasionally the main loop would not complete a full iteration between when pulse N stopped and pulse N+1 started, and the pulse would be missed, along with the rest of the pulses that cycle. Similarly, it requires that disconnected channels be placed all together at the end. While these limitations may not be showstoppers, I wanted to see if I could do better.
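For reference, the configuration this approach implies — routing the mux through the comparator into Timer1’s input capture unit — might look roughly like the sketch below. This is my reconstruction from the datasheet, not code from the post, and it is untested here. Note that the servo signal lands on the comparator’s negative (mux) input against a reference on AIN0, so the comparator inverts the sense and the edge polarity must be flipped accordingly.

```c
#include <avr/io.h>

/* Select one servo input on the ADC mux and route the analog
 * comparator's output to the Timer1 input capture unit.
 * A datasheet-based sketch, not the post's actual code. */
static void comparator_capture_setup(uint8_t channel) {
    ADCSRA &= (uint8_t)~(1 << ADEN);  /* ADC off, so the mux can feed the comparator */
    ADCSRB |= (1 << ACME);            /* comparator negative input comes from the mux */
    ADMUX = channel & 0x07;           /* pick which servo input to watch */
    ACSR |= (1 << ACIC);              /* comparator output triggers Timer1 capture */
    TCCR1B |= (1 << ICES1);           /* capture edge select */
    TIMSK1 |= (1 << ICIE1);           /* enable the capture interrupt */
}
```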

Input Approach 2: Pin Change Interrupt

The second approach was to sample the servo inputs by placing them on the AVR’s 8 bit pin change interrupt port and enabling that interrupt. The interrupt handler samples the current value of a timer, re-enables interrupts, then samples the state of the input port. Both values are stored into a ring buffer for consumption by the main loop. This approach requires about 16 instructions in the interrupt handler before interrupts are re-enabled, which could introduce ~1 LSB of jitter into the output pulses if the timing is unfortunate. However, it can handle servos connected in any order, and can handle any input being disconnected. This approach seemed relatively promising, and my experiments proved it out with roughly the above properties.
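The main-loop half of this scheme — draining the ring buffer of (timer, port) samples and turning edges into pulse widths — might look like the following host-testable sketch (the struct and names are mine, not from the post):

```c
#include <stdint.h>

/* One ring buffer entry as captured by the pin change ISR:
 * a timer snapshot plus the 8-bit input port state. */
typedef struct { uint16_t timer; uint8_t port; } sample_t;

/* Walk a batch of samples, recording rising-edge times and emitting
 * pulse widths on falling edges.  uint16_t subtraction handles timer
 * wraparound for pulses shorter than one full timer period. */
static void decode_pulses(const sample_t *buf, int n,
                          uint16_t rise[8], uint16_t width[8]) {
    uint8_t prev = buf[0].port;
    for (int k = 1; k < n; k++) {
        uint8_t changed = (uint8_t)(buf[k].port ^ prev);
        for (int i = 0; i < 8; i++) {
            if (!(changed & (1u << i)))
                continue;
            if (buf[k].port & (1u << i))
                rise[i] = buf[k].timer;                        /* rising edge  */
            else
                width[i] = (uint16_t)(buf[k].timer - rise[i]); /* falling edge */
        }
        prev = buf[k].port;
    }
}
```

In the real firmware the ISR writes these entries and the main loop drains them incrementally; here the batch is just an array.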

Servo Outputs

Next, the AVR needed to generate the 8 channels of output in an accurate way. The technique that ended up working well for me was to, in the main loop, start a servo pulse output and begin a countdown timer in a small critical section protected with interrupts disabled for 3 instructions total. Then, the interrupt handler is hand coded in the minimal amount of assembly to zero out all the outputs on the given port. gcc by default, if you write:

ISR(TIMER3_COMPA_vect) {
  PORTC = 0x00;
}

Will generate 12 instructions of prologue and code: saving the status register, r0, and r1, xoring r0 with itself to generate a 0, then storing r0 to PORTC, and finally restoring everything. What I ended up using instead was:

ISR(TIMER3_COMPA_vect, ISR_NAKED) {
  asm("push r16"     "\n\t"
      "ldi r16, 0"   "\n\t"
      "out %0, r16"  "\n\t"
      "pop r16"      "\n\t"
      "reti"         "\n\t" :: "M" (_SFR_IO_ADDR(PORTC)));
}

Notably, there is an AVR instruction (ldi) to directly load an immediate into a register without affecting the status register, so the status register doesn’t need to be saved. Also, only one temporary register is required, whereas gcc prepares two for no good reason. This hand assembly requires a mere 5 instructions, significantly reducing the jitter coupled into the input sampling loop.

Next Steps

Next time, I’ll look at how I stress tested this solution to verify that it would actually work over a long period of time and through all the required scenarios.