Tag Archives: arr

Autonomous Racing Rotorcraft: Camera Signal Integrity

20130209-igep-com-pro-ov9650-adapter.png
IGEP COM Pro to TechToys OV9650 Adapter

Last time working with the racing helicopter’s camera system, I managed to capture a poor quality image from the camera. My next step was to diagnose the image quality problems. Before I could do so, however, I ran into another problem: signal integrity in my breadboard setup.

The TI DM3730’s ISP with the 3.6 kernel is quite sensitive to the quality of the pixel clock and vertical sync. If a single frame is detected with the wrong number of pixels, the driver does not seem to be able to recover. This was a big problem for me, as my breadboard setup ran many of the 12MHz signal lines over 24 gauge wire strung loosely around a breadboard. I was only intermittently able to get the ISP to capture data, and eventually it reached the point where I could not capture a single frame no matter how much wire wiggling I attempted.

Rather than spending a large amount of time trying to tie up my breadboard wires just so, I instead printed up a simple adapter board which contains all the level translation and keeps the signal paths short. This time, I tried printing it at oshpark.com, a competitor to batchpcb. The OV9650’s FFC connector has pads with a 7.8mil spacing, and batchpcb only supports 8.1mil, while oshpark has 6mil design rules. They also claim to ship faster and are slightly cheaper.

The results were pretty good. From start to finish, it took 14 days to arrive, and the 3 boards appeared to have no major defects. My art had one minor error which required rework: the output enable pins on the level converters were tied to the wrong polarity, hence the lifted pins and blue wiring. Despite that, it appears to be working as intended.

20130209-igep-com-pro-ov9650-adapter-final.jpg
Final OV9650 adapter attached to IGEP COM Pro

Autonomous Racing Rotorcraft: Camera Driver

In the last post on my autonomous racing helicopter I managed to get the OV9650 camera communicating in a most basic way over the I2C bus. In order to actually capture images from it, the next step is to get a linux kernel driver going which can connect up the DM3730’s ISP (Image Signal Processor) hardware and route the resulting data into user space.

After some preliminary investigation, I found the first major problem. In the linux tree as of 3.6, there is a convenient abstraction in the video4linux2 (V4L2) universe for systems on a chip, where the video device as a whole represents the entire video capture chain, and each new “sensor” only needs a subdevice created for it. There are in fact a lot of sensor drivers already in the mainline kernel tree, including one for the closely related OV9640 camera sensor. The downside, however, is that there are two competing frameworks that sensor drivers can be written to: the “soc-camera” framework and the raw video4linux2 subdevice API. Sensors written for these two frameworks are incompatible, and from what I’ve seen, each platform supports only one of the frameworks. Of course, the omap3 platform used in the DM3730 only supports the video4linux2 API, whereas all the similar sensors are written for the “soc-camera” framework!

Laurent Pinchart, a V4L2 developer, has been working on this effort some, but I had a hard time locating a canonical description of the current state of affairs. Possibly the closest thing to a summary of the soc-camera/v4l2 subdev situation can be found in this mailing list post:

Subject: Re: hacking MT9P031 for i.mx
From: Laurent Pinchart <laurent.pinchart@xxxxxxxxxxxxxxxx>
Date: Fri, 12 Oct 2012 15:11:09 +0200

...

soc-camera already uses v4l2_subdev, but requires soc-camera specific support
in the sensor drivers. I've started working on a fix for that some time ago,
some cleanup patches have reached mainline but I haven't been able to complete
the work yet due to lack of time.

--
Regards,

Laurent Pinchart

Since I don’t have a lot of need to upstream this work, I took the easy route and started with an existing v4l2 subdevice sensor driver, specifically the mt9v034 driver found in the ISEE git repository. I copied it and replaced its guts with those of the ov9640 “soc-camera” driver from the mainline kernel. After a number of iterations I had a driver that compiled and appeared to operate the chip.
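
For flavor, a heavily condensed sketch of what a raw v4l2 subdevice sensor driver looks like is below. This is not the actual driver source; the names are illustrative, and the register programming, format negotiation, and i2c_driver boilerplate are all omitted:

#include <linux/i2c.h>
#include <linux/module.h>
#include <linux/slab.h>
#include <media/v4l2-subdev.h>

struct ov9650_priv {
        struct v4l2_subdev sd;
};

static int ov9650_s_stream(struct v4l2_subdev *sd, int enable)
{
        /* Write the sensor's start/stop register sequence over I2C,
         * using the client from v4l2_get_subdevdata(sd). */
        return 0;
}

static const struct v4l2_subdev_video_ops ov9650_video_ops = {
        .s_stream = ov9650_s_stream,
};

static const struct v4l2_subdev_ops ov9650_ops = {
        .video = &ov9650_video_ops,
        /* .pad ops for format negotiation are also needed in practice. */
};

static int ov9650_probe(struct i2c_client *client,
                        const struct i2c_device_id *id)
{
        struct ov9650_priv *priv;

        priv = kzalloc(sizeof(*priv), GFP_KERNEL);
        if (!priv)
                return -ENOMEM;

        /* Register as a plain v4l2 subdevice, which is what the
         * omap3isp consumes, as opposed to a soc-camera sensor. */
        v4l2_i2c_subdev_init(&priv->sd, client, &ov9650_ops);
        return 0;
}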

To test, I have been using Laurent Pinchart’s media-ctl and yavta tools. The media controller framework provides a general way to configure the image pipelines on systems on a chip. With it, you can configure whether the sensor output flows through the preview engine or the scaler, and in what order, if any. yavta is a simple command line tool to set and query V4L2 controls and do simple frame captures.
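
In rough outline, a capture session looks something like the following. The entity names, format, and device nodes here are only illustrative; they vary by board and kernel, and “media-ctl -p” prints the real ones:

media-ctl -r
media-ctl -l '"ov9650 3-0030":0 -> "OMAP3 ISP CCDC":0 [1]'
media-ctl -l '"OMAP3 ISP CCDC":1 -> "OMAP3 ISP CCDC output":0 [1]'
media-ctl -V '"ov9650 3-0030":0 [UYVY 640x480]'
yavta -f UYVY -s 640x480 -n 4 --capture=4 -F /dev/video2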

One of the first images with correct coloring is below. Single frame capture out of the box was recognizable, but the quality was pretty poor. Also, as more frames were captured, the images became more and more washed out, with all the pixel values approaching the same bright gray color. That problem I will tackle in a later post.

20130116-ov9650-first-image.png
First Image from OV9650

Autonomous Racing Rotorcraft: Initial Camera Exploration: I2C Smoke Test

I am now far, far down the rabbit hole of trying to validate a camera for the low altitude altimetry of my prototype autonomous racing helicopter. In the last post I got to the point where I could build system images for my IGEP COM Module that included patches on top of the ISEE 3.6 linux kernel. The next step was to use that ability to turn on the clock at the TI DM3730’s external camera port.

First, what is the path by which a normal camera driver turns on the clock on the IGEP? To discover this, I traced backwards from ISEE’s board expansion file for the CAMR0010, simply because it was the easiest thread to start pulling. In the board expansion file, “exp-camr0010.c”, a function is defined specifically to configure the clock of the ISP (Image Signal Processor):

static void mt9v034_set_clock(struct v4l2_subdev *subdev, unsigned int rate)
{
        struct isp_device *isp = v4l2_dev_to_isp_device(subdev->v4l2_dev);

        isp->platform_cb.set_xclk(isp, rate, ISP_XCLK_A);
}

However, in the absence of a full camera driver, it was not entirely clear how to get hold of a “struct isp_device*” that could be used to configure the clock. To understand more, I traced the many layers this function is passed down through before leaving the board expansion source file (condensed in the sketch after this list):

  • mt9v034_platform_data: This structure was defined by ISEE and is exposed from the new mt9v034 driver.
  • i2c_board_info: The mt9v034_platform_data structure is passed into this one as the “.platform_data” member.
  • isp_subdev_i2c_board_info: The i2c_board_info structure is passed as the “.board_info” member of this structure.
  • isp_v4l2_subdevs_group_camera_subdevs: The board_info structure is passed in here as the “.subdevs” member.
  • isp_platform_data: The camera_subdevs member is passed in here as the “.subdevs” member.
  • omap3_init_camera: Finally, the platform_data structure is passed in here.
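
Condensed, with most field values elided, the nesting looks roughly like this (a sketch reconstructed from the shape of the board file, not verbatim source; the I2C address and adapter number are illustrative):

static struct mt9v034_platform_data mt9v034_pdata = {
        .set_clock = mt9v034_set_clock,
        /* ... */
};

static struct i2c_board_info mt9v034_i2c_board_info = {
        I2C_BOARD_INFO("mt9v034", 0x48),
        .platform_data = &mt9v034_pdata,
};

static struct isp_subdev_i2c_board_info camera_i2c_subdevs[] = {
        {
                .board_info = &mt9v034_i2c_board_info,
                .i2c_adapter_id = 2,
        },
        { NULL, 0 },
};

static struct isp_v4l2_subdevs_group camera_subdevs[] = {
        {
                .subdevs = camera_i2c_subdevs,
                .interface = ISP_INTERFACE_PARALLEL,
                /* ... parallel bus configuration ... */
        },
        { NULL, 0 },
};

static struct isp_platform_data isp_pdata = {
        .subdevs = camera_subdevs,
};

/* and finally, at board init time: */
omap3_init_camera(&isp_pdata);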

Eventually, this clock setting callback is stashed inside the mt9v034 driver, where it is invoked in a couple of places. Yikes! I tried to backtrack this route to get an isp_device, but had no luck. What did end up working was grabbing the driver, and then the device, by name (error checking and the “match any” function omitted for clarity):

struct device_driver* isp_driver;
struct device* isp_device;
struct isp_device* isp;

/* Find the omap3isp platform driver by name, then the first device
   bound to it, then that device's driver data. */
isp_driver = driver_find("omap3isp", &platform_bus_type);
isp_device = driver_find_device(isp_driver, NULL, NULL, match_any);
isp = dev_get_drvdata(isp_device);

I then exposed this functionality through a simple debugfs entry that appears at /sys/kernel/debug/arr_debug/xclka_freq (when debugfs is mounted, of course). With that in place, I was able to write frequencies from the command line and get the external clock to run at any frequency I chose. Yay!
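
The debugfs hook itself is only a few lines. A minimal sketch, assuming the lookup code above (the entry and directory names match the post; the rest is from memory rather than the actual patch):

#include <linux/debugfs.h>
#include <linux/kernel.h>
#include <linux/module.h>

static ssize_t xclka_freq_write(struct file *file, const char __user *buf,
                                size_t count, loff_t *ppos)
{
        struct isp_device *isp = file->private_data;
        unsigned int freq;
        int ret;

        ret = kstrtouint_from_user(buf, count, 10, &freq);
        if (ret)
                return ret;

        isp->platform_cb.set_xclk(isp, freq, ISP_XCLK_A);
        return count;
}

static const struct file_operations xclka_freq_fops = {
        .owner = THIS_MODULE,
        .open = simple_open,    /* copies inode->i_private to file->private_data */
        .write = xclka_freq_write,
};

/* At init time, after the isp pointer has been obtained: */
struct dentry *dir = debugfs_create_dir("arr_debug", NULL);
debugfs_create_file("xclka_freq", 0200, dir, isp, &xclka_freq_fops);

With that in place, “echo 24000000 > /sys/kernel/debug/arr_debug/xclka_freq” runs the clock at 24MHz.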

There was one final piece to the puzzle before I could claim the camera was functional. The OV9650, while electrically compatible with I2C, is not an SMBus device. The standard linux command line tools, i2cget and friends, were not able to drive the camera in a useful way. To get over the final hurdle, I wrote a simple user-space C program which opens “/dev/i2c-3”, sets the slave address using the I2C_SLAVE ioctl, and then uses the bare “read” and “write” APIs to send and receive bytes of data. With this, I was able to extract the chip’s product identifier: 0x9652! I guess it is likely a subsequent revision of the 9650.
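
The program boils down to something like this sketch (the 0x30 slave address and the 0x0A/0x0B product ID registers are my reading of the OV9650 datasheet, so treat them as assumptions):

#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/i2c-dev.h>

int main(void)
{
        uint8_t reg, pid, ver;
        int fd = open("/dev/i2c-3", O_RDWR);

        if (fd < 0 || ioctl(fd, I2C_SLAVE, 0x30) < 0)
                return 1;

        /* The OV9650 speaks SCCB, which does not care for SMBus-style
         * repeated starts, so write the register number and read the
         * value back as two separate bus transactions. */
        reg = 0x0a;                     /* PID: product ID high byte */
        write(fd, &reg, 1);
        read(fd, &pid, 1);

        reg = 0x0b;                     /* VER: product ID low byte */
        write(fd, &reg, 1);
        read(fd, &ver, 1);

        printf("product id: 0x%02x%02x\n", pid, ver);
        close(fd);
        return 0;
}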

Autonomous Racing Rotorcraft: Initial Camera Exploration: System Image

In my previous post I described how I wired up an OV9650 based camera module to my IGEP COM Module. The ultimate goal of course is a low cost, reliable altimeter for my autonomous racing helicopter. When we left off, I had the camera breadboarded up and connected to the IGEP COM Module, and I wanted to drive it over the I2C interface to verify it was working. While the IGEP COM Module’s camera I2C interface is easily exposed to linux user space, the OV9650 doesn’t respond to I2C commands without a clock present. Turning on the clock turned out to be quite a challenge.

First I explored the possibility of existing user space tools which might be able to poke directly into /dev/mem and activate the clock. Promisingly, I found omapconf, a user space tool which claimed to provide complete control over the clock systems of TI’s OMAP chips. Unfortunately, it only operated on OMAP4 and the unreleased OMAP5, not the OMAP3 that is in the IGEP COM Module. The source code wasn’t particularly helpful for quick inspection either. There is a lot of complexity in manipulating the processor’s clock registers, and implementing it from the datasheet didn’t seem like a particularly fruitful use of time. Thus discouraged, I moved on to the next possibility, kernel mode operations.

The linux kernel for the IGEP COM Module already had routines designed to manipulate the clock registers, so why not use those? Well, to start with, building a custom kernel (and suitable system image) for this board was hardly well documented. After much consternation, and then much patience, I pieced together a build setup from various IGEP wiki pages that allowed me to mostly reproduce the image that came on my board. It consisted of the following steps:

  • Yocto: A tool for creating embedded linux distributions. It is based on OpenEmbedded recipe files, and includes reference layers for building a distribution.
    git clone -b denzil git://git.yoctoproject.org/poky
    
  • ISEE Layer: ISEE publishes a yocto layer for their boards which includes recipes for a few packages that don’t yet exist in yocto.
    git clone -b denzil git://git.isee.biz/pub/scm/meta-isee.git
    

Once the relevant git trees were cloned, you still have to generate a build/ directory using the “oe-init-build-env” script, and modify files in that directory to configure the build. The How to get the Poky Linux distribution wiki page on the ISEE website had some hints there as well.
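
In rough outline, the sequence looks like this (the MACHINE name and image target are my guesses for the denzil-era setup; the wiki page has the authoritative values):

source poky/oe-init-build-env build
# register the ISEE layer in build/conf/bblayers.conf:
#   BBLAYERS += "/path/to/meta-isee"
# select the board in build/conf/local.conf:
#   MACHINE = "igep00x0"
bitbake core-image-minimal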

With this configured, I was able to get a system image that closely matched the one that came with my board. In addition, I added a local layer to hold my modifications, then proceeded to switch to the as-yet-unreleased ISEE 3.6 kernel, which has support for the ISEE camera board. My supposition is that the OV9650 camera I’m working with is close enough to the mt9v034 on the CAM BIRD that I will run into many fewer problems using their kernel. For my local development, I cloned the ISEE kernel repository and pointed my local layer’s linux kernel recipe at the local git repository. This allows me to build images with a custom patched kernel. Now I am truly able to move on to the next step: driving the clock!

ARR: Initial Camera Exploration: Wiring

What I expected to be the hardest hardware portion of the ARR has not disappointed — the camera. To recap, I am in the process of designing and building an autonomous helicopter. As part of that, I wanted to use a vision system combined with a laser pointer to provide low cost altimetry during takeoff and landing. The IGEP COM Module has a dedicated camera port with support for both camera and serial interfaces, and there are a large number of cheap camera-phone modules out there which should be sufficient. How hard can it be?

I started with a TCM8230MD from SparkFun, for which I simultaneously ordered a breakout board from some random online retailer. Unfortunately, the breakout board never came. I tried using some 26 gauge wire I had around and a magnifying glass to solder up a by-hand breakout board, to no avail. Finally, I broke down and found http://sigalrm.blogspot.com/2011/03/tcm8230md-breakout.html, where the author posted eagle files for a breakout board. I dutifully submitted them to batchpcb and waited.

20121222-techtoys-ov9650.jpg
TechToys OV9650 Module

While I was waiting for batchpcb, I simultaneously ordered an OV9650 based board from techtoys, who also sold a breakout (pictured right). While the breakout was on 1mm spacing instead of 0.1in spacing, it was still a lot easier to experiment with. The Hong Kong post must have gotten remarkably good, because this development board arrived long before the batchpcb breakout did, so I started integrating it. This board operates at 3.3V, while the TI DM3730 in the IGEP COM Pro uses 1.8V for all of its external IO. To mediate between them for the logic level signals, I used a bank of SN74LVC245ANs from digikey, along with 2 discrete N-FETs to shift the bidirectional I2C signals as described in this application note from NXP.

The next challenging step was getting the camera port on the IGEP COM Pro into a usable form. It has a 27 pin 0.3mm FFC style connector, which, if you haven’t seen one before, is tiny! To make this work, I got a 4 inch passthrough cable and mating connector from digikey, and this 0.3mm FFC breakout from proto-advantage.com. I soldered the FFC connector down to the breakout using my hot air rework station, but should have used an iron and solder-wick instead, as the plastic on the connector melted and bubbled up a bit during soldering. Fortunately, it still worked, so I didn’t need my spare.

20121222-arr-ov9650-breadboard.jpg

At this point, I had the camera and IGEP COM Module nominally wired up in a way that should be usable. The final breadboard is pictured above. To make progress from here, I wanted to try a smoke test: communicate with the camera over its I2C interface to verify that it is powered and working. However, I quickly discovered that it wouldn’t do anything unless it was supplied with a valid clock. This turned into my next odyssey — trying to enable the clock output on the TI DM3730 in the IGEP COM Module. That is involved enough to be the subject of a later post.

ARR: Platform Controller Testing

This is a followup to my last post discussing the initial platform controller feasibility experiments for the ARR, my autonomous helicopter.

We left off last time with a couple of candidate designs for the platform controller and some rough requirements that it needed to meet. What I wanted to do was devise a test procedure that could verify all of the functional as well as the performance requirements, preferably over a long period of time.

Basic Functionality Testing

First, the controller has a moderate amount of relatively performance-insensitive functionality that could potentially regress. This includes things like reading GPIO inputs, configuring which inputs to stream, and communicating over the I2C bus. For these, I created a pyunit test suite that communicates with the device over its USB-serial interface using pyserial. The suite’s setup function forces the platform controller to reset, so each case starts from a known state. From there, I created maybe a dozen tests which exercise each piece of functionality and verify that it works as expected.

Input Servo Pattern Generation

Next, the platform controller needs to be able to sample 8 servo channels with sub-microsecond precision. To verify this, I needed to generate 8 channels of data which were both a) easily verifiable as correct, and b) likely to expose the most probable causes of performance problems. My eventual solution was a very simple AVR program, running on a separate microcontroller using delay loops, which emits 8 channels of servo data with pseudorandom values generated by a known linear feedback shift register (LFSR) equation. I used a 16 bit LFSR, emitting only the upper 11 bits as the pulse width. Each channel was exactly one LFSR step behind the previous one in time, so for a given 8 pulse cycle, you will have seen 8 successive values of the LFSR. This made it pretty easy for the receiver to figure out what the remaining 5 bits should have been, even with errors of several LSB (least significant bits) on several of the channels. For example, a few cycles of this pseudorandom data look like:

#        Channel pulse width in hex.
# Cycle   1   2   3   4   5   6   7   8
   1     370 1B8 0DC 46E 237 11B 48D 246
   2     1B8 0DC 46E 237 11B 48D 246 123
   3     0DC 46E 237 11B 48D 246 123 491
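
For illustration, one step of a maximal-length 16 bit LFSR is only a few lines of C. The taps below (16, 14, 13, 11) are the textbook maximal-length choice, not necessarily the equation the test actually used:

#include <stdint.h>

/* Advance a 16 bit Galois LFSR by one step. */
static uint16_t lfsr_step(uint16_t state)
{
        uint16_t lsb = state & 1u;

        state >>= 1;
        if (lsb)
                state ^= 0xb400u;       /* taps 16, 14, 13, 11 */
        return state;
}

/* The emitted pulse width is the upper 11 bits of the state. */
static uint16_t pulse_width(uint16_t state)
{
        return state >> 5;
}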

This approach has a couple of benefits:

  • Each 50Hz cycle is independently verifiable. All you need to do is scan through all possible “hidden” 5 bits and see if any result in a close match across all channels. You can also see just how much error there is on each channel.
  • Every channel changes drastically at each cycle. Some problems could be masked by slowly time-varying inputs; since the input changes randomly at every cycle, this isn’t a problem here.
  • The emitter can run open loop. No test harness communication is needed with the servo control emitter whatsoever. It just emits the random 8 channels of servo data indefinitely.

I wired up this separate AVR to the platform controller on a breadboard, and first added a few simple tests to the pyunit test suite that checked just a couple of cycles to verify basic consistency. Once that was working, an endurance test script streamed the input data (along with a representative sample of other data, to simulate a load comparable to the final system), checking each frame for consistency and counting the LSB errors.

Servo Output Performance

The ARR platform controller has the ability to either pass through servo inputs, or take commands over the USB-serial link. Both of these needed to be verified in a comprehensive way. Fortunately, after the input verification step above, I already had a microcontroller which could sample 8 servo channels with high accuracy and report them back over a serial link! So I solved this by just dropping another ARR platform controller onto my breadboard. The pyunit test suite was extended to communicate with both devices, and monitored the outputs of the device under test using the second system. The final test harness block diagram is below:

20121217-helicopter-platform-test-block.png

Overall System Performance

Using this complete system, the endurance test script verifies not only input performance over time, but output performance over time as well. In one sample run, I got the following results:

 Frames captured: 150,000 (about 50 minutes)
 Input Results:
   Invalid Frames:       0      - 0.00%
   Values w/ 1 LSB off:  5,601  - 0.47%
   Skipped Frames:       0      - 0.00%
 Output Results:
   Invalid Frames:       0      - 0.00%
   Values w/ 1 LSB off:  58,000 - 4.82%
   Skipped Frames:       0      - 0.00%

These results were taken while the platform controller was passing input values through to the output and streaming a full complement of I2C data at 100Hz. They are about as expected: sampling a single servo value results in a 1 LSB error about 0.47% of the time, and passing that channel through to the output and sampling it a second time results in a 1 LSB error about 4.82% of the time. Given that 1 LSB of error will not even be noticeable in this application, these results are just fine. Finally, for posterity, I’ve included an annotated picture of the completed test harness below:

20121217-arr-platform-test.jpg

ARR: Platform Controller

The first aspect of my autonomous helicopter project I tackled was validating the feasibility of an AVR microcontroller based low level platform controller (specifically, the AT90USB1286). The platform controller provides the real time interface between the radio receiver, the servos, the primary flight computer, and a couple of additional sensors. More specifically, the platform controller has a number of responsibilities and requirements:

  • Receive up to 8 channels of servo commands from the radio receiver.
  • Emit up to 8 channels of servo commands to the various servos and motors on the helicopter.
  • Optionally allow the receiver commands to pass through to the output, or alternatively put the outputs under computer control.
  • Communicate with a variety of I2C devices, notably the inertial sensors and a barometer.
  • Expose all of these sensors to the flight computer over a full speed USB connection.

A block diagram showing the interconnections in the current design is below:

20121209-helicopter-platform-control-block.png

The servo command manipulation is particularly challenging. As a refresher, most RC servos are controlled with a pulse width modulated signal, where a pulse of between 1ms and 2ms is emitted at 50Hz. 1.5ms commands the center position, and the two extremes are at 1ms and 2ms. The Spektrum receiver on my Blade 450 claims 2048 steps (11 bits) of resolution. That means each least significant bit (LSB) of command corresponds to (2ms – 1ms) / 2048 = 488ns. Given that most AVRs have a maximum clock rate of 20MHz, that works out to just under 10 AVR clock cycles per LSB. In general, sampling an input with that level of accuracy would require a hardware capture register, but no AVR has 8 separate inputs which can trigger captures.

Input Approach 1: Analog Comparator and Timer1 Capture

I explored two mechanisms for getting around that shortcoming. First, while the AVR does not have 8 separate inputs which can simultaneously be connected to a capture register, it can be configured to let the analog comparator trigger captures. The analog comparator, in turn, can be driven by the analog multiplexer, which on the AT90USB conveniently has 8 inputs. The downside is that only one input can be selected at a time. This isn’t necessarily a showstopper, as most RC receivers actually drive the servo commands one after another in time: channel 1 is pulsed, then channel 2, channel 3, and so on. When the time for another 50Hz cycle begins, the sequence starts afresh.
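
Configuring that path is just a handful of register writes. A minimal sketch for the AT90USB1286, using register and bit names from the datasheet (edge selection and the capture interrupt handler are omitted, and AIN0 is assumed to be biased to a suitable threshold voltage):

#include <avr/io.h>

/* Route servo input 'channel' (0-7) through the ADC multiplexer to the
   analog comparator, and let the comparator trigger Timer1 captures. */
static void select_capture_channel(uint8_t channel)
{
  ADCSRA &= ~(1 << ADEN);   /* ADC off, so the mux feeds the comparator */
  ADCSRB |= (1 << ACME);    /* comparator negative input comes from the mux */
  ADMUX = channel & 0x07;   /* select one of the 8 inputs */
  ACSR |= (1 << ACIC);      /* comparator output drives Timer1 input capture */
  TCCR1B |= (1 << CS10);    /* run Timer1 at F_CPU for 50ns resolution */
}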

My experiments with this approach were mixed. The accuracy was very good: since the capture was done by the AVR hardware, I could sample when each pulse started and stopped with 20MHz precision. There were downsides though. For one, the receiver servo outputs had to be connected in exactly the correct order; if they weren’t, the wrong pin would be selected while it was being pulsed. While the pin ordering will be fixed in this application, the firmware may be useful in a more general sense later, and the need to whip out a scope to find out which order the pulses are emitted in could be a little burdensome. Second, and more critically, all of the processing for this approach was done in a main loop, including switching which input pin was being sampled. This was done to minimize the amount of time with interrupts disabled, in order to support accurate servo pulse outputs. Because of this, occasionally the main loop would not complete a full cycle between when pulse N stopped and pulse N+1 started, and that pulse would be missed, along with the rest of the pulses that cycle. Similarly, it requires that disconnected channels be placed altogether at the end. While these limitations may not be showstoppers, I wanted to see if I could do better.

Input Approach 2: Pin Change Interrupt

The second approach was to sample the servo inputs by placing them on the AVR’s 8 bit pin change interrupt port and enabling that interrupt. The interrupt handler samples the current value of a timer, re-enables interrupts, then samples the state of the input port. Both values are stored into a ring buffer for consumption by the main loop. This approach requires about 16 instructions to execute in the interrupt handler before interrupts are re-enabled, which can introduce ~1 LSB of jitter into the output pulses if the timing is unfortunate. However, it can handle servos connected in any order and can tolerate any input being disconnected. This approach seemed relatively promising, and my experiments bore it out with roughly the properties described above.
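
In sketch form, assuming the AT90USB1286 (where PCINT0–7 live on PORTB), the handler looks roughly like this; the main loop consumer and interrupt setup are omitted:

#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

#define RING_SIZE 32  /* power of two, so wraparound is a mask */

static volatile struct {
  uint16_t time;
  uint8_t pins;
} ring[RING_SIZE];
static volatile uint8_t ring_head;

ISR(PCINT0_vect) {
  uint16_t now = TCNT1;      /* timestamp first, where jitter matters most */
  uint8_t head = ring_head;  /* claim a slot before re-enabling interrupts */
  ring_head = (head + 1) & (RING_SIZE - 1);
  sei();                     /* let the servo output compare ISR preempt us */
  ring[head].time = now;
  ring[head].pins = PINB;    /* which inputs are currently high */
}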

Servo Outputs

Next, the AVR needed to generate the 8 channels of output in an accurate way. The technique that ended up working well for me was to, in the main loop, start a servo pulse output and begin a countdown timer inside a small critical section with interrupts disabled for 3 instructions total. The interrupt handler is then hand coded in the minimal amount of assembly needed to zero out all the outputs on the given port. By default, if you write:

ISR(TIMER3_COMPA_vect) {
  PORTC = 0x00;
}

gcc will generate 12 instructions of prologue and code: it saves the status register, r0, and r1, xors r0 with itself to generate a 0, stores r0 to PORTC, and finally restores everything. What I ended up using was:

ISR(TIMER3_COMPA_vect, ISR_NAKED) {
  asm("push r16"     "\n\t"
      "ldi r16, 0"   "\n\t"
      "out %0, r16"  "\n\t"
      "pop r16"      "\n\t"
      "reti"         "\n\t" :: "M" (_SFR_IO_ADDR(PORTC)));
}

Notably, AVR has an instruction (ldi) that loads an immediate directly into a register without affecting the status register, so the status register doesn’t need to be saved. Also, only one temporary register is required, whereas gcc prepares two for no good reason. This hand assembly requires a mere 5 instructions, significantly reducing the jitter coupled into the input sampling loop.

Next Steps

Next time, I’ll look at how I stress tested this solution to verify that it would actually work over a long period of time and through all the required scenarios.