Another of the tasks I’ve set for myself with regard to future Mech Warfare competitions is redesigning the turret. The previous turret I built had some novel technical features, such as active inertial gimbal stabilization and automatic optical target tracking; however, it had some problems too. The biggest one for my purposes now was that it still used the old RS485 based protocol and not the new CAN-FD based one. Second, the turret had some dynamic stability and rigidity issues. The magazine consisted of an aluminum tube sticking out of the top, which made the entire thing very top heavy. The 3d printed fork is the same one I had made at Shapeways 5 years ago. It is amazingly flexible in the lateral direction, which results in a lot of undesired oscillation if the base platform isn’t perfectly stable. I’ve learned a lot about 3d printing and mechanical design in the meantime (but of course still have a seemingly infinite amount more to learn!) and think I can do better. Finally, cable management between the top and bottom was always challenging. You want a large range of motion, but keeping power and data flowing between the two rotating sections was never easy.
My concept with this redesign is twofold: first, make the turret basically an entirely separate robot with no wires connecting it to the main robot, and second, use as many of the components from the quad A1 as I can to demonstrate their, well, flexibility. Thus, this turret will have a separate battery, power distribution board, raspberry pi, pi3 hat, and a moteus controller for each axis of motion. These are certainly overkill, but hey, the quad A1 can carry a lot of weight.
The unique bits will be a standalone FPV camera, another camera attached to the raspberry pi for target tracking, a targeting laser, and the AEG mechanism, including a new board to manage the firing and loading functions.
I’ve been developing a new bi-directional spread spectrum radio to command and control the mjbots quad robot. Here I’ll describe my first integration of the protocol into the robot.
To complete that integration, I took the library I had designed for the nrfusb, and ported it to run on the auxiliary controller of the pi3 hat. This controller also controls the IMU and an auxiliary CAN-FD bus. It is connected to one of the SPI buses on the raspberry pi. Here, it was just a matter of exposing an appropriate SPI protocol that would allow the raspberry pi to receive and transmit packets.
Slightly unfortunately, this version of the pi3 hat does not have interrupt lines for any of the STM32s. Thus, I created a multiplexed status register that the rpi can use to check which of the CAN, IMU, or RF has data pending. Then I slapped together a few registers which allow configuring the ID and reading and writing slots and their priorities.
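To make the idea concrete, here’s a minimal sketch of how such a status register could be decoded on the raspberry pi side. The bit assignments and names here are invented for illustration; they are not the actual pi3 hat register layout.

```python
# Invented bit assignments for illustration; not the actual pi3 hat
# register layout.
CAN_PENDING = 1 << 0  # a CAN-FD frame is waiting
IMU_PENDING = 1 << 1  # a new IMU sample is ready
RF_PENDING  = 1 << 2  # an RF slot has been received

def decode_status(status):
    """Split the multiplexed status byte into per-peripheral flags, so
    a single SPI register read tells the rpi what to service next."""
    return {
        'can': bool(status & CAN_PENDING),
        'imu': bool(status & IMU_PENDING),
        'rf': bool(status & RF_PENDING),
    }
```

One read of this register replaces the interrupt lines the board doesn’t have: the host just polls it and services whichever peripherals report data pending.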
Then I refactored things on the raspberry pi side so that one core keeps busy polling for one of those things to become available. So far, I’ve been locking the threads which access SPI to an isolcpus-reserved CPU to get improved SPI timing. Eventually, once I have interrupt lines, I might consolidate all of these down to a single core. That, plus defining an initial mapping between the controls and slots, resulted in:
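The CPU-locking piece is just standard Linux affinity control. A minimal sketch (pinning to CPU 0 here purely for illustration; on the robot it would be one of the cores reserved with isolcpus on the kernel command line):

```python
import os

def pin_to_cpu(cpu):
    """Restrict the calling process/thread to a single CPU (Linux only).
    Combined with isolcpus=<cpu> on the kernel command line, this keeps
    the scheduler from putting anything else on that core, which is
    what tightens up the SPI timing."""
    os.sched_setaffinity(0, {cpu})

pin_to_cpu(0)  # CPU 0 here just for illustration
```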
Finally, I created a very simple GL GUI application which connects to an nrfusb and a joystick. It uses Dear ImGui to render a few widgets and GLFW for windowing and joystick input.
While I was at it, I finally updated my joystick UI to make gait selection a bit faster, and got the robot to do a better job of switching out of the walk gait. Thus the following video showing all of that hooked together.
With a protocol design in hand, the next step was to go and implement it. My goal was to produce a library which would work on the nrfusb, and also on the auxiliary stm32g4 on the mjbots pi3 hat. In this first implementation pass however, I only worked with the nrfusb as both transmitter and receiver.
While developing this, I had more than my share of “huh” moments working from the datasheet and with the components. To begin with, the initial nrf24l01+ modules I got were all Chinese clone ones. While I was having problems getting auto acknowledgement to work, I discovered that the clones at a minimum were not compatible with genuine Nordic devices. Thus I reworked genuine parts into the modules I had:
That didn’t solve any of my immediate problems, but the subsequent modules I got all had genuine chips, so at least those were all mutually compatible.
The other more annoying problems are somewhat obvious in hindsight. For a transmitter to be able to successfully receive an automatic acknowledgment from a receiver, not only does the ID need to be configured in the appropriate RX_ADDR register, but EN_RXADDR also needs to have the correct bit set. I had assumed that was only required for slave devices as there was no mention of it in any of the Enhanced Shockburst flow charts or setup procedures for transmitters or auto acknowledgment.
The second annoyance was that when in receiver mode, switching channels seems to kinda work for some channels even with CE held high, but to be reliable you have to pull CE low and put the unit in standby mode while changing channels.
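Putting both gotchas together, here’s a rough sketch of the fixes, written against a hypothetical `nrf` driver object. The register addresses are from the nRF24L01+ datasheet; everything else, including the `FakeNrf` stand-in, is invented for illustration.

```python
# Register addresses from the nRF24L01+ datasheet; the driver interface
# and FakeNrf below are invented for illustration.
EN_RXADDR  = 0x02  # per-pipe receive enable
RF_CH      = 0x05  # RF channel
RX_ADDR_P0 = 0x0A  # pipe 0 receive address
TX_ADDR    = 0x10  # transmit address

def configure_tx_for_auto_ack(nrf, address):
    # To hear the auto-ack, the transmitter listens for it on pipe 0
    # at its own TX address...
    nrf.write_register(TX_ADDR, address)
    nrf.write_register(RX_ADDR_P0, address)
    # ...and, the gotcha: pipe 0 must *also* be enabled in EN_RXADDR,
    # even though this is the transmitting side.
    nrf.write_register(EN_RXADDR, nrf.read_register(EN_RXADDR) | 0x01)

def change_channel(nrf, channel):
    # Only reliable from standby: drop CE before touching RF_CH.
    nrf.ce(False)
    nrf.write_register(RF_CH, channel & 0x7F)
    nrf.ce(True)

class FakeNrf:
    """Minimal stand-in for a real SPI-attached module."""
    def __init__(self):
        self.regs = {}
        self.ce_state = True
    def read_register(self, reg):
        return self.regs.get(reg, 0)
    def write_register(self, reg, value):
        self.regs[reg] = value
    def ce(self, level):
        self.ce_state = level

nrf = FakeNrf()
configure_tx_for_auto_ack(nrf, b'\xe7\xe7\xe7\xe7\xe7')
change_channel(nrf, 76)
```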
With those problems (and some others) resolved, I have a reliable bidirectional link that is ultimately tweakable. Next I’ll integrate this into the quad A1 to actually control the robot and monitor its telemetry.
In order to bring up the final piece of the raspberry pi 3 hat, the nrf24l01+, I wanted a desktop development platform that would allow for system bringup and also be useful as a PC side transmitter. Thus, the nrfusb:
Similar to the fdcanusb, it is just an STM32G474 on the USB bus, although this has a pin header for a common nrf24l01+ form factor daughterboard.
The next steps here are to get this working at all, then implement a spread spectrum bidirectional protocol for control and telemetry.
After I initially assembled the new legs onto the chassis, I realized I had the geometry slightly off and there was some interference through part of the shoulder rotation. I made up new printed parts and replaced everything in front of the camera. Thus, watch some high speed robot surgery:
The quad A1’s first job is to validate the new moteus controller in the quadrupedal configuration, after which I’ll use it as the primary development platform to get all my gait work done.
Now that the IMU is functioning, my next step is to use that to produce an attitude estimate. Here, I dusted off my unscented Kalman filter based estimator from long ago, and adapted it slightly to run on an STM32. As before, I used a UKF instead of the more traditional EKF not because of its superior filtering performance, but because of the flexibility it allows with the process and measurement functions. Unlike the EKF, the UKF is purely numerical, so no derivation of Jacobians is necessary. It turns out that even an STM32 has plenty of processing power to do this for things like a 7 state attitude filter.
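The numerical property that makes this work is the unscented transform: sample sigma points around the current estimate, push each through the arbitrary nonlinear process or measurement function, and recombine. The sketch below uses the basic Julier sigma-point form; it is a simplified illustration, not the actual filter code, which also has to handle process noise, measurement updates, and quaternion states.

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=1.0):
    """Propagate (mean, cov) through an arbitrary nonlinear f using
    sigma points.  Purely numerical: no Jacobian of f is needed."""
    n = len(mean)
    root = np.linalg.cholesky((n + kappa) * cov)   # matrix square root
    sigmas = [mean] + [mean + c for c in root.T] + [mean - c for c in root.T]
    weights = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    weights[0] = kappa / (n + kappa)
    ys = np.array([f(s) for s in sigmas])
    new_mean = weights @ ys
    diffs = ys - new_mean
    new_cov = (weights[:, None] * diffs).T @ diffs
    return new_mean, new_cov

# Sanity check: an identity function must leave mean and covariance alone.
m, c = unscented_transform(np.zeros(3), 0.1 * np.eye(3), lambda x: x)
```

The cost per update is a Cholesky factorization plus 2n+1 evaluations of f, which is cheap enough at n=7 states that even an STM32 handles it comfortably.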
One problem I encountered was that by default I have been building everything for the STM32 with the “-Os” optimization level. Unfortunately, with Eigen linear algebra routines, that is roughly 4x slower than “-O3”. Doubly unfortunately, just using copts at the rule level or --copt on the command line didn’t work: bazel doesn’t let you control the order of command line arguments very well, and the -Os always ended up *after* any of the additional arguments I tried to use to override it. To get it to work, I had to navigate some bazel toolchain mysteries in rules_mbed in order to allow build rules to specify whether they optionally want the higher optimization instead of optimizing for size. I’m pretty sure this is not exactly what the with_features mechanism in the toolchain’s feature rule is for, but it let me create a feature called speedopt which turns on -O3 and turns off -Os. The final result is at rules_mbed/530fae6d8
To date, I’ve only done some very zeroth order performance optimization. I spent 15 minutes on parameter tuning, making sure that the covariances updated to approximately the correct levels, and added a simple filter to reject accelerometer updates during dynamic motion. I did just enough runtime performance work to get an update down to around 300us, which is just fine for a filter intended to run at 1kHz. More will remain as future work.
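The accelerometer rejection filter can be as simple as gating on how far the measured specific force is from 1g. A sketch of that idea, with a made-up tolerance rather than the hand-tuned value:

```python
import math

GRAVITY = 9.81  # m/s^2

def accept_accel_update(ax, ay, az, tolerance=0.2):
    """Gate the accelerometer measurement update: only let it correct
    the attitude when the measured specific force is close to 1g,
    i.e. when the robot is not accelerating hard.  The tolerance here
    is illustrative, not the hand-tuned value."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    return abs(magnitude - GRAVITY) < tolerance * GRAVITY
```

When the update is rejected, the filter simply coasts on the gyro until the accelerometer reading looks like gravity again.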
Here’s a plot from a quick sanity check, where I manually rolled the device in alternating directions, then pitched it in alternating directions. (When pitching, it was on a somewhat springy surface, thus the ringing).
The pitch and roll are plenty smooth, although they appear to perhaps not return exactly to their original positions. At some point, I will do a more detailed qualification to dial in the performance.
After getting the power to work, the next step in bringing up the new quad’s raspberry pi interface board is getting the FDCAN ports to work. As described in my last roadmap, this board has multiple independent FDCAN buses. There are 2 STM32G4s, each with 2 FDCAN buses, so that every leg gets a separate bus. A 5th auxiliary bus for any other peripherals is driven from a third STM32G4. All 3 STM32G4s communicate with the raspberry pi as SPI slaves.
Making this work was straightforward, if tedious. I designed a simple SPI based protocol that would allow transmission and receipt of FDCAN frames at a high rate in a relatively efficient manner, then implemented that on the STM32s. On the raspberry pi side I initially used the linux kernel driver, but found that it didn’t give sufficient control over hold times during the transmission. Since the SPI slave is implemented in software, I needed to leave sufficient time after asserting the chip select and after transmitting the address bytes. The kernel driver gives no control over this at all, so I resorted to directly manipulating the BCM2837’s peripheral registers and busy loop waiting in a real time thread.
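The shape of one such transaction looks roughly like the sketch below. The timing constants and the `FakeBus` stand-in are invented for illustration; the real code pokes the BCM2837 peripheral registers directly from a real time thread.

```python
import time

# Illustrative hold times only; the real values were tuned against the
# software SPI slave implementation on the STM32G4s.
CS_SETUP_S  = 3e-6   # delay after asserting chip select
POST_ADDR_S = 6e-6   # delay after the address bytes, before data

def busy_wait(duration_s):
    """Spin instead of sleeping: from a real time thread this gives
    microsecond-scale hold times that a kernel sleep cannot."""
    end = time.perf_counter() + duration_s
    while time.perf_counter() < end:
        pass

def spi_read(bus, address, rx_len):
    """Sketch of one read transaction: assert CS, hold, clock out the
    address, hold again, then clock in the data.  'bus' stands in for
    direct manipulation of the SPI peripheral registers."""
    bus.assert_cs()
    busy_wait(CS_SETUP_S)
    bus.write(bytes([address]))
    busy_wait(POST_ADDR_S)
    data = bus.read(rx_len)
    bus.release_cs()
    return data

class FakeBus:
    """Records the sequence of operations, for illustration."""
    def __init__(self, response):
        self.response = response
        self.log = []
    def assert_cs(self):
        self.log.append('cs_low')
    def release_cs(self):
        self.log.append('cs_high')
    def write(self, data):
        self.log.append(('write', data))
    def read(self, n):
        self.log.append(('read', n))
        return self.response[:n]

bus = FakeBus(b'\x01\x02\x03')
reply = spi_read(bus, 0x10, 2)
```

The two busy waits are exactly the hold times the kernel driver gives no control over; everything else is an ordinary SPI register read.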
After a decent supply of bugs were squashed, I got to a point where the host could send off 12 queries to all the servos with the four buses all being used simultaneously, then collating the responses back. I haven’t spent much time optimizing the cycle time, but the initial go around is at around 1.0ms for a query of all 12 devices which is about 1/3 of the 3.5ms I had in the previous single-bus RS485 version.
Here’s a scope trace of a full query cycle with 2 of the 4 CAN buses on the top, and the two chip selects on the bottom. Woohoo!
The next peripheral to get working on the quad’s raspberry pi interface board is the IMU. When operating, the IMU will primarily be used to determine attitude and angular pitch and roll rates. Secondarily, it will determine yaw rate, although there is no provision within the IMU to determine absolute yaw.
To accomplish this, the board has a BMI088 6 axis accelerometer and gyroscope attached via SPI to the auxiliary STM32G4 along with discrete connections for interrupts. This chip has 16 bit resolution for both sensors, decent claimed noise characteristics, and supposedly the ability to better reject high frequency vibrations as seen in robotic applications. I am currently running the gyroscope at 1kHz, and the accelerometer at 800Hz. The IMU is driven off the gyroscope, with the accelerometer sampled whenever the gyroscope has new data available.
My first step was just to read out the 6 axis values at full rate to measure the static performance characteristics. After doing that overnight, I got the following Allan Variance plot.
That gives the angular random walk at around 0.016 dps / sqrt(Hz) with a bias stability of around 6.5 deg/hr. The angular random walk is about what is specified in the datasheet, and the bias is not specified at all, but this seems really good for a MEMS sensor. In fact, it is good enough I could probably just barely gyrocompass, measuring the earth’s rotation, with a little patience. The accelerometer values are shown there too, and seem fine, but aren’t all that critical.
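For reference, an Allan deviation curve like the one described above can be computed from logged gyro data with the standard overlapping estimator. This is a sketch with illustrative parameters, not the exact analysis code used:

```python
import numpy as np

def allan_deviation(rates, sample_rate_hz, taus):
    """Overlapping Allan deviation of a rate signal (e.g. raw gyro
    output in deg/s).  Returns (tau, deviation) pairs."""
    dt = 1.0 / sample_rate_hz
    theta = np.cumsum(rates) * dt            # integrate rate to angle
    out = []
    for tau in taus:
        m = int(round(tau / dt))             # cluster size in samples
        if m < 1 or 2 * m >= len(theta):
            continue
        d = theta[2 * m:] - 2.0 * theta[m:-m] + theta[:-2 * m]
        avar = np.mean(d * d) / (2.0 * tau * tau)
        out.append((tau, np.sqrt(avar)))
    return out

# A constant rate input integrates to a straight line, whose second
# differences vanish, so its Allan deviation is zero at every tau.
devs = allan_deviation(np.ones(1000), 100.0, [0.1, 1.0])
```

On a log-log plot of the results, the angular random walk is read off the -1/2 slope region and the bias stability from the flat minimum of the curve.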
Next up is turning this data into an attitude and rate estimate.
The first thing I needed to get working on the new quad’s raspberry pi3 hat, was the input DC/DC power converter. One of the main functions of this board is to take the main DC bus voltage of around 20V, and provide the raspberry pi with 5V power.
In the previous iteration of this board, it was limited to a recommended maximum voltage of around 24V. As with all the components in my hardware revisions, I aimed to support a higher input voltage. Here I switched parts to the Diodes AP64351 so that I could get to a recommended maximum voltage of 32V (the part’s absolute max is 40V).
Normally, bringing up power isn’t all that interesting. Either it works, or some pin is obviously connected incorrectly and it doesn’t. However here, I had different behavior. When first powered on, the device kinda flickered the output to 5V or less maybe once every second or so. While probing with the multimeter, I found that when I probed the soft start selection pin, all of a sudden it started to seemingly work! Assuming the probe’s input impedance must be enough to do something, I soldered an SMD resistor in parallel with the soft start selection capacitor (after first trying more capacitance) and got it to work a little bit, but it was still flaky, cutting out whenever the raspberry pi started to draw significant current during the boot process.
The second instance of the board exhibited similar symptoms, except there I managed to accidentally short the soft start selection pin to 18V, likely toasting it. However, surprisingly, that seemed to get the chip into a working state!
After much thought (and I should have noticed it in the picture above), I discovered I had managed to populate the incorrect part, the AP64350, not the AP64351. The x50 version is mostly pin compatible, but has a frequency selection pin where the x51 has a soft start selection pin.
Replacing that chip with the desired part got everything working as expected!