The initial implementation of auxiliary encoder support in moteus handled exactly one encoder, the AS5048B. The hardware can support any I2C-based encoder, so supporting additional encoders has always been on the TODO list.
I’m excited to announce that as of firmware release 2021-12-03, AS5600 encoders are now supported as well. They are a lot cheaper than the AS5048, since they have a much lower update rate and resolution, but that isn’t necessarily a problem if the encoder is only used to disambiguate a modest gear reduction.
The moteus controller uses an absolute magnetic encoder to sense the position of the rotor in order to conduct field oriented control of the motor. In many applications, this sensing is also sufficient to measure the output as well, particularly in direct drive applications. However, if the controller is driving the output through a gear reduction, multiple turns of the input are necessary to make one turn on the output. At power on, this results in an ambiguity, where the controller doesn’t know where the output is.
There are a couple of possible solutions to this. One is to do what the quad A1 does and have a “known turn-on position”. Another would be to have a rigid end stop and use a homing procedure on startup. Yet another would be to have a non-backdrivable mechanism and remember in the host application how many revolutions had been taken.
What I’m going to cover here is yet another solution to this problem: an auxiliary encoder. In this approach, a second absolute encoder measures the position at the output directly, resolving all ambiguity. All production moteus controllers have had a connector named ABS, unused to date, with pins intended for I2C. As of revision 2021-04-09, moteus can now use these pins to read the position from an AS5048B absolute magnetic encoder.
After reading the encoder, moteus uses the value for two purposes. First, it reports the measured value out over both the diagnostic and register interfaces, so that host applications can use it. Second, it can optionally be used to initialize the value of “unwrapped_position” at startup.
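The startup disambiguation can be sketched in a few lines: given the rotor’s absolute position (known only modulo one rotor turn), the gear reduction, and the auxiliary encoder’s reading at the output, pick the rotor turn count that best agrees with the output reading. This is a minimal illustration of the idea, not moteus’s actual firmware logic, and all names here are hypothetical:

```python
def resolve_output_position(rotor_frac, abs_output_frac, ratio):
    """Pick the output position consistent with both encoders.

    rotor_frac: rotor absolute position in [0, 1) turns (FOC encoder).
    abs_output_frac: output absolute position in [0, 1) turns (auxiliary encoder).
    ratio: integer gear reduction (input turns per output turn).
    """
    def circ_dist(a, b):
        # Distance between two positions on a unit circle of "turns".
        d = abs(a - b) % 1.0
        return min(d, 1.0 - d)

    # Each possible whole-turn count k of the rotor implies a distinct
    # candidate output position; choose the one nearest the aux reading.
    candidates = [(k + rotor_frac) / ratio for k in range(ratio)]
    return min(candidates, key=lambda c: circ_dist(c, abs_output_frac))
```

Note that the auxiliary encoder only needs enough resolution to tell the `ratio` candidates apart, which is why a coarse, cheap encoder suffices for modest reductions.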
Then you connect the encoder to the moteus (while powered off) and install the breakout board facing a magnet. Here, I made a simple 3D printed belt reducer:
Now we can go into tview and configure things. First, we set abs_port.mode to the value for an AS5048B (1), per the reference manual. Then we configure an offset using abs_port.position_offset, and finally arrange for the position to be initialized from this encoder at startup with servo.rezero_from_abs.
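In the tview diagnostic console, that configuration might look like the following (the offset value is illustrative and depends on your magnet placement; `conf write` persists the settings):

```
conf set abs_port.mode 1
conf set abs_port.position_offset 0
conf set servo.rezero_from_abs 1
conf write
```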
In my previous experiments demonstrating torque feedback (full rate inverse dynamics, ground truth torque testing), I’ve glossed over the fact that as the stator approaches magnetic saturation, the linear relationship between torque and current breaks down. Now I’ll finally take at least one step towards allowing moteus to work accurately in the torque domain as motors reach saturation.
The stator in a motor consists of windings wrapped around a core, usually iron. The iron in the core contains lots of little sub-domains of magnetized material that are normally randomly oriented, resulting in a net zero magnetic field. As current is applied to the windings, those domains line up, greatly magnifying the resulting magnetic field. Eventually most of the sub-domains are aligned, at which point you don’t get any more magnifying effect from the iron core. In this region, the stator is said to be “saturated”. You can read about it in much more depth on Wikipedia or with even more detail here. The end result is a curve of magnetic field versus applied current that looks something like this:
To date, moteus assumes that you are operating completely in the “Linear” region, where the torque and current are linearly related.
Operating in the Rotation Region
To operate in the “rotation” region I ended up using the following formula:
Here, I is the input current, K_T is the motor torque constant, and the remaining three constants are ones I fit to measured torque data. With some approximations, this can be calculated relatively efficiently on the STM32G4 that drives the moteus controller, adding only a microsecond to the overall loop time to go in both directions.
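The formula itself appeared as an image that isn’t reproduced here, so the sketch below is only illustrative of the general shape: a current-to-torque mapping that is linear up to a saturation current and logarithmic beyond it, with a matching inverse for commanding torque. The function and parameter names are hypothetical, not moteus’s actual configuration values:

```python
import math

def current_to_torque(i, kt, i_sat, scale, log_gain):
    # Hypothetical saturation model: linear below i_sat, logarithmic above.
    sign = math.copysign(1.0, i)
    i = abs(i)
    if i <= i_sat:
        torque = kt * i
    else:
        torque = kt * i_sat + log_gain * math.log2(1.0 + (i - i_sat) * scale)
    return sign * torque

def torque_to_current(tau, kt, i_sat, scale, log_gain):
    # Exact inverse of the mapping above, for use when commanding torque.
    sign = math.copysign(1.0, tau)
    tau = abs(tau)
    linear_max = kt * i_sat
    if tau <= linear_max:
        i = tau / kt
    else:
        i = i_sat + (2.0 ** ((tau - linear_max) / log_gain) - 1.0) / scale
    return sign * i
```

The important property is that the forward and inverse functions round-trip exactly, so the controller can command a torque, convert it to a phase current, and report back the same torque it was asked for.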
I then ran a torque sweep with my load-cell fixture from before, and sure enough, the input and output torque match much better now across the entire range of operation, despite the fact that the phase current needs to start growing very rapidly near the top end:
My initial design torque for the qdd100 was a little over 17 Nm. However, when I did my first ground truth torque testing, I found that some servos had a lower maximum torque than I had specified. While working to diagnose those, I built a qdd100 that used an alternate stator winding of 105Kv instead of the 135Kv in all the beta units. The Kv rating of a stator describes how fast the motor will spin for a given applied voltage. If you assume the same copper mass of wiring, a lower Kv means thinner wire wrapped around the stator for more turns (or fewer strands in parallel), while a higher Kv means thicker wire with fewer overall turns.
On paper, if you assume a perfect controller, this shouldn’t make much of a difference. The same input power should be required for the same output torque. The only differences should come into play once you have a controller with either a limited maximum voltage or a limited maximum current. The higher Kv motor will be able to go faster given a fixed maximum voltage, and the lower Kv motor will have more torque for a given maximum current.
I wanted to verify that this was true as part of my evaluation to identify the cause of my decreased torque, so I used a slightly upgraded torque testing fixture:
For now, I rigged up the world’s cheapest load cell from Amazon to a Nucleo configured to report the load in grams over the serial port. I also wired up my Chroma power supply over USB using the Linux USBTMC driver. With those two things hooked up, I was able to run tests that swept across torque commands while recording output torque, phase current, and input power.
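Since the load cell reports grams, converting a reading into a torque only requires the length of the lever arm it pushes on. The helper below is my own sketch; the lever arm length is an assumed parameter, not a value from the post:

```python
def grams_to_torque_nm(grams, lever_arm_m, g=9.80665):
    # The load cell reports an equivalent mass; force = m * g,
    # and torque = force * lever arm.
    return grams * 1e-3 * g * lever_arm_m
```

So a 1000 g reading on a 10 cm arm corresponds to roughly 0.98 Nm.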
At higher torques, the input power was pretty sensitive to the temperature of the windings — hotter windings increased the resistance, which increased the power required to achieve a given phase current, thus my plot isn’t perfect as it was grabbed over several different runs. For the highest power samples I couldn’t use my Chroma, as it is limited to around 600W. Thus those samples don’t record the input power.
Plotting the input power vs output torque on the same chart shows that indeed, modulo some measurement error, they are the same for the two stators:
So, this experiment reaffirmed my understanding of stator magnetics and confirmed that the stator winding was not the cause of my decreased torque.
Because my working environment is otherwise too idyllic and peaceful, I’ve been running the new moteus servo mk2 through its paces. All day long. 8 hours a day.
This is the same test I ran to verify the controller, only now I’ve run it several times longer to get a better feel for whether there are any weak links. Somewhat surprisingly, the ball doesn’t drop all that often, only about once every hour or two.
Earlier I described my design plan for reducing the overall mass of the moteus servo mk2. Constructing a prototype turned out to take many more iterations and much more time than I had expected! Along the way I produced and scrapped two front housings, two outer housings, and a back housing.
I made one complete prototype which only had the weight reduction applied to some of the parts and lacked a back cover and any provision for a wire cover. It was the one from the moteus controller r4.1 juggling video:
I also had to get new workholding solutions for the PocketNC in the form of the wcubed vise.
Every piece was either reworked in some manner or, for the parts that did not previously exist, designed from scratch.
Front housing: Here I iterated on how much material to remove from the central cavity. Initially I removed more, but that caused problems by leaving the primary output bearing only intermittently loaded. I also had adhesion problems with the ring gear when too little material was left there. I settled on a continuous ring for the output bearing and a decent amount of material for the internal gear.
Back housing: I tweaked the back housing mounting points so that the outer housing could be symmetric. Also, I added a facility for the wire cover to guard the phase wires entering the controller.
Outer housing: The outer housing was largely unchanged from my initial weight reduced design, although I produced one bad one due to a simple mistake locating the mounting hole, and a second because the stud lengths between the front and back were different in an earlier iteration.
Planet output: The planet output design changed only to add some weight-reducing cutouts. This was the last part for which I was still using mk1 servo spare parts, so now I have actually manufactured a prototype in house.
Planet input: This now has weight-reducing cutouts, and the mating studs use less material.
Back cover: The back cover design is basically unchanged; I just had to make one for the first time.
Wire cover: The wire cover is a part of the design I had deferred until now. It bolts to the back housing and shrouds the phase wires.
And tada! It can! Well, at least a little bit. I’ve only spent a short while with the gearbox based chassis, and have a lot of work left to do. However, here’s a quick video showing it walking around, slipping on a ruler, and almost falling over a few times.
After completing one gearbox, I needed to build at least 4 more of them to replace the lateral servos on Super Mega Microbot 2. So, I got to work. First, I disassembled 5 more BE8108 motors.
Then, I drilled out the rotors, this time using the mill at AA.
Next I removed the stators from their backing. This was painful enough last time that I tried a new technique using the mill to do most of the work. Unfortunately, one of the stators was critically damaged during my initial experimentation, so I am now down to 4 survivors.
I went and printed 5 copies of all the printed parts:
After finally getting the darned thing apart, and printing a new outer housing, I went about re-assembling the whole mechanism. This time, I tried to take care to make the future disassembly less painful.
To start with, I filed down the problematic outer bearing interfaces of the sun gear holder so that the bearings were a slip fit over them. These two interfaces don’t need to be particularly snug, so that was easy enough, if monotonous, to accomplish. I also machined out some pockets around the magnet hole, to make it possible to just hot-glue the position magnet in place and extract it more easily later.
Next, I re-installed the sun gear holder back in the rotor.
After that, I pressed the input bearing into the new planet input:
Then I went about installing the shaft output bearing into the planet output, the planet output into the output bearing, the planet shafts into the planet output, the planet bushings into the planets, and the planet bearings into the bushings.
Those got dropped onto the shafts, and the planet input was stuck into place.
After that, the screws were installed in the planet input, and the stator was fit onto the front housing, using a shrink fit again:
At this point, I aligned the rotor and pressed it and the primary shaft into place.
Now I used my paper strip alignment technique to get the rotor properly (or at least functionally) spaced from the stator.
At this point, the rotor still didn’t spin freely. Because of all the rework I’ve done, and my sloppiness in executing it, bits of the exploded bearings and other detritus had lodged themselves against the rotor and stator. The problematic pieces were small, under 5 thousandths of an inch, but still plenty to cause the rotor to hang when spinning. These I carefully extracted under a microscope with a pair of tweezers.
At this point, I had a gearbox that spun freely and seemed mostly correct!