
Walking and Maker Faire!

Alert!  I’m at Maker Faire Bay Area all weekend in the Mech Warfare area in Zone 2 (May 17-19, 2019 for you time travelers from the future).  Drop by and say hi!

If you were left in suspense last time, yes, the robot can walk!  Getting it to do so in a minimal way was relatively painless.  What I found, which hadn’t happened in earlier iterations, is that many types of dynamic motions would cause the lower leg belts to jump a tooth.  Needless to say, this was nearly universally fatal, as there is no direct position sensing of the lower leg.  This robot is heavy enough that my simulacrum 3d-printed timing belt pulleys just don’t cut it.

Well, there wasn't enough time to actually get better pulleys, so I just tuned the walking to be slow and gentle enough that nothing went awry.  Here's the first bit of a 13 minute video I took of it walking around and shooting targets.

Now that that was over with, I had a few minor things to finish up before heading out to Maker Faire.  I made some covers for the motors to keep BBs out.

dsc_0017

And I made a bracket so that I could attach the front and rear target panels to the shoulder joints:

dsc_0019

And here’s a glamour shot of the whole thing in fighting form!

DSC00016.JPG

Now that it was all ready, time to take it all back apart and pack it for shipping.

dsc_0025

And off to the airport I went!

Standing up and sitting down

Before SMMB could function in the Mech Warfare event it needed to be able to start and stop unattended.  That meant standing up and sitting down on its own.  Being the hack that this was, I went for a two-pronged approach.

The direct drive servos I have for the upper and lower legs are somewhat underpowered for a robot this size, especially when the machine is fully squatted down.  Also, the servos aren't really encapsulated at all, and there are plenty of leg configurations that can self-intersect, resulting in robot harm.

To avoid all of that, the first prong was to install a second pair of permanently affixed "resting legs", which means the robot never has to squat all the way down.  These are just some PVC pipe with squash balls glued onto the bottom.  I printed bottom plates for the chassis with small cutouts to hold the top of the PVC pipe.

dsc_2362

The second prong was a short startup and shutdown sequence.  The startup sequence first moves the motors from wherever they happen to be to some fixed distance above their intended idle position, purely on a servo-by-servo basis.  I don't yet have position feedback making it into the C++ application, but the servos themselves can interpolate from their current position, so this was a fair way to get the whole thing into a known state.  The only trick was that I had to linearly ramp up the power applied to the lower leg so that it didn't get stuck dragging along the ground.  A future solution that used position feedback, and thus IK, wouldn't have that problem.  The next part of the startup sequence just smoothly lowers the legs until the full robot weight is resting on them.  At this point, the robot is freestanding.

The shutdown sequence only does half of the inverse.  It smoothly raises the legs until there is no weight on the robot, then cuts power.
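For concreteness, here is a minimal sketch of what that startup and shutdown sequencing could look like.  The Servo API, function names, and all constants are invented for illustration; the real code differs.

// Hypothetical sketch of the startup/shutdown sequencing described above.
// The Servo API and all constants are illustrative, not the real mjmech code.
#include <cstdio>
#include <vector>

struct Servo {
  double idle_deg = 0.0;
  void SetPowerLimit(double fraction) { std::printf("power %.2f\n", fraction); }
  // The servo firmware interpolates from wherever it currently is.
  void MoveTo(double deg, double time_s) {
    std::printf("move to %.1f deg over %.1fs\n", deg, time_s);
  }
};

constexpr double kLiftDeg = 15.0;  // "fixed distance above idle", made up

void StandUp(std::vector<Servo>& legs) {
  // 1) Servo by servo, command a point a fixed distance above idle.  No
  //    position feedback is required; the servo interpolates on its own.
  for (auto& s : legs) {
    s.SetPowerLimit(0.3);
    s.MoveTo(s.idle_deg + kLiftDeg, 2.0);
  }
  // 2) Linearly ramp power so the lower legs don't get stuck dragging.
  for (int i = 3; i <= 10; ++i) {
    for (auto& s : legs) { s.SetPowerLimit(i * 0.1); }
    // ... wait one control cycle here ...
  }
  // 3) Smoothly lower the legs until the full robot weight rests on them.
  for (auto& s : legs) { s.MoveTo(s.idle_deg, 3.0); }
}

void SitDown(std::vector<Servo>& legs) {
  // Half the inverse: unload the legs, then cut power.
  for (auto& s : legs) { s.MoveTo(s.idle_deg + kLiftDeg, 3.0); }
  for (auto& s : legs) { s.SetPowerLimit(0.0); }
}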

Once I get a robot built fully with gearbox servos instead of direct drives, and position sensing working at full rate back in C++, I should be able to undo some of these hacks and get it starting from arbitrary configurations without the help of the resting feet.  However, this is good enough for now.


Connecting up the turret

With the turret functioning in isolation, now I needed to mount it on the robot and get things communicating.

Mounting was easy: I 3D printed a bracket that fits the turret on one side and mounts to the 4 hard mounts on the top of the gearbox chassis.

Turret mounted on chassis

More time consuming was updating the control software to communicate with it.  The old turret used the HerkuleX protocol, and when I integrated the moteus RS485 based servos, I took a few shortcuts that I knew would need to be resolved later.  And the future is now!

In any case, the class which operated the servo actually owned the RS485 transport, which had to be factored out so that the turret class could share it.  Also, I went above and beyond and tried to maintain the HerkuleX based operation too, in case I ever went back to the old chassis and turret.
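Roughly, the refactor has this shape (class names invented for illustration; the real mjmech classes differ): the transport is constructed once and handed to both the servo driver and the turret driver instead of being owned by the servo class.

// Sketch of sharing one RS485 transport between the servo and turret
// drivers.  All names here are invented stand-ins, not the mjmech API.
#include <cstdint>
#include <memory>
#include <utility>
#include <vector>

// Minimal interface for the multiplexed RS485 bus.
class Rs485Transport {
 public:
  virtual ~Rs485Transport() = default;
  virtual void Send(int id, const std::vector<uint8_t>& payload) = 0;
};

class ServoInterface {
 public:
  explicit ServoInterface(std::shared_ptr<Rs485Transport> transport)
      : transport_(std::move(transport)) {}
  void CommandPose(/* joint angles */) { /* transport_->Send(...) */ }
 private:
  std::shared_ptr<Rs485Transport> transport_;
};

class TurretInterface {
 public:
  explicit TurretInterface(std::shared_ptr<Rs485Transport> transport)
      : transport_(std::move(transport)) {}
  void CommandGimbal(double /*pitch_deg*/, double /*yaw_deg*/) { /* transport_->Send(...) */ }
 private:
  std::shared_ptr<Rs485Transport> transport_;
};

// Wiring: previously the servo class constructed and owned the transport
// internally.  Now it is created once and shared:
//   auto transport = std::make_shared<SerialRs485Transport>("/dev/ttyUSB0");
//   ServoInterface servos{transport};
//   TurretInterface turret{transport};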

Surprisingly, virtually no debugging was required once I got the basic functionality working.  The old joystick controls moved everything just as it should.


Gimbal control board revision

With the new gearbox based mech chassis for Super Mega Microbot 2, the old gimbal controller would need some updates.  It has these new features/capabilities:

  • Higher input voltage: The old system ran at 2S, so 7.4V nominal.  Now we're running at 5S, so 18.5V nominal.
  • RS485 data: The HerkuleX based robot used TTL level data communications.  moteus uses RS485.
  • Daisy chained power: With the new Raspberry Pi based computer in the turret, I now need an additional power and data port up on the mobile part of the turret.
  • No camera passthrough: Similarly, since the camera is directly attached to the Raspberry Pi 3, I don't need to mess with having a connector to pass it through anymore.
gimbal_image.png
PCB rendering

As usual, I sent it off to MacroFab and waited.  A seemingly very short time later and poof, here it was!

dsc_2213

Bringing this up was more annoying than it could have been, mostly from a software perspective.  The moteus and imu junction firmware were both based on the original gimbal software, but refactored to be usable across different projects.  At the same time, that was where I had developed the RS485 based multiplex transport library.  So, now was the time to bite the bullet and convert the gimbal software to use those common libraries.

Since the gimbal board has another unique processor compared to everything else, I broke it out into a separate git repository.

The old project was initially CubeMX based.  When porting to rules_mbed and moteus/mjlib, I was in a hurry, so I just copied and pasted much of the CubeMX initialization into the new tree and didn't use any mbed APIs at all.  It took me a while to remember how all the CubeMX initialization was glued together and which pieces of it were relevant before all the peripherals started working properly.
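As a reminder of the general shape of that glue (this is the generic CubeMX pattern, shown for illustration only; the actual gimbal firmware is organized differently), the generated code boils down to HAL_Init, the clock configuration, and a series of MX_*_Init calls before the application runs:

// Generic shape of CubeMX-generated startup code, for illustration only.
// Which MX_*_Init functions exist depends on the peripherals enabled in the
// CubeMX project; the ones below are examples.
#include "stm32f4xx_hal.h"

extern void SystemClock_Config(void);    // generated: clock tree / PLL setup
extern void MX_GPIO_Init(void);          // generated: pin alternate functions
extern void MX_I2C1_Init(void);          // generated: peripheral configuration
extern void MX_USART1_UART_Init(void);

int main(void) {
  HAL_Init();                // HAL state and SysTick
  SystemClock_Config();      // the part that is easy to get wrong by hand
  MX_GPIO_Init();
  MX_I2C1_Init();
  MX_USART1_UART_Init();

  // ... application code (control loops, telemetry) takes over from here ...
  for (;;) {
  }
}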

I then proceeded to mechanically integrate it into the unused turret.

dsc_2302
Mounted in bottom of turret
dsc_2304
Fully assembled turret

I once again had to remember how to calibrate and operate the thing.  Doing this once every 9 months is kind of painful!  However, I did manage to get it all working again, and ready to be integrated onto the mech.


Mech Warfare 2019 – at Maker Faire Bay Area

As an intermediate forcing function, I’ve been preparing Super Mega Microbot to enter the Mech Warfare event at Maker Faire Bay Area May 17-19 in San Mateo, CA!  The Mech Warfare event is a competition where scale size “mechs” or robots compete with airsoft cannons in a scaled down cityscape.  Teleoperation is allowed, but the human operators are only allowed to see “what the mech sees”, which means driving from a video screen.

My goal for some time has been to try and improve the dynamic motion of my quadruped robot.  Entering Mech Warfare isn't directly in support of that, but it is a fine and fun diversion.  That isn't to say that the current quadruped iteration is necessarily a great fit.  I've only recently gotten it walking on new actuators at all, and only have enough gearboxes for the lateral joints.  The current iteration is also a little, ahem, shall we say large, compared to other entrants.  Also, there are a lot of "non-core" technologies that have to operate for the robot to compete effectively in a Mech Warfare event: the airsoft cannon, remote video display, armor, unattended robust operation, etc.  On balance though, it is probably only a couple of weeks of total diversion and is a good forcing function to make something that actually works end to end.

At this point, I’m clearly in “make it work at any cost on a short time frame” mode … err… “demo” mode.  Coming up are some posts describing all the pieces necessary for the event, followed by hopefully an event debrief!

So, if you’re in the area, come on down to Maker Faire in San Mateo May 17-19 2019 and check it out!

IMG_20160410_131037167
Super Mega Microbot (1) and others at Mech Warfare 2016

Super Mega Microbot in Robogames 2016

Earlier in April we took Super Mega Microbot out to California to compete in Mech Warfare during Robogames 2016. Thanks to the R-TEAM organizers who made the event happen this year. We were really excited, and the event as a whole went off really well! There were a lot of functional mechs attending, and many fights that were exciting to watch.

20160425-mech-entrants
Most of the mechs which competed


20160425-mech-operators.jpg
And their human operators


We managed to play 5 official matches in the double elimination tournament, finishing in 3rd place overall. When it worked, SMMB worked really well. Our first loss was a very close battle: the scorekeeping system had us winning by 2, and the judges had us losing by 2. (The scoring system wasn't super reliable, so there were human judges calling hits.) Our second loss came when the odroid's USB bus on SMMB stopped working mid-match, causing us to lose camera and wifi.

Takeaways

Since our last matches, we tried to improve a number of things; while some worked, not all of them are entirely successful yet:

  • Faster walking: The new mammal chassis is about twice as fast as the old lizard one, but we didn’t get much time to make it work really well, so we were still one of the slower mechs at Robogames. Also, the shoulder bracket, even on its second revision, still had several partial failures during matches and will need to be rebuilt in metal to be strong enough.
  • Stabilized camera: The new gimbal stabilized turret actually worked really well. We were able to reliably hit moving targets from the full length of the arena while in motion. It still has room for improvement, but overall was very reliable.
  • 5GHz Video transport: We updated our video to use a custom protocol over multicast 5GHz wifi, so that we could completely control the amount of link layer retransmissions. When it worked, this worked very well. We were able to get 720p video with 200ms latency, even in the presence of significant interference. However, adding the external 5GHz wifi card to our odroid seems to have made the USB bus overall somewhat unstable, and one of our matches ended prematurely when the entire USB port died, taking our camera and wifi with it.
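The core idea is simple enough to sketch: encoded frames go out as UDP datagrams to a multicast group, so the link layer never retries on our behalf and any retransmission policy lives entirely in the application. The toy below (POSIX sockets, made-up addresses and framing) is only an illustration of that idea, not the protocol mjmech actually uses.

// Toy illustration of application-controlled video transport over multicast
// UDP.  The group address, port, and framing are all made up.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <vector>

int main() {
  const char* kGroup = "239.255.0.1";  // arbitrary local multicast group
  const uint16_t kPort = 5000;

  int fd = socket(AF_INET, SOCK_DGRAM, 0);

  // Keep datagrams on the local segment.  Multicast UDP gives us exactly
  // what we want here: no link-layer retransmission on our behalf.
  unsigned char ttl = 1;
  setsockopt(fd, IPPROTO_IP, IP_MULTICAST_TTL, &ttl, sizeof(ttl));

  sockaddr_in dest{};
  dest.sin_family = AF_INET;
  dest.sin_port = htons(kPort);
  inet_pton(AF_INET, kGroup, &dest.sin_addr);

  // Pretend this is one encoded video frame.  In practice it would be split
  // into MTU-sized chunks with sequence numbers, so the receiver can decide
  // whether anything is worth asking for again.
  std::vector<uint8_t> frame(1200, 0);
  sendto(fd, frame.data(), frame.size(), 0,
         reinterpret_cast<const sockaddr*>(&dest), sizeof(dest));

  close(fd);
  return 0;
}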

Matches

Thanks to Kevin from R-TEAM, we managed to capture overhead video of all our matches, and have the video as seen on our operator console for each official match as well.

Match 2 – vs HD3

Match 9 – vs Odin

Match 11 – vs Immortal

Match 14 – vs TwitchMX

Match 17 – vs Odin

Functional gimbal stabilized Mech Warfare turret

Well, that took longer than I expected! I last showed some progress on a gimbal stabilized turret for Mech Warfare competitions more than six months ago. Due to some unexpected technical difficulties, it took much longer to complete than I had hoped, but the (close to) end result is here!

20160206-turret-overview
Complete gimbal mounted turret

Here’s a quick feature list:

  • 2 axis control: Yaw and pitch are independently actuated.
  • Brushless: Each axis is driven by a brushless gimbal motor for high bandwidth no-backlash stabilization.
  • Absolute encoders: Each axis has an absolute magnetic encoder so that accurate force control can be applied to each gimbal, even at zero speed.
  • Fire control: High current outputs for driving an AEG motor and an agitator motor, plus a low current output for a targeting laser, are present.
  • 7V-12V input: Supports 2S-3S lipo power sources.
  • 12V boost: When running from a 2S lipo, it can boost the gimbal drive up to 12V for additional stabilization authority.
  • HerkuleX protocol: The primary control interface uses the native Dongbu HerkuleX protocol; support for other UART based protocols that work at 3.3V CMOS levels should be easy to add.
  • USB debugging support: A USB port is present to return high rate debugging information and allow configuration and diagnostics to be performed.
  • Open source: All design and firmware files are Apache 2.0 licensed on github: https://github.com/mjbots/mjmech/tree/master/hw/gimbal.

You can see the turret’s basic operations in a quick video here:


Design

The design is driven by the bill of materials selection. The primary components of the gimbal are as follows:

  • Turnigy HD 3508 Gimbal Motor: Both axes use this gimbal motor from HobbyKing, which has sufficient power to stabilize a 600g turret.
  • Frame: The mechanical frame is a Shapeways strong-and-flexible printed part.
  • STM32F411: A fast 32 bit microcontroller with support for all the peripherals that are necessary.
  • TPS62172: The primary 3.3V regulator which powers the microcontroller and all the other 3.3V parts.
  • TPS55330: The 12V boost regulator, which when enabled, powers the gimbal motors.
  • MC33926: A 2 channel motor driver used for fire control, it powers both the AEG and agitator motor outputs.
  • DRV8313: Two of these integrated BLDC drivers are used, one powering each gimbal motor.
  • AS5048A/B: These absolute magnetic encoders are used to measure the actual position of the pitch and yaw gimbals.
  • BMI160: This IMU is used as the primary source of inertial compensation data. The board hardware supports a second IMU, to be placed on the main robot, but the firmware does not yet support that configuration.

Boards

This gimbal design contains three custom boards: a breakout board for the BMI160 IMU, a breakout board for the AS5048B magnetic encoder, and the primary board which contains the rest of the logic.

BMI160 Breakout

20160206-bmi160-breakout.jpg
Completed BMI160 breakout board, assembled by MacroFab

The first board is simple; it is basically just a breakout board for the BMI160 inertial sensor. It provides the BMI160 itself, some decoupling capacitors, and a 0.1 inch 4 pin connector for the I2C bus.

I had these prototypes made at MacroFab which I highly recommend as a great provider of low-cost turnkey PCB assembly.
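A quick way to smoke test a board like this after assembly is to read back the chip ID register over I2C; on the BMI160, register 0x00 should return 0xD1. Here is a rough Linux userspace sketch of that check, where the bus path and the 7-bit address (0x68) are assumptions about the wiring, not details from the original build.

// Rough I2C smoke test for a BMI160 breakout from Linux userspace.
// The bus device and 7-bit address are assumptions.
#include <fcntl.h>
#include <linux/i2c-dev.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <cstdio>

int main() {
  int fd = open("/dev/i2c-1", O_RDWR);
  if (fd < 0) { perror("open"); return 1; }
  if (ioctl(fd, I2C_SLAVE, 0x68) < 0) { perror("ioctl"); return 1; }

  // Register 0x00 is CHIP_ID; a healthy BMI160 reads back 0xD1.
  unsigned char reg = 0x00;
  unsigned char id = 0;
  if (write(fd, &reg, 1) != 1 || read(fd, &id, 1) != 1) {
    perror("i2c transfer");
    return 1;
  }
  std::printf("chip id: 0x%02X (expect 0xD1)\n", id);

  close(fd);
  return 0;
}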

AS5048B Breakout

20160206-as5048b-breakout-small
AS5048B breakout board


This, like the BMI160 breakout board, just has decoupling capacitors, the chip itself, and connectors. It additionally has mounting holes designed to fit onto the 3508 gimbal motor. This was printed at OSH Park and hand-assembled.

Gimbal control board

20160206-gimbal-control.jpg
Completed primary gimbal control board (r2), assembled by MacroFab

The primary gimbal control board contains most of the system functionality. It is designed to mechanically mount directly above the yaw gimbal motor, as the yaw absolute magnetic encoder is in the center on the underside of the board.

This prototype was also built at MacroFab, who did an excellent job with this much more complex assembly.

The connectors and features are as follows:

  • Power and Data: A 4 pin JST-XH connector in the upper right brings in power and data from the main robot.
  • Debug USB: A debugging protocol is available on this micro-USB port.
  • Camera USB: Two 4 pin JST-PH connectors provide a convenience path for the camera USB. The turret’s camera connects to the top connector, and the main robot connects to the side facing connector.
  • I2C peripherals: Three 4 pin JST-ZH connectors with identical pinouts connect to external I2C peripherals. These are used for the primary IMU, the pitch absolute magnetic encoder, and the optional secondary IMU.
  • Arming switch: This switch is connected directly to the enable pin on the MC33926, and is also connected to an input on the STM32F411.
  • Programming connector: The 6 pin JST-PH connector has the same pinout as Benjamin Vedder’s VESC board, and can program and debug the STM32F411.
  • Weapon connector: A 2×4 0.1 inch pin header has power lines for the AEG drive, the agitator drive and the laser. It has an extra row of pins so that a blank can be used for indexing.
  • Gimbal connectors: Two 3 pin 0.1 inch connectors power the yaw and pitch gimbal brushless motors.

Firmware

struct Config {
  uint8_t address = 0xd0;
  uint16_t rate_hz = 800;
  uint16_t gyro_max_dps = 1000;
  uint8_t accel_max_g = 4;

  Euler offset_deg;

  template <typename Archive>
  void Serialize(Archive* a) {
    a->Visit(MJ_NVP(address));
    a->Visit(MJ_NVP(rate_hz));
    a->Visit(MJ_NVP(gyro_max_dps));
    a->Visit(MJ_NVP(accel_max_g));
    a->Visit(MJ_NVP(offset_deg));
  }

  Config() {
    offset_deg.yaw = 90.0f;
  }
};

Sample configuration structure

The firmware was an experiment in writing modern C++11 code for the bare-metal STM32 platform. Each module interacts with others through std::function-like callbacks, and the entire system is compiled both for the target and the host, so that unit tests can be run. Dynamic memory allocation is this close to being disabled, but it was necessary for newlib's floating point number formatting routines, which just allocate a chunk of memory the first time you use them. Otherwise, there is no dynamic memory used at all.

It relies on a CubeMX project template for this board. Most of the libraries CubeMX provides have too little flexibility to be used for this application, so much of the bit twiddling is re-implemented in the gimbal firmware. CubeMX is great for configuring the clock tree and pin alternate functions however, especially in a complex project like this.

Both configuration and telemetry rely on a templated C++ visitor pattern to perform compile time reflection over mostly arbitrary C++ structures. Any module can register a structure to be used for persistent configuration. Those structures can be changed through the debugging protocol, and can be written to and read from flash at runtime. Each module can also register as many telemetry structures as necessary. These can be emitted over the debugging protocol either at fixed intervals, or whenever they update.
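To make the visitor idea concrete, here is a stripped-down stand-in (not the real mjlib implementation): a structure exposes a templated Serialize method over name/value pairs, and any archive, here one that just prints, can walk it at compile time. A flash writer or the debug protocol parser would simply be different archives over the same structures.

// Minimal stand-in for the visitor-based reflection described above.  The
// real mjlib macros and archives are more capable; this only shows the shape.
#include <cstdio>

// A name/value pair, normally produced by a macro like MJ_NVP(field).
template <typename T>
struct Nvp {
  const char* name;
  T* value;
};
#define MY_NVP(x) Nvp<decltype(x)>{#x, &x}

struct GimbalConfig {
  float yaw_offset_deg = 90.0f;
  int rate_hz = 800;

  template <typename Archive>
  void Serialize(Archive* a) {
    a->Visit(MY_NVP(yaw_offset_deg));
    a->Visit(MY_NVP(rate_hz));
  }
};

// One possible archive: dump every field as text.
struct PrintArchive {
  template <typename T>
  void Visit(Nvp<T> nvp) {
    std::printf("%s = %g\n", nvp.name, static_cast<double>(*nvp.value));
  }
};

int main() {
  GimbalConfig config;
  PrintArchive archive;
  config.Serialize(&archive);  // prints: yaw_offset_deg = 90, rate_hz = 800
  return 0;
}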

IMU stabilization

The IMU is converted into attitude through use of a simple complementary filter, in the same spirit as some of Seb Madgwick’s algorithms. This is then fed into a control loop for each axis’s gimbal.
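In spirit the filter is only a couple of lines per axis: integrate the gyro for the short term, and pull slowly toward the accelerometer-derived angle for the long term. A generic sketch, not the firmware's exact implementation:

// Generic complementary filter for one axis (e.g. pitch).  Gains and the
// accelerometer angle formula are illustrative, not the gimbal firmware.
#include <cmath>

struct ComplementaryFilter {
  static constexpr double kPi = 3.14159265358979323846;

  double angle_deg = 0.0;
  double alpha = 0.98;  // trust the gyro 98%, the accelerometer 2%

  // gyro_dps: angular rate about this axis; ax, ay, az: accelerometer in g.
  void Update(double gyro_dps, double ax, double ay, double az, double dt_s) {
    // Short term: integrate the gyro.
    const double gyro_angle = angle_deg + gyro_dps * dt_s;
    // Long term: the gravity vector gives an absolute, but noisy, angle.
    const double accel_angle =
        std::atan2(-ax, std::sqrt(ay * ay + az * az)) * 180.0 / kPi;
    // Blend: high-pass the gyro, low-pass the accelerometer.
    angle_deg = alpha * gyro_angle + (1.0 - alpha) * accel_angle;
  }
};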

There are three possible modes, the first of which is what I call “open-loop”, and is based on the same principles as the BruGi brushless gimbal, where no absolute motor feedback is available. In that mode, a PID controller operates with the axis error as the input, and the output is the actual phase position of the BLDC controller. In this mode, the integral term does most of the work in stabilization, so the overall performance isn’t great.

The second mode still uses a PID controller, but now the output is an offset to the BLDC phase necessary to hold the current position as measured by the absolute encoders. This effectively makes the output a direct mapping to force applied to the motor, although of course a non-linear mapping. This mode results in much better overall performance and is easier to tune.

Finally, there is a third debugging mode that lets you just hard command specific BLDC phases. This is useful for calibrating the mapping between BLDC phase and absolute encoder phase.
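A simplified sketch of that second mode's control flow, with invented helper names and a made-up pole count: the PID operates on attitude error, and its output is applied as an electrical phase offset on top of the phase that would hold the encoder-measured position.

// Simplified sketch of the encoder-referenced stabilization mode described
// above.  Names and the pole-pair count are assumptions, not the firmware.
struct Pid {
  double kp = 0.0, ki = 0.0, kd = 0.0;
  double integral = 0.0;
  double last_error = 0.0;

  double Update(double error, double dt) {
    integral += error * dt;
    const double derivative = (error - last_error) / dt;
    last_error = error;
    return kp * error + ki * integral + kd * derivative;
  }
};

// One control cycle for one axis.  Returns the commanded electrical phase.
double GimbalPhaseCommand(Pid& pid,
                          double desired_deg, double imu_deg,
                          double encoder_deg, double dt) {
  constexpr double kPolePairs = 7.0;  // typical for a gimbal motor; assumed
  // The electrical phase that would hold the current mechanical position.
  const double hold_phase_deg = encoder_deg * kPolePairs;
  // PID on attitude error; its output is a phase *offset*, which maps
  // (non-linearly) to torque on the rotor.
  const double offset_deg = pid.Update(desired_deg - imu_deg, dt);
  return hold_phase_deg + offset_deg;
}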

tview

The debugging protocol is partially human readable, but telemetry data is encoded in the same binary format as used elsewhere in the mjmech codebase. tview is the debugging application we use to read that data, as well as configure and control the overall system.

20160206-tview-window.png
tview window

The bottom pane just has a serial console, where you can send arbitrary things over the virtual serial port. tview directly supports relatively few commands from the debugging protocol, and for instance has no UI to operate the stabilizer or fire control, so for now these are done by hand in that window.

The left pane has two tabs, one with a configuration tree and the other with a telemetry tree. The configuration tree shows all structures which were registered as configurable, and allows you to change them in the live system. The telemetry tree shows all structures registered as telemetry structures, and reports their values live as the system is operating.

The right pane has a live plot window where any of the values in the telemetry tree can be plotted versus time. It is just an embedded matplotlib plot, so all the normal plot interaction tools are available, plus some from mjmech’s tplot, like the ability to pan and zoom the left and right axes independently.

System video

And last but not least, here is a short video demonstrating the turret stabilizing a camera and firing some blanks at a target as our mech walks around.


Progress on Super Mega Microbot

I have some incremental progress to report on various parts of Super Mega Microbot. First, I have a draft fully assembled leg for a mammal walking configuration. It is mostly just the stock Dongbu brackets, with a custom Shapeways print at the final joint holding a standoff and rubber stopper.

20150719-mammal-leg.jpg
Prototype mammal jointed leg

Second, I’ve been working on a gimbal stabilized turret. I have video from a prior incarnation below:

And a first draft of a 3D printed turret bracket that permits a full range of motion of the turret:

20150719-gimbal-bracket
3D printed turret gimbal bracket