bazel-ifying simple autoconf packages

This is part N in a series describing how I created the bazel infrastructure to build all the third party packages for mjmech.

We left off with the first, very simple packages configured to build with bazel.  In this installment we will tackle those that require at least minimal configuration, i.e. those that have some files which are normally generated as part of the build process.

snappy / template_file

snappy (https://github.com/google/snappy), a compression library, is largely a pure C++ project, but it does have a single generated header file.  There are a few possible options within bazel to handle this.  The simplest would be to use a macro, although that has complications with respect to namespacing and label resolution.  Here, I created a custom rule called “template_file”.

As seen in the source on github, the interface is relatively simple:

template_file(
  src,
  is_executable,
  substitutions,
  substitution_list,
)

There are two possible forms for passing in substitutions: either a string_dict when using “substitutions”, or a list of KEY=VALUE strings in “substitution_list”.  The two forms exist because Starlark can make dictionaries awkward to work with, so a list is often more convenient.
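Under the hood, a rule like this is a thin wrapper around ctx.actions.expand_template.  Here is a minimal sketch of how such a rule can be implemented (the real source linked above may differ in details):

def _template_file_impl(ctx):
    # Merge the dict form and the KEY=VALUE list form into one dict.
    substitutions = dict(ctx.attr.substitutions)
    for item in ctx.attr.substitution_list:
        key, _, value = item.partition("=")
        substitutions[key] = value

    ctx.actions.expand_template(
        template = ctx.file.src,
        output = ctx.outputs.out,
        substitutions = substitutions,
        is_executable = ctx.attr.is_executable,
    )

template_file = rule(
    implementation = _template_file_impl,
    attrs = {
        "src": attr.label(allow_single_file = True),
        "substitutions": attr.string_dict(),
        "substitution_list": attr.string_list(),
        "is_executable": attr.bool(default = False),
    },
    # The output file is named after the rule itself.
    outputs = {"out": "%{name}"},
)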

The snappy build rule uses this to generate snappy-stubs-public.h:

template_file(
    name = "snappy-stubs-public.h",
    src = "snappy-stubs-public.h.in",
    substitutions = {
        "${HAVE_STDINT_H_01}": "1",
        "${HAVE_STDDEF_H_01}": "1",
        "${HAVE_SYS_UIO_H_01}": "1",
        "${SNAPPY_MAJOR}": "1",
        "${SNAPPY_MINOR}": "1",
        "${SNAPPY_PATCHLEVEL}": "7",
    },
)
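The generated header then just gets listed alongside the checked-in sources in snappy’s cc_library.  A sketch of roughly what that looks like (the exact file list lives in the bazel_deps BUILD file):

cc_library(
    name = "snappy",
    srcs = [
        "snappy.cc",
        "snappy-sinksource.cc",
        "snappy-stubs-internal.cc",
    ],
    # glob() only sees checked-in files, so the generated header is
    # added explicitly by label.
    hdrs = glob(["*.h"]) + [":snappy-stubs-public.h"],
    includes = ["."],
)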

log4cpp / autoconf

Next up in the difficulty level is log4cpp, which uses autoconf as its native build system.  autoconf typically has a config.h.in file, written in a semi-standardized format consisting mostly of lines like:

/* Define to 1 if you have the <string.h> header file. */
#undef HAVE_STRING_H

The pre-processing stage turns each of those lines into either an actual #define, or a comment noting that the symbol remains undefined.

That is substantive enough that the simple built-in bazel template rule won’t cut it.  For this, I split the work into two pieces: 1) a python helper script that does the actual formatting, and 2) a starlark bazel rule which fills in some common autoconf defines by default.  For the bazel_deps repository, the bazel autoconf rule includes basically every autoconf define that is shared by more than one project, both to reduce duplication and to make it more likely that all the dependencies are configured consistently.
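The core transformation in the helper script is straightforward.  A minimal python sketch of the idea (not the actual bazel_deps script, which also handles the package, version, and prefix attributes shown below):

def expand_autoconf(template_text, defines):
    # Turn each '#undef FOO' into '#define FOO 1' if FOO is enabled,
    # or into a comment recording that it stays undefined.
    output = []
    for line in template_text.splitlines():
        if line.startswith("#undef "):
            name = line.split()[1]
            if name in defines:
                output.append("#define %s 1" % name)
            else:
                output.append("/* #undef %s */" % name)
        else:
            output.append(line)
    return "\n".join(output)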

With the two of them together, the resulting BUILD file defines a cc_library as normal, but has it depend on the new autoconf rule, as in the log4cpp instance:

autoconf_config(
    name = "include/log4cpp/config.h",
    src = "include/config.h.in",
    package = "log4cpp",
    version = "1.1.3",
    defines = autoconf_standard_defines + [
        "DISABLE_SMTP",
        "DISABLE_REMOTE_SYSLOG",
    ],
    prefix = "LOG4CPP_",
)

First day jumping!

I continue to make progress on the improved actuators for SMMB.  To briefly recap, these are based on a home-built brushless servo consisting of off the shelf gears, bearings, 3d printed assemblies, and a custom control board.

Moving on from closed loop vector (FOC) control, I’ve now built up a second motor, set both of them communicating over the same RS485 bus, and wired up a minimal makeshift jumping fixture.  The leg didn’t jump as well as I had expected: I was only able to achieve about 300ms of air time and there are a lot of other minor problems/deficiencies as well.  But on the other hand, I don’t appear to have permanently broken anything yet, so improvement will hopefully be mostly continuous!

Obligatory video:

First closed loop vector control

I’ve reached a minor milestone in developing improved actuators for Super Mega Microbot.  Previously I demonstrated basic closed loop control using a VESC.  Now I have a custom control board running closed loop vector-based current and position control on a single brushless servo!  I’ll hopefully write up pieces in more depth later, but this post can serve as a proof of existence.

First, boards as received from MacroFab:

Mounted onto the planetary gearbox:


And finally, a brief video of operation, with tview (from the mjmech gimbal) alongside.

Amazingly, I’ve needed no blue wires or rework of any kind so far.  Next up is to get it communicating over the RS485 link instead of serial, build a second gearbox, and get it jumping!

rules_mbed – bazel for mbed

When working on the firmware for Super Mega Microbot’s improved actuators, I decided to try using mbed-os from ARM for the STM32 libraries instead of the HAL libraries.  I always found that the HAL libraries had a terrible API, and yet they still required that any non-trivial usage of the hardware devolve into register twiddling to be effective.  mbed presents a pretty nice C++ API for the features they do support, which is only a little less capable than HAL, but still makes it trivial to drop down to register twiddling when necessary (and includes all of the HAL by reference).

Most people probably use mbed through their online IDE.  While this is a remarkable convenience, I am a big fan of reproducibility of builds and also of host based testing.  mbed provides mbed-cli which gets part of the way there by letting you build offline.  Unfortunately, it still doesn’t provide great support for projects that both deploy to a target and have automated unit tests.  It actually has zero support for unit tests that can run on the host.

Enter bazel

As many have guessed, I’m a big fan of bazel (https://bazel.build) for building software projects.  Despite being rough around the edges, it does a lot of things right.  Its philosophy is to be fast, correct, and reproducible.  While it doesn’t support flagless multi-platform builds yet (#6519), it does provide some minimal multi-platform support.  It also has robust capabilities for pulling in external projects, including entire toolchains and build environments.  Finally, it has excellent support for host based unit tests.

To make it work, you do have to configure a toolchain though.  That I’ve now done for at least STM32F4 based mbed targets.  It is published under an Apache 2.0 license at: https://github.com/mjbots/rules_mbed

rules_mbed features

  • Seamless toolchain provisioning: It configures bazel to automatically download a recent GNU gcc toolchain from ARM (2018q2).
  • Processor support: Currently all STM32F4 processors are supported, although others could be added with minimal effort.
  • Host based testing: Common modules that rely only on the standard library can have unit tests run on the host using any C/C++ testing tools such as boost::test.
  • Simultaneous multi-target configuration: Multiple targets (currently limited to the same CPU family) can be built with a single bazel command.  Multiple families can be configured within the same source tree.

Once you have it installed in a project, building all host based tests is as simple as:

tools/bazel test //...

and building all binary output files is as simple as:

tools/bazel build -c opt --cpu=stm32f4 //...
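The host based tests themselves are just ordinary cc_test targets that avoid any mbed dependencies, so bazel runs them with the host toolchain.  A minimal sketch, with illustrative file names:

cc_library(
    name = "ring_buffer",
    hdrs = ["ring_buffer.h"],
)

# Runs on the host with plain 'tools/bazel test //...'.
cc_test(
    name = "ring_buffer_test",
    srcs = ["test/ring_buffer_test.cc"],
    deps = [":ring_buffer"],
)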


First bazel-ified packages

In “Building mjmech dependencies with bazel”, I described my rationale for attempting to build all of the mjmech dependencies within bazel for cross compilation onto the raspberry pi.  mjmech has two big dependencies which were going to cause most of the transitive fallout:

  • gstreamer – We use gstreamer to interface with the webcam, to format RTSP streams for FPV on the control station, and to render the control station and heads up display.  Granted, not all of gstreamer is used, but we do depend on features that require ffmpeg and X11.
  • opencv – The use of opencv had been minimal to non-existent previously, as we hadn’t actually done any computer vision on the robot itself.  However, one of the big motivations for switching to the raspberry pi in the first place was to at least be able to do active target tracking onboard.

And then there are a few other direct dependencies that are “easy”, if nothing else because they have so few transitive dependencies.

  • boost – The use of boost is almost exclusively the header only parts, plus boost::date_time, boost::filesystem, boost::program_options, boost::test, and boost::python.  Of these, only boost::python has any transitive dependencies.
  • fmt – This text formatting library has no further dependencies whatsoever.
  • log4cpp – This is just used for writing textual debug output and has no transitive dependencies.
  • snappy – We use snappy to compress logged data, but it depends on nothing.

Simple packages

I started with the simple, no dependency packages from the second set.  The strategy here is to, for each package, create a tools/workspace/FOO/repository.bzl with a FOO_repository method to download the upstream tarball, and a corresponding tools/workspace/FOO/package.BUILD which contains the bazel BUILD file describing how to build that package.

The most straightforward package was “fmt”, from https://github.com/fmtlib/fmt.  Its repository.bzl looks like:

load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

def fmt_repository(name):
    http_archive(
        name = name,
        urls = [
            "https://github.com/fmtlib/fmt/archive/5.0.0.tar.gz",
        ],
        sha256 = "fc33d64d5aa2739ad2ca1b128628a7fc1b7dca1ad077314f09affc57d59cf88a",
        strip_prefix = "fmt-5.0.0",
        build_file = Label("//tools/workspace/fmt:package.BUILD"),
    )

It basically just contains the URL of the tarball to download, its hash, the prefix to strip, and where to locate the BUILD file.
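A dependent project’s top level WORKSPACE then just loads and invokes that method.  A minimal sketch:

load("//tools/workspace/fmt:repository.bzl", "fmt_repository")

fmt_repository(name = "fmt")

After that, any cc_library in the tree can list @fmt//:fmt in its deps.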

Next, the package.BUILD file is:

package(default_visibility = ["//visibility:public"])

cc_library(
    name = "fmt",
    hdrs = glob(["include/**"]),
    srcs = [
        "src/format.cc",
    ],
    includes = ["include"],
)

Since the fmt library is mostly header only, this has a single cc_library definition which compiles the one file and sets up the paths to correctly find the header files.

The other easy package, boost, was handled in a similar manner.

Next level

Moving up the difficulty ladder is snappy and log4cpp, both of which require at least minimally more complicated build rules.  That will be the topic for the next time.


Shapeways dimensional tolerances

The first version of the planetary gearbox as 3d printed from Shapeways required a fair amount of post-machining to get all the pieces to fit together.  I wanted to get to a point where I could just order some parts and have a reasonable expectation of them mostly working out of the box.  To make that happen, I’d need to get a better understanding of where the tolerances were coming from.

Understanding the problem

Shapeways provides a fair amount of documentation on the processes and the accuracy you can expect in general.  Most of this is detailed in “Design rules and detail resolution for SLS 3D printing”, but the results there have some limitations.  Primarily, they are only applicable to the specific geometries tested.  Shrinkage is qualified as ±0.15% of the largest dimension, and is likely influenced by the exact printed geometry.  Secondarily, in the documented tests, the designers had full control over the part alignment in the print.  The standard Shapeways platform does not let you orient parts; you are at the whim of their technicians as to where the Z axis will end up.

For the gearbox, I had numerous fit points that needed controlled tolerances.  The input and output bearings both needed a press fit on both sides.  The internal gear for the planetary gearing needed a press fit, and the front and back shells also have a lip which would be more rigid if the fit were snug.

Brute force

My solution?  Print slight variants of the relevant pieces of each fit point, with each radial dimension varied in increments of 0.1mm.

Shapeways Dimensional Tolerance Test

For each part, I broke out the calipers to measure the as printed size, and also attempted manual press fits of each part.  I didn’t manage to put any identifying features on each of the prints, which probably annoyed the Shapeways technicians and made my life a bit harder.  I just assumed that the sizes came back in increasing order despite the part number markings, which I’m pretty sure were incorrect.  This resulted in the following table:

Measured dimensional accuracy of gearbox parts

Conclusion

The second version of the gearbox had many other changes in addition to these, but this let me get a lot closer to the correct fit on the full assembly.

Building mjmech dependencies with bazel

Previously, I set up bazel to be able to cross compile for the raspberry pi using an extracted sysroot.  That sysroot was very minimal, basically just glibc and the kernel headers.  The software used for SMMB has many dependencies beyond that though, including some heavyweight ones such as gstreamer, and I needed some solution for building against them.

Options

There were two basic options:

  1. Install all the dependencies I cared about on an actual raspberry pi, and extract them into the sysroot.
  2. Build all the dependencies I cared about using bazel’s external projects mechanism.

The former would certainly be quicker in the short term, at the expense of needing to check in or otherwise version a very large sysroot.  It would also be annoying to update, as I would need to keep around a physical raspberry pi and continually reset it to zero in order to generate suitably pristine sysroots.

The second option had the benefit of requiring a small version control footprint — just the URLs and hashes of each project, along with suitable bazel build configuration.  It is also perfectly compatible with a fully hermetic build result.  However, it had the significant downside that I would need to write the bazel build configuration for all the transitive dependencies of the project.

I decided to take a stab at the second route, partly because of the benefits, but also to see just how hard it would be.

Structure

Since bazel does not yet have recursive WORKSPACE parsing, I went with a structure that is used in the Drake open source project.  The top level project has a tools/workspace directory that contains one sub-directory for each dependency or transitive dependency.  Within that directory is a default.bzl that contains one exported function, add_default_repositories.  It is intended to be called from the top level WORKSPACE file, and it creates bazel rules for all necessary external dependencies.
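A minimal sketch of what that looks like, with two illustrative dependencies following the FOO_repository convention described above:

# tools/workspace/default.bzl
load("//tools/workspace/fmt:repository.bzl", "fmt_repository")
load("//tools/workspace/snappy:repository.bzl", "snappy_repository")

def add_default_repositories():
    # One call per external dependency; each downloads a pinned
    # tarball and attaches a BUILD file to it.
    fmt_repository(name = "fmt")
    snappy_repository(name = "snappy")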

The drake project doesn’t support cross compilation, so most of their BUILD rules are of the form, “grab the pre-compiled flags from pkg-config”.  However, the same structure will work just fine even with non-trivial compilation steps.

mjbots/bazel_deps

So that this could be easily used across multiple dependent projects, I put all the resultant rules into a new github project: https://github.com/mjbots/bazel_deps.  Like the drake repository, it has a default.bzl with a single export, add_default_repositories, that is used by dependent projects.

Next up, working through actually packaging the dependencies!

Configuring bazel to cross compile for the Raspberry Pi 3

In the previous post, I described the motivation for switching the mjmech build system to bazel.  For that to be useful with Super Mega Microbot, I first had to get a toolchain configured for bazel such that it could produce binaries that would run on the raspberry pi.

All the work in this post is publicly available on github: https://github.com/mjbots/rpi_bazel

Compiler and sysroot

First, I needed to pick the compiler I would be using and figure out how to get the target system libraries available for cross compilation.  In the past I’ve always done the gcc/binutils/gnu-everything cross toolchain dance, but here I thought I would try something a bit more reproducible and see if I could make clang work.  Amazingly, a single clang binary supports all possible target types!  clang-6, which can be had through a PPA on Ubuntu 16.04 and natively on Ubuntu 18.04, supports it out of the box.

For the target system libraries, I wrote a small script, make_sysroot.py, which, when aimed at a live raspberry pi, will extract the Linux kernel headers and glibc headers and binaries.  These are stored in a single tar.xz file, with the latest version checked into the tree: 2018-06-10-sysroot.tar.xz.
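The script is conceptually simple: ssh to the pi and tar up the handful of directories the toolchain needs.  A python sketch of the idea (not the actual make_sysroot.py; the path list and address are illustrative):

import subprocess

# Directories the cross toolchain needs from the target.
PATHS = [
    "/usr/include",
    "/lib/arm-linux-gnueabihf",
    "/usr/lib/arm-linux-gnueabihf",
]

def make_sysroot(host, output):
    # 'tar -cJf -' on the pi writes an xz-compressed archive to
    # stdout, which we capture into the local output file.
    with open(output, "wb") as outfile:
        subprocess.check_call(
            ["ssh", host, "tar", "-cJf", "-"] + PATHS,
            stdout=outfile)

make_sysroot("pi@192.168.1.10", "sysroot.tar.xz")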

Bazel configuration

bazel has a legacy mechanism for configuring toolchains, called the CROSSTOOL file.  It is not exactly pretty and is moderately underdocumented, but at least there are a few write-ups online describing how to create one.  The CROSSTOOL I created here is minimally functional, with options for both host/clang and rpi/clang, with a few caveats:

  1. It isn’t hermetic; it relies on the system installation of clang
  2. It includes hard coded paths to micro releases of clang
  3. The C++ main function isn’t handled correctly yet, and you have to ‘extern “C”‘ it to link applications

With a functioning CROSSTOOL, the next step is to declare the “cc_toolchain_suite” and “cc_toolchain” definitions.  Those are defined in the tools/cc_toolchain/BUILD file and don’t have anything particularly complex in them.

Finally, I created a repository.bzl which is intended to be imported in client projects.  This provides a relatively simple API for a client project to import the toolchain, and gets the sysroot extracted properly.
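In a client project that looks something like the following.  This is a hypothetical sketch; the repository name, load path, and function name here are placeholders, and the README has the real ones:

# WORKSPACE -- hypothetical sketch; see the rpi_bazel README for the
# real names and a pinned commit.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "rpi_bazel",
    urls = ["https://github.com/mjbots/rpi_bazel/archive/<commit>.tar.gz"],
    sha256 = "<sha256 of that archive>",
)

load("@rpi_bazel//tools/workspace:repository.bzl",
     "rpi_bazel_repositories")  # hypothetical function name

rpi_bazel_repositories()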

Using it in a bazel project

The top level README gives the steps to use the toolchain, which while not too bad, still requires touching 4 non-empty files (and 2 possibly empty files).  Once that is done, you can use:

bazel test --config=pi //...

To build your entire project with binaries targeted to run on the raspberry pi!
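The --config=pi option is just an alias defined in the project’s .bazelrc.  A hypothetical sketch of the kind of entries behind it (the actual toolchain labels and cpu name come from rpi_bazel):

# .bazelrc -- hypothetical sketch of the config alias
build:pi --crosstool_top=@rpi_bazel//tools/cc_toolchain:toolchain
build:pi --cpu=armeabihf
build:pi --compiler=clang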

Building mjmech software on the rpi 3b+ with bazel

The first piece I tackled when switching to the Raspberry Pi 3B+ for Super Mega Microbot was building our existing control software.  The software we used for the 2016 Robogames is largely C++ and was built with SCons: https://github.com/mjbots/mjmech/.  For all our previous platforms for both Savage Solder and SMMB, we had just built the software on the device itself, which while a little slow, was certainly convenient and required very little sophistication from the build system.  Raspbian is debian based, so this shouldn’t be hard, right?

“apt” was my friend and at least got the compiler and all the build requirements installed.  Then I went to build and discovered that some of our C++ translation units required more than 1G of RAM individually to compile.  Ouch.  That rendered building on the target with the current software infeasible.  Either I could rewrite the software to be less compilation time RAM intensive, or switch to a cross compilation model.


Building on the platform was always somewhat slow, so I decided to try and take the leap and get a proper cross compilation environment set up.  I’ve done that in SCons before, but it was never pretty, as SCons doesn’t have the best mechanisms for expressing things that need to be built on the host vs the target, and describing any given cross compilation tool chain is always challenging.  Plus SCons itself is slow for any moderately sized project.  Instead, I opted to give bazel (https://bazel.build) a try.  I’ve used it professionally for a while now, and while it is a mixed bag, I am an eternal optimist when it comes to its feature set.  Right now, versus SCons, it can support:

  1. Specifying multiple C++ toolchains
  2. Switching between a single host toolchain and a single target toolchain
  3. Gracefully pulling in external projects controlled by hash
  4. It runs locally very fast, can parallelize across large numbers of cores, and finally has a functioning content addressable cache

The two big features which have seemingly been very close for a long time now, but aren’t quite there yet are:

  1. Remote execution
  2. Arbitrary C++ toolchain configuration within a single build (i.e. build binaries for multiple target platforms as part of a single build)

Next up, I’ll describe how I configured bazel with the cross toolchain necessary to build binaries on the raspberry pi 3 (b+).


Improved actuators for SMMB

One of the major challenges SMMB had in Robogames 2016 was in overall walking speed.  It is using HerkuleX DRS-201 servos, which are roughly comparable to the Dynamixel servos that other entrants were using, but the physical geometry of the robot is such that it is hard to get it to move quickly with that class of servos.  The center of gravity is too high, especially with the gimbal mounted turret.  The R-Team bots all use very low slung machines that scoot along.  I could go that route, but why do things the easy way?

Instead, I’ve been working to take some ideas from fellow Boston-ite Ben Katz and build some actuators that would permit truly dynamic motion.  He got a leg jumping using Hobbyking brushless motors with some simple FOC control.  The biggest differentiators vs the Dynamixel / HerkuleX class of actuators would be low mechanical inertia, high transient power, low backlash, and high speed.  I’m just getting started here, but have managed to build up a 5x planetary gearbox driven by a Turnigy Elite 3508 (so a fair amount smaller than what Ben did, but more appropriately sized for Mech Warfare), and a VESC 6 as an interim motor controller.  The VESC is designed for electric skateboards and has only minimal position control support; as my bruised hand can attest, that mode isn’t super stable and flips out occasionally.

The first prototype is assembled and has been spun up, although a fair amount of dremel time and shims were required to get everything to fit together.

Photos: some SDP-SI gears for the prototype; the 3508 with its stock shaft extracted; the new shaft installed, with spur gear alongside; the AS5047 wired up to the VESC; the Shapeways order arrived; the housing mounted onto the motor; the planet carrier assembled; the planet carrier inside the housing; and the final assembled gearbox.

And finally, the pretty videos: