Building Bazel and TensorFlow 2.x on AArch64
This blog walks through the steps needed to build Bazel and TensorFlow on AArch64.

Until recently, building TensorFlow on AArch64 was not possible at all because of its dependency on Bazel: a bootstrap build of Bazel from its architecture-independent distribution archive failed. GitHub contributor powderluv has released several Bazel binaries that allow Google's instructions to work with a few caveats. Note: there are now a number of TensorFlow Docker containers for Arm, but none of them are TensorFlow 2 or greater at the time of this writing.
Steps to build Bazel:
- Follow the Google build instructions regarding Python and pip. When you get to the section on installing Bazel, follow the instructions for Ubuntu -> "Compile Bazel from source" -> "using an existing Bazel binary":
https://docs.bazel.build/versions/master/install-compile-source.html#build-bazel-using-bazel
- In my particular case I am building TensorFlow 2.2 on Ubuntu 18.04.1 on an AWS EC2 A1 instance. Please note, the single biggest issue I have discovered had to do with RAM limits causing the compiler to crash on some TensorFlow kernel files. The Bazel "--local_ram_resources=" option does not prevent these per-file gcc/clang crashes. Instead, either use an AArch64 system with more than 4GB of RAM or set up an 8GB swap file (see the sketch after this list).
- TensorFlow's configure.py specifies this: different versions of TensorFlow require different versions of Bazel to build. To find out which version of Bazel you need, look in your TensorFlow source directory or on GitHub for the configure.py file and search for _TF_MIN_BAZEL_VERSION (a grep one-liner is shown after this list). In the case of TensorFlow 2.2, we see:
_TF_MIN_BAZEL_VERSION = '2.0.0'
_TF_MAX_BAZEL_VERSION = '2.0.0'
- Download the appropriate source archive of Bazel from GitHub and unpack it. In this case:
https://github.com/bazelbuild/bazel/releases/tag/2.0.0
- Grab one of the aarch64 Bazel binaries from https://github.com/powderluv/bazel-bin; bazel-1.2.1-aarch64-glibc-2.27 is the most appropriate at the time of this writing.
And, of course, make sure it is in your path as bazel somewhere, e.g.:
ln -s bazel-1.2.1-aarch64-glibc-2.27 /usr/local/bin/bazel
- Change into your new Bazel source directory (2.0 in my case) and build it:
bazel build //src:bazel
(or, for a dev build with more debug output: bazel build //src:bazel-dev)
The binary should build in ./bazel-bin/src/bazel (or ./bazel-bin/src/bazel-dev). Simply replace the older Bazel binary in your path with this new one and proceed to build TensorFlow from source, e.g.:
sudo ln -sf /home/<user>/bazel-2.0.0/bazel-bin/src/bazel /usr/local/bin/bazel
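As mentioned in the RAM note above, if your AArch64 system has 4GB of RAM or less, an 8GB swap file helps keep gcc/clang from crashing partway through the kernel builds. A minimal sketch for Ubuntu, assuming /swapfile as the location (any path on a filesystem with 8GB free works):

sudo fallocate -l 8G /swapfile   # reserve 8GB of disk space (use dd if fallocate is unavailable)
sudo chmod 600 /swapfile         # restrict permissions as swapon expects
sudo mkswap /swapfile            # format the file as swap space
sudo swapon /swapfile            # enable it for the current session

Add a /swapfile entry to /etc/fstab if you want the swap space to persist across reboots.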
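For the Bazel version check mentioned above, a quick grep in the TensorFlow source directory shows both the minimum and maximum supported versions:

grep _BAZEL_VERSION configure.py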
Steps to build TensorFlow (2.2rc3 or newer):
- Assuming you have already installed Python and pip dependencies according to the Google build instructions referenced above:
- Clone the repo:
git clone https://github.com/tensorflow/tensorflow.git
or, if you have already cloned the repo:
cd tensorflow && git pull
- git checkout v2.2
- Run ./configure:
Set python to /usr/bin/python for python 2.7 and use the defaults for the rest unless you have any special requirements. If you don't, this line should work expeditiously:
yes "" | ./configure; python -c "import numpy as np"
- A variety of other dependencies and known issues are satisfied by installing future and grpc with pip:
pip install --upgrade setuptools && pip install future && pip install futures && pip install grpc
or, if you're using python3:
pip3 install --upgrade setuptools && pip3 install future
- Build TensorFlow with Bazel with a few basic options. On AWS, --config noaws is required to avoid some non-Arm header includes. Using --local_ram_resources=1600 won't limit gcc or clang while they build individual kernels, and the compiler can still crash unrecoverably on machines with less than 4GB of RAM. The number of Bazel threads can be tuned to your number of cores, e.g. --local_cpu_resources=1. This line should work on a majority of AArch64 systems with enough RAM:
date; bazel build //tensorflow/tools/pip_package:build_pip_package --config noaws --config=monolithic --local_cpu_resources=4; date
Once the build finishes, a sketch for producing and installing the pip wheel follows this list.
- Make a cup of tea
- Run any relevant unit tests
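Once the Bazel build above has completed, the generated build_pip_package script packages everything into an installable wheel. A minimal sketch, assuming /tmp/tensorflow_pkg as the output directory and a Python 3 environment (swap in pip/python if you configured the build for python 2.7 as in the configure example above):

./bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg   # writes a tensorflow-*.whl into /tmp/tensorflow_pkg
pip3 install /tmp/tensorflow_pkg/tensorflow-*.whl
python3 -c "import tensorflow as tf; print(tf.__version__)"   # quick smoke test of the freshly built package

Relevant unit tests can then be run with bazel test against whichever targets matter for your workload.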
If you have any interesting training scenarios, please contact me.
Contact Matt
We might be able to allocate compute resources and share our story at Arm DevSummit.
Re-use is permitted for informational, non-commercial, or personal use only.
