tinyML Talks: Binarized Neural Networks on microcontrollers

Date

January 19, 2021

Location

Virtual

Schedule

Timezone: PST

Binarized Neural Networks on microcontrollers

Lukas GEIGER, Deep learning researcher

Plumerai

Today’s deep learning methods limit the use of microcontrollers to only very basic machine learning tasks. In this talk, Lukas explains how real-time deep learning for complex tasks is achieved on microcontrollers with the help of Binarized Neural Networks (BNNs), in which weights and activations are encoded using only 1 bit instead of 32 or 8 bits.
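To illustrate the idea (this sketch is not code from the talk), sign-based binarization maps full-precision values to {-1, +1}, so each weight or activation can be packed into a single bit:

```python
import numpy as np

# Illustrative sketch only: binarize full-precision values to {-1, +1}
# with the sign function, so each weight or activation needs just 1 bit.
def binarize(x):
    return np.where(x >= 0, 1.0, -1.0)

weights = np.array([0.42, -0.17, 0.03, -1.2])
print(binarize(weights))   # [ 1. -1.  1. -1.]

# Packed storage: 8 binary values fit in a single byte, versus 32 bytes
# for the same values in float32 -- this is where the memory savings come from.
bits = (binarize(weights) > 0).astype(np.uint8)
print(np.packbits(bits))   # [160] for the bit pattern 1010 0000
```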

BNNs allow for much lower memory requirements and extremely efficient execution, but they require new training algorithms and custom inference software. Our integrated approach tackles these issues. Built on top of Keras and TFLite, our open-source libraries (https://larq.dev) make it possible to build, train and benchmark BNNs on ARMv8-A architectures, and we show how this work exposes the inconsistencies between published research and real-world results.
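For context, a minimal binarized model built with the open-source Larq API might look like the following sketch (a hand-written example based on the public documentation at https://larq.dev, not material from the talk; layer choices and sizes are illustrative):

```python
import tensorflow as tf
import larq as lq

# Quantization settings shared by the binarized layers:
kwargs = dict(
    input_quantizer="ste_sign",       # binarize incoming activations
    kernel_quantizer="ste_sign",      # binarize weights in the forward pass
    kernel_constraint="weight_clip",  # keep latent weights in [-1, 1]
    use_bias=False,
)

model = tf.keras.Sequential([
    # First layer keeps real-valued image inputs; only its weights are binarized.
    lq.layers.QuantConv2D(32, (3, 3), kernel_quantizer="ste_sign",
                          kernel_constraint="weight_clip", use_bias=False,
                          input_shape=(96, 96, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.BatchNormalization(scale=False),

    lq.layers.QuantConv2D(64, (3, 3), **kwargs),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.BatchNormalization(scale=False),

    tf.keras.layers.Flatten(),
    lq.layers.QuantDense(10, **kwargs),
    tf.keras.layers.BatchNormalization(scale=False),
    tf.keras.layers.Activation("softmax"),
])

# Reports binarized vs. full-precision parameters and estimated memory footprint.
lq.models.summary(model)
```

A trained model of this kind can then be converted to a TensorFlow Lite flatbuffer with optimized binary kernels for Arm targets via the companion Larq Compute Engine library, as described on larq.dev.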

Finally, we demonstrate the world’s first BNN running live on an ARM Cortex-M4 microcontroller using Plumerai’s software stack to bring unmatched efficiency to TinyML.

Lukas GEIGER, Deep learning researcher

Plumerai

Lukas Geiger is a deep learning researcher at Plumerai working on new training methods and architectures for improving the accuracy and efficiency of Binarized Neural Networks (BNNs). He is the author of the open-source Larq training library and a core developer of the Plumerai software stack for deploying BNNs on embedded platforms.

Schedule subject to change without notice.