Welcome to the CBin-NN Inference Engine

Inference Engine for Binarized Neural Networks on Resource-Constrained Devices


Binarization is an extreme quantization technique that is attracting growing research interest in the Internet of Things (IoT) field, as it radically reduces the memory footprint of deep neural networks without a comparably large drop in accuracy. To support the effective deployment of binarized neural networks (BNNs), we propose CBin-NN, a library of layer operators for building simple yet flexible convolutional neural networks (CNNs) with binary weights and activations. CBin-NN is platform-independent and thus portable to virtually any software-programmable device.
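
To illustrate where the memory savings come from, the sketch below packs {-1, +1} weights one per bit into 32-bit words, i.e. the kind of "32-bit packed" representation referred to in the operator table further down, giving a 32x reduction over 32-bit floats. The function name, sign convention, and bit order are illustrative assumptions, not CBin-NN's actual packing routine.

```c
/* Illustrative sketch (not the CBin-NN API): pack n binary weights
 * (+1 -> bit set, -1 -> bit clear) into 32-bit words.
 * 'packed' must hold at least (n + 31) / 32 words; trailing bits of the
 * last word are left as 0. */
#include <stdint.h>
#include <stddef.h>

static void pack_weights(const float *w, size_t n, uint32_t *packed)
{
    for (size_t i = 0; i < n; i++) {
        if (i % 32 == 0)
            packed[i / 32] = 0;                 /* start a fresh word */
        if (w[i] >= 0.0f)
            packed[i / 32] |= 1u << (i % 32);   /* one weight per bit */
    }
}
```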

Why TinyML?

CBin-NN

CBin-NN provides the following layer operators; the table lists the data format each operator expects for its input, weights, and output.

| Operator | Input | Weight | Output |
|---|---|---|---|
| QBConv2D | 8-bit quantized | 32-bit packed | 32-bit packed |
| QBConv2D_Optimized | 8-bit quantized | 32-bit packed | 32-bit packed |
| QBConv2D_Optimized_PReLU | 8-bit quantized | 32-bit packed | 32-bit packed |
| QQConv2D | 8-bit quantized | 8-bit quantized | 32-bit packed |
| QQConv2D_Optimized | 8-bit quantized | 8-bit quantized | 32-bit packed |
| QQConv2D_Optimized_PReLU | 8-bit quantized | 8-bit quantized | 32-bit packed |
| BBConv2D | 32-bit packed | 32-bit packed | 32-bit packed |
| BBConv2D_Optimized | 32-bit packed | 32-bit packed | 32-bit packed |
| BBConv2D_Optimized_PReLU | 32-bit packed | 32-bit packed | 32-bit packed |
| BBPointwiseConv2D | 32-bit packed | 32-bit packed | 32-bit packed |
| BBPointwiseConv2D_Optimized | 32-bit packed | 32-bit packed | 32-bit packed |
| BBPointwiseConv2D_Optimized_PReLU | 32-bit packed | 32-bit packed | 32-bit packed |
| BMaxPool2D | 32-bit packed | - | 32-bit packed |
| BMaxPool2D_Optimized | 32-bit packed | - | 32-bit packed |
| BBFC | 32-bit packed | 32-bit packed | 32-bit packed |
| BBFC_Optimized | 32-bit packed | 32-bit packed | 32-bit packed |
| BBFC_Optimized_PReLU | 32-bit packed | 32-bit packed | 32-bit packed |
| BBQFC | 32-bit packed | 32-bit packed | 8/16/32-bit quantized |
| BBQFC_Optimized | 32-bit packed | 32-bit packed | 8/16/32-bit quantized |
| BBQFC_Optimized_PReLU | 32-bit packed | 32-bit packed | 8/16/32-bit quantized |
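
The binary-input, binary-weight operators (BBConv2D, BBFC, and their variants) consume these 32-bit packed tensors directly. The standard way to evaluate such layers is to replace multiply-accumulate with XNOR and population count; the sketch below shows the idea for a single dot product between two packed vectors. It is a generic illustration (the function name is hypothetical, it uses the GCC/Clang popcount builtin, and it assumes every bit of every word is a valid element with no padding), not code from the CBin-NN sources.

```c
/* Illustrative sketch: dot product of two {-1, +1} vectors stored one value
 * per bit in 32-bit words. Bits that agree contribute +1 and bits that
 * differ contribute -1, so each word adds 2 * popcount(~(a ^ b)) - 32. */
#include <stdint.h>
#include <stddef.h>

static int32_t binary_dot(const uint32_t *a, const uint32_t *b, size_t n_words)
{
    int32_t acc = 0;
    for (size_t i = 0; i < n_words; i++) {
        uint32_t agree = ~(a[i] ^ b[i]);               /* XNOR */
        acc += 2 * (int32_t)__builtin_popcount(agree) - 32;
    }
    return acc;
}
```

A full binary convolution or fully connected layer applies this inner loop across kernel positions and output channels; optimized variants typically differ in how they order and tile these loops and in how they handle any padding bits in the last word.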

Support or Contact

For more information about the library, see our IEEE EDGE 2022 paper or contact us for assistance.