tinyML Talks: From the lab to the edge: Post-Training Compression

Deep neural networks (DNNs) are nowadays ubiquitous in many domains such as computer vision. However, going from TensorFlow or PyTorch to efficient DNN deployments on the edge remains one of the industry's biggest challenges. During this presentation, we will see how Datakalab solves this problem in two steps, without intensive computation or re-training in the cloud. First, we remain agnostic of the training framework by supporting the inference of any DNN on a wide range of hardware. Second, we designed custom, state-of-the-art compression techniques that rely on post-training quantization, pruning, and context adaptation. The resulting inference models achieve remarkable speeds on microchips while keeping the accuracy loss below 1%.
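To give a concrete flavor of what post-training quantization means, here is a minimal NumPy sketch of generic symmetric uniform weight quantization to int8. This is an illustration of the general technique only, not Datakalab's method; the function names and the per-tensor scaling choice are assumptions for the example.

```python
import numpy as np

def quantize_weights(w: np.ndarray, num_bits: int = 8):
    """Symmetric uniform post-training quantization of a weight tensor.

    Illustrative only: uses a single per-tensor scale, the simplest
    of many possible calibration schemes.
    """
    qmax = 2 ** (num_bits - 1) - 1                 # e.g. 127 for int8
    scale = np.abs(w).max() / qmax                 # per-tensor scale factor
    q = np.clip(np.round(w / scale), -qmax, qmax).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover a floating-point approximation of the original weights."""
    return q.astype(np.float32) * scale

# Example: quantize a random weight matrix and measure the reconstruction error
w = np.random.randn(64, 64).astype(np.float32)
q, scale = quantize_weights(w)
w_hat = dequantize(q, scale)
max_err = np.abs(w - w_hat).max()                  # bounded by scale / 2
```

No re-training is involved: the quantized weights are derived directly from the trained model, which is what makes post-training compression cheap enough to run without the cloud.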


February 27, 2023






Timezone: PST

From the lab to the edge: Post-Training Compression

Edouard YVINEC, PhD Student




Edouard Yvinec is a PhD student at Datakalab and Sorbonne Université. Neural network compression is his main research interest, which he applies to solving computer vision and NLP tasks. He has published several works on post-training compression: at NeurIPS [1,2] on pruning, at ICLR [3] and WACV [4] on quantization, and at IJCAI [5] on layer folding. Each of these methods focuses on computer vision tasks solved by convolutional networks, with the exception of PowerQuant [3], which also tackles transformer compression for both vision and NLP.

[1] Yvinec Edouard, et al. "Red: Looking for redundancies for data-free structured compression of deep neural networks." NeurIPS. 2021.
[2] Yvinec Edouard, et al. "SInGE: Sparsity via Integrated Gradients Estimation of Neuron Relevance." NeurIPS. 2022.
[3] Yvinec Edouard, et al. "PowerQuant: Automorphism Search for Non-Uniform Quantization." ICLR. 2023.
[4] Yvinec Edouard, et al. "SPIQ: Data-Free Per-Channel Static Input Quantization." WACV. 2023.
[5] Yvinec Edouard, et al. "To Fold or Not to Fold: a Necessary and Sufficient Condition on Batch-Normalization Layers Folding." IJCAI.

Schedule subject to change without notice.