tinyML Talks: Neural Architecture Search for Tiny Devices

It is widely anticipated that inference models based on Deep Neural Networks (DNNs) will be actively deployed on many edge platforms. This has spurred research into the automated learning of tiny neural architectures through Neural Architecture Search (NAS). Since NAS was proposed in 2016, research has focused on fast search for DNN architectures that surpass the performance of human-designed ones. Beyond this primary goal of improving the search process itself, NAS is widely used to generate and customize DNN models for a given target hardware. This has become especially important for embedded DNNs, which must meet platform-specific constraints and objectives such as low latency, low memory footprint, and low power consumption. NAS can provide customized models that are both efficient and accurate for the target architecture. However, existing frameworks offer either (a) fast generation of accurate models or (b) slow generation of models that are both accurate and efficient, but not both. To address this, the current tutorial explains the basic NAS process and the mathematical model behind the search, enabling TinyML engineers to tweak existing NAS frameworks in an informed manner.
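To make the idea of hardware-constrained NAS concrete, here is a minimal, purely illustrative sketch: a toy search space over network depth and width, with hypothetical proxy functions for accuracy, memory, and latency standing in for the real model training and on-device measurement a genuine NAS framework would perform. None of the numbers or function names come from the talk.

```python
# Illustrative sketch of hardware-aware NAS as constrained search.
# The proxy functions below are hypothetical stand-ins: a real NAS
# framework would train/evaluate candidate networks on the target device.
import itertools

# Hypothetical search space: network depth and per-layer channel width.
DEPTHS = [2, 4, 6, 8]
WIDTHS = [16, 32, 64]

def proxy_accuracy(depth, width):
    """Stand-in for a trained-model accuracy estimate (higher is better)."""
    return 1.0 - 1.0 / (depth * width) ** 0.5

def memory_kb(depth, width):
    """Rough parameter-memory proxy: depth layers of width x width weights."""
    return depth * width * width * 4 / 1024  # float32 bytes -> KB

def latency_ms(depth, width):
    """Rough latency proxy, proportional to MAC count."""
    return depth * width * width / 50_000

def search(mem_budget_kb, lat_budget_ms):
    """Exhaustively pick the best proxy accuracy under platform budgets."""
    best = None
    for d, w in itertools.product(DEPTHS, WIDTHS):
        if memory_kb(d, w) > mem_budget_kb or latency_ms(d, w) > lat_budget_ms:
            continue  # candidate violates the embedded-platform constraints
        cand = (proxy_accuracy(d, w), d, w)
        if best is None or cand > best:
            best = cand
    return best

acc, d, w = search(mem_budget_kb=64, lat_budget_ms=0.5)
print(f"depth={d}, width={w}, proxy_acc={acc:.3f}")
```

Real frameworks replace the exhaustive loop with reinforcement learning, evolutionary search, or differentiable relaxations, and replace the proxies with trained supernets or on-device profiling, but the constrained-optimization structure is the same.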


April 10, 2023






Timezone: PDT

Neural Architecture Search for Tiny Devices

Swarnava DEY, Senior Scientist

TCS Research
Swarnava Dey is a Senior Scientist at TCS Research working on embedded vision systems. He holds an M.Tech from IIT Kharagpur and is currently pursuing a PhD there on the robustness, verifiability, and explainability of Embedded Deep Neural Networks and Neuro-Symbolic AI. He has 30+ granted patents and 25+ research papers, and writes for Towards Data Science: https://medium.com/@qswadey. His publication details can be found on his Google Scholar page: https://scholar.google.co.in/citations?hl=en&user=aFplwjEAAAAJ

Schedule subject to change without notice.