tinyML Talks: SRAM based In-Memory Computing for Energy-Efficient AI Inference

Date

May 13, 2021

Location

Virtual

Schedule

Timezone: PDT

SRAM based In-Memory Computing for Energy-Efficient AI Inference

Jae-sun SEO, Associate Professor

Arizona State University

Artificial intelligence (AI) and deep learning have been successful across many practical applications, but state-of-the-art algorithms demand enormous amounts of computation, memory, and on-/off-chip communication. To bring these computationally expensive algorithms to low-power processors, a number of digital CMOS ASIC solutions have been proposed, but limitations remain in memory access and footprint.

To improve upon the conventional row-by-row operation of memories, “in-memory computing” designs have been proposed, which perform analog computation inside memory arrays by asserting multiple or all rows simultaneously. This talk will present recent silicon demonstrations of SRAM-based in-memory computing for AI systems. New memory bitcell circuits, peripheral circuits, architectures, and a modeling framework for design parameter optimization will be discussed.
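
To make the row-parallel idea concrete, below is a minimal behavioral sketch (in Python/NumPy) of a single bitline in such an array: weights live in the bitcells, inputs assert the wordlines, all asserted rows discharge the bitline at once, and a low-resolution ADC digitizes the summed analog quantity. This is an illustrative model for intuition only, not the circuits demonstrated in the talk; the function name, noise magnitude, and ADC resolution are assumptions.

```python
import numpy as np

def imc_dot_product(weights, inputs, adc_bits=4, sigma=0.02, rng=None):
    """Behavioral model of one SRAM in-memory-computing bitline.

    weights : (rows,) array of +1/-1 values stored in the bitcells
    inputs  : (rows,) array of 0/1 wordline activations
    Asserting all rows at once yields the dot product as a single
    analog quantity, instead of `rows` separate row-by-row reads.
    """
    if rng is None:
        rng = np.random.default_rng()
    # Ideal analog sum: each asserted row adds its bitcell contribution.
    analog = float(np.dot(weights, inputs))
    # Bitcell/bitline variation perturbs the accumulated value; the
    # error is assumed to grow with the number of rows asserted.
    analog += rng.normal(0.0, sigma) * np.count_nonzero(inputs)
    # Low-resolution ADC: map the bitline swing [-rows, +rows]
    # onto 2**adc_bits discrete output codes.
    levels = 2 ** adc_bits
    full_scale = len(weights)
    code = np.clip(
        np.round((analog + full_scale) / (2 * full_scale) * (levels - 1)),
        0, levels - 1,
    )
    # Convert the ADC code back to a signed dot-product estimate.
    return code / (levels - 1) * (2 * full_scale) - full_scale

# Example: one 256-row column with binary weights and activations.
rng = np.random.default_rng(0)
w = rng.choice([-1, 1], size=256)
x = rng.integers(0, 2, size=256)
print("ideal:", np.dot(w, x), "in-memory estimate:", imc_dot_product(w, x, rng=rng))
```

The gap between the ideal and estimated values illustrates the trade-off at the heart of this design style: asserting all rows amortizes memory access over many multiply-accumulates, at the cost of ADC quantization and analog variation, which is what a design-parameter modeling framework like the one mentioned above helps navigate.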

Jae-sun SEO, Associate Professor

Arizona State University

Jae-sun Seo is an Associate Professor in the School of Electrical, Computer and Energy Engineering (ECEE) at Arizona State University. His research interests include efficient hardware design of machine learning and neuromorphic algorithms and integrated power management. He is a recipient of the IBM Outstanding Technical Achievement Award (2012), the NSF CAREER Award (2017), and the Intel Outstanding Researcher Award (2021).

Schedule subject to change without notice.

Sponsors
