MICL Seminar

The Benefits of Mixed-Signal Computing for Neural Network Inference

Laura Fick, Founding Analog Circuit Designer, Mythic

Neural networks are making major impacts across edge computing, including computer vision, speech recognition, motor control, and more. Although they span many different applications, these inference workloads share a common trait: they are dominated by matrix multiplications with static coefficients. Mythic has developed a technology that exploits this commonality by using embedded non-volatile memory to store the static matrix coefficients and mixed-signal computing to multiply them efficiently. In this presentation, we discuss the properties of neural network inference and how Mythic's technology achieves 20-100x improvements in energy efficiency and performance.
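The claim that inference is dominated by matrix multiplies with static coefficients can be sketched as follows. This is a minimal illustration, not Mythic's implementation: the two-layer network, layer sizes, and weight values are illustrative assumptions; the point is that the weights are fixed at inference time, which is what lets an in-memory compute architecture store them once in non-volatile memory.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes for illustration only.
W1 = rng.standard_normal((256, 784))   # static coefficients, layer 1
W2 = rng.standard_normal((10, 256))    # static coefficients, layer 2

def infer(x):
    """One inference pass: two matrix-vector multiplies plus a ReLU.

    The weights W1 and W2 never change between calls -- the
    multiply-accumulate work against these static matrices is
    where nearly all the compute (and energy) goes.
    """
    h = np.maximum(W1 @ x, 0.0)        # matrix multiply dominates cost
    return W2 @ h

x = rng.standard_normal(784)           # one input sample
y = infer(x)
print(y.shape)
```

Because the weights are read-only, each inference pass streams only the activations; an architecture that performs the multiply where the weights are stored avoids moving the (much larger) weight matrices at all.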

Laura Fick is a founding analog circuit designer at Mythic, a US-based startup that is creating the next generation of AI inference microchips. She received her B.S. in Electrical Engineering from the University of Maryland, College Park, and her M.S. and Ph.D. from the University of Michigan, Ann Arbor, as an NSF Graduate Research Fellow. Her thesis research focused on low-power, high-performance analog compute technology for neural network accelerators. During her graduate research she worked with Mythic to create their in-memory analog compute matrix-multiply accelerator, which has since received $55M in venture funding from top-tier investors.

Sponsored by

EECS Alumni/MICL

Faculty Host

Dennis Sylvester