Efficient Methods and Hardware for Deep Learning

Date(s):

Location:
Jacobs School of Engineering, 9500 Gilman Dr, La Jolla, San Diego, California 92093

Sponsored By:
Stefanie Battaglia

Speaker(s):
Song Han

Speaker Bio:
Song Han is a Ph.D. candidate at Stanford University, advised by Prof. Bill Dally. His research focuses on energy-efficient deep learning, at the intersection of machine learning and computer architecture. He proposed the Deep Compression algorithm, which compresses neural networks by 10–49× without loss of prediction accuracy. He designed the first hardware accelerator that performs inference directly on a compressed sparse model, yielding significant speedups and energy savings. His work has been featured by O’Reilly, TechEmergence, TheNextPlatform, and Embedded Vision, and has influenced industry practice. He led research efforts in model compression and hardware acceleration that won the Best Paper Award at ICLR’16 and the Best Paper Award at FPGA’17. Before joining Stanford, Song graduated from Tsinghua University.

Contact:
Stefanie Battaglia
Executive Assistant to the Department Chair
sbattaglia@ucsd.edu | Ph: (858) 534-7013