SLAC Colloquium - "Some Building Blocks for Foundation Model Systems" by Chris Re, Stanford University
Chris Re, Stanford University, Department of Computer Science
I fell in love with foundation models because they radically improved data systems that I had been trying to build for a decade. Motivated by this experience, the bulk of the talk focuses on efficient building blocks for foundation models. The first line of work describes fundamental trends in hardware accelerators for AI that we can leverage, e.g., optimizing memory accesses in FlashAttention on GPUs. The second line of work describes new architectures that are asymptotically more efficient than transformers for long sequences. One family of these architectures is inspired by signal processing and by classical architectures, such as RNNs and CNNs, in a mathematically precise way. These new architectures have achieved state-of-the-art quality on long-sequence tasks, are promising general-purpose architectures, and are being applied to new areas. Of course, as researchers, we want to understand their limits and how to improve them, which the talk will also address. Two underlying themes run through the talk: understanding the role of inductive bias in AI models and understanding how robust our recipe for amazing AI is.
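To make the memory-access point concrete, here is a minimal NumPy sketch of the tiling idea behind FlashAttention: attention is computed tile by tile with a running (online) softmax, so the full N × N score matrix is never materialized at once. The block size, shapes, and function name are illustrative assumptions, not the actual GPU kernel.

```python
# Sketch of blocked attention with an online softmax (the core idea
# behind FlashAttention's reduced memory traffic). Illustrative only.
import numpy as np

def blocked_attention(Q, K, V, block=64):
    N, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(V)
    for i in range(0, N, block):
        q = Q[i:i + block] * scale              # one tile of queries
        m = np.full(q.shape[0], -np.inf)        # running row-wise max
        l = np.zeros(q.shape[0])                # running softmax denominator
        acc = np.zeros((q.shape[0], d))         # running weighted sum of V
        for j in range(0, N, block):
            s = q @ K[j:j + block].T            # scores against one K/V tile
            m_new = np.maximum(m, s.max(axis=1))
            p = np.exp(s - m_new[:, None])      # tile's unnormalized weights
            corr = np.exp(m - m_new)            # rescale earlier accumulators
            l = l * corr + p.sum(axis=1)
            acc = acc * corr[:, None] + p @ V[j:j + block]
            m = m_new
        out[i:i + block] = acc / l[:, None]
    return out

# Sanity check against the naive computation that materializes all scores.
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((256, 32)) for _ in range(3))
S = (Q @ K.T) / np.sqrt(32)
P = np.exp(S - S.max(axis=1, keepdims=True))
ref = (P / P.sum(axis=1, keepdims=True)) @ V
assert np.allclose(blocked_attention(Q, K, V), ref, atol=1e-6)
```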
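The "mathematically precise" RNN/CNN connection can likewise be sketched for a linear state-space layer, the kind of architecture (e.g., S4) this line of work builds on: the recurrence x_t = A x_{t-1} + B u_t, y_t = C x_t can equivalently be computed as a causal convolution with kernel k_t = C A^t B. The matrices below are random placeholders, not a trained or structured parameterization.

```python
# Sketch of the state-space duality: the same linear layer run as an
# RNN (step-by-step recurrence) and as a CNN (long convolution).
import numpy as np

def ssm_recurrent(A, B, C, u):
    """Run the state-space model sequentially, like an RNN."""
    x = np.zeros(A.shape[0])
    ys = []
    for u_t in u:
        x = A @ x + B * u_t            # state update
        ys.append(C @ x)               # readout
    return np.array(ys)

def ssm_convolutional(A, B, C, u):
    """Same model as a causal convolution with unrolled kernel k_t = C A^t B."""
    L = len(u)
    k = np.array([C @ np.linalg.matrix_power(A, t) @ B for t in range(L)])
    # y_t = sum_{s <= t} k_{t-s} * u_s
    return np.array([sum(k[t - s] * u[s] for s in range(t + 1))
                     for t in range(L)])

rng = np.random.default_rng(0)
n, L = 4, 32
A = rng.standard_normal((n, n)) * 0.3  # scaled down so the recurrence is stable
B, C = rng.standard_normal(n), rng.standard_normal(n)
u = rng.standard_normal(L)
assert np.allclose(ssm_recurrent(A, B, C, u), ssm_convolutional(A, B, C, u))
```

The convolutional view is what makes these layers asymptotically efficient in practice: the kernel can be applied with FFT-based convolution in O(L log L) rather than stepping through the sequence.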
About Chris Ré
![Chris Ré](/sites/default/files/styles/medium/public/2023-10/ChrisRe_Headshot.jpg?itok=kKyIyyta)
Christopher (Chris) Ré is an associate professor in the Department of Computer Science at Stanford University. He is in the Stanford AI Lab and is affiliated with the Machine Learning Group and the Center for Research on Foundation Models. His recent work is to understand how software and hardware systems will change because of machine learning, along with a continuing, petulant drive to work on math problems. Research from his group has been incorporated into scientific and humanitarian efforts, such as the fight against human trafficking, along with products from technology companies including Apple, Google, YouTube, and more. He has also co-founded companies, including Snorkel, SambaNova, and Together, and a venture firm called Factory.
His family still brags that he received the MacArthur Foundation Fellowship, but his closest friends are confident that it was a mistake. His research contributions have spanned database theory, database systems, and machine learning, and his work has won best paper at a premier venue in each area, respectively, at PODS 2012, SIGMOD 2014, and ICML 2016. Due to great collaborators, he received the NeurIPS 2020 test-of-time award and the PODS 2022 test-of-time award. Due to great students, he received best paper at MIDL 2022, best paper runner-up at ICLR 2022 and ICML 2022, and best student paper runner-up at UAI 2022.
Audience: Public