Keynotes

December 15-19, 2025

Noah A. Smith

Model Flows: Powering Science of AI and AI for Science

Abstract: Neural language models with billions of parameters and trained on trillions of words are powering the fastest-growing computing applications in history and generating discussion and debate around the world. Yet most scientists cannot study or improve those state-of-the-art models because the organizations deploying them keep their data and machine learning processes secret. I believe that the path to models that are usable by all, at low cost, customizable for areas of critical need like the sciences, and whose capabilities and limitations are made transparent and understandable, is radically open development, with academic and not-for-profit researchers empowered to do reproducible science. In this talk, I’ll discuss some of the work our team is doing to radically open up the science of language modeling and make it possible to explore new scientific questions and democratize control of the future of this fascinating and important technology.

The work I’ll present was led by a large team at the Allen Institute for Artificial Intelligence in Seattle, with collaboration from the Paul G. Allen School at the University of Washington and various kinds of support and coordination from many organizations, including the Kempner Institute for the Study of Natural and Artificial Intelligence at Harvard University, AMD, CSC – IT Center for Science (Finland), Databricks, Together.ai, the National AI Research Resource Pilot, and Oak Ridge National Laboratory. In August, the team was awarded a $75M mid-scale research infrastructure grant from the National Science Foundation, with additional support from NVIDIA, enabling continued work for five years.

Bio: Noah A. Smith is a researcher in natural language processing and machine learning, serving as the Vice Provost for Artificial Intelligence and Professor of Computer Science and Engineering at the University of Washington and Senior Director of NLP Research at the Allen Institute for AI. He co-directs the OLMo open language modeling initiative and is the PI of the NSF- and NVIDIA-supported project “Open Multimodal AI Infrastructure to Accelerate Science.” His current work spans language, music, and AI research methodology, with a strong emphasis on mentoring—his former mentees now hold faculty and leadership roles worldwide. Smith is a Fellow of the Association for Computational Linguistics and has received numerous awards for research and innovation.


James Zou

AI Agents to Accelerate Scientific Discoveries

Abstract: AI agents—large language models equipped with tools and reasoning capabilities—are emerging as powerful research enablers. This talk will explore how agentic AI can accelerate scientific discoveries. I’ll first introduce the Virtual Lab—a collaborative team of AI scientist agents conducting in silico research meetings to tackle open-ended research projects. As an example application, the Virtual Lab designed new nanobody binders to recent COVID variants that we experimentally validated. Then I will introduce Paper2Agent, a framework to automatically convert passive research papers into interactive AI agents. Finally, I will discuss learnings from Agents4Science, the first conference where the authors and reviewers are primarily AI systems.

Bio: James Zou is an associate professor of Biomedical Data Science, Computer Science, and Electrical Engineering at Stanford University. He works on developing cutting-edge AI for biomedical applications. His group developed many widely used innovations, including EchoNet AI (FDA cleared for assessing cardiac function), Gradio (used by over a million developers), and SyntheMol (NY Times 2024 Good Tech). He has received the Overton Prize, Sloan Fellowship, NSF CAREER Award, two Chan Zuckerberg Investigator Awards, a Top Ten Clinical Achievement Award, best paper awards at ICML and other AI conferences, and faculty awards from Google, Amazon, Adobe, and Apple.