The Decade Ahead: Building Frontier AI Systems for Science and the Path to Zettascale

Monday, May 13, 2024 6:05 PM to 6:20 PM · 15 min. (Europe/Berlin)
Hall Z - 3rd floor
Focus Session
AI Applications powered by HPC Technologies, HPC Simulations enhanced by Machine Learning, HPC System Design for Scalable Machine Learning, ML Systems and Tools

Information

The successful development of transformative applications of AI for science, medicine, and energy research will profoundly change the way we pursue science and engineering goals. Frontier AI (the leading edge of AI systems) enables small teams to conduct increasingly complex investigations, such as hypothesis generation, while other challenges, like human-to-human communication, remain resistant. Together, these developments signify a shift toward more capital-intensive science, as productivity gains from AI will drive resource allocations to groups that effectively translate AI capabilities into scientific outputs. With AI becoming a major driver of innovation in HPC, we expect shifts in the computing marketplace as the performance gap widens between systems designed for traditional scientific computing and those optimized for large-scale AI such as LLMs.

In response to these trends, and in recognition of the role of government-supported research in shaping the future research landscape, the U.S. Department of Energy has created FASST (Frontier AI for Science, Security and Technology), a decadal research and infrastructure development initiative aimed at accelerating the creation and deployment of frontier AI systems for science, energy research, and national security. I will review FASST and how we imagine it transforming research at the national laboratories. I will also discuss the recently established Trillion Parameter Consortium (TPC), which aims to foster a community-wide effort to accelerate the creation of large-scale generative AI for science. Last, I will introduce the AuroraGPT project, an international collaboration to build a series of multilingual, multimodal foundation models for science.
Format
On-site, On Demand
Advanced Level: 100%