DOE allocates $68m for advancing foundation models in AI research

The U.S. Department of Energy (DOE) has announced $68m in funding for 11 multi-institution projects aimed at advancing the development of foundation models, a critical building block of AI technology.

These projects will focus on creating AI models with broad applications across scientific research, from automating workflows to accelerating computational science.

Key investment in foundation models

Foundation models are a type of machine learning or deep learning framework trained on large, diverse datasets.

These models are versatile and can be adapted to a wide range of tasks, from scientific programming to automating laboratory processes.

With this funding, researchers will not only develop foundation models but also explore innovative methods to enhance their energy efficiency and privacy-preserving capabilities.

By training models with privacy-preserving and distributed methods, the DOE aims to ensure that AI technologies protect sensitive data while still offering valuable insights.
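For readers unfamiliar with the approach, the sketch below shows one common pattern for this kind of training, federated averaging, in which participating institutions exchange only model parameters rather than raw data. It is a minimal illustrative toy, not the funded projects' actual methodology; the synthetic datasets and the `local_train` and `federated_average` helpers are invented for demonstration.

```python
# Minimal federated-averaging sketch (illustrative only; not the DOE projects'
# actual methodology). Each "institution" fits a linear model on its own
# private data, and only the model parameters -- never the data -- are shared
# with a coordinator that averages them.

import numpy as np

def local_train(weights, X, y, lr=0.01, epochs=50):
    """Run a few gradient-descent steps on one institution's private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)   # least-squares gradient
        w -= lr * grad
    return w

def federated_average(local_weights, sizes):
    """Combine local updates, weighted by each institution's dataset size."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(local_weights, sizes))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Three institutions, each holding data that never leaves its site.
datasets = []
for _ in range(3):
    X = rng.normal(size=(100, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=100)
    datasets.append((X, y))

global_w = np.zeros(2)
for _ in range(10):                              # communication rounds
    local_ws = [local_train(global_w, X, y) for X, y in datasets]
    global_w = federated_average(local_ws, [len(y) for _, y in datasets])

print("recovered weights:", global_w)            # approaches true_w
```

In practice, schemes like this are often combined with additional safeguards, such as adding noise to the shared updates, so that individual records cannot be reconstructed from the parameters.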

This step aligns with the department’s broader goals to secure the trustworthy use of AI in scientific discovery, as outlined in Executive Order 14110 from the White House.

Foundation models for energy efficient and secure AI

Ceren Susut, DOE’s Associate Director of Science for Advanced Scientific Computing Research, emphasised the transformative potential of AI in scientific research, noting that the department’s efforts will make AI applications more trustworthy and energy efficient.

“Progress in AI is inspiring us to imagine faster and more efficient ways to do science,” Susut said. “These research efforts will make scientific AI both more trustworthy and more energy efficient, unlocking AI’s potential to accelerate scientific discovery.

“There is a huge variety in the number of applications where scientists can use AI, from the laboratory to the field to producing scientific research.”

Among the goals of the funded projects is the development of energy-efficient AI algorithms and hardware, which will support the DOE’s push for environmentally responsible innovation.

Another key area is privacy-preserving models that protect personal and sensitive data during AI processes, allowing institutions to collaborate without compromising privacy.

Diverse applications and long-term funding

The range of projects supported by this funding covers diverse scientific applications. This includes understanding how foundation models improve as they grow in size and complexity and training AI models using data distributed across multiple institutions.

The selected projects are part of the DOE’s Funding Opportunity Announcement for Advancements in Artificial Intelligence for Science. Total funding includes $20m allocated for fiscal year 2024, with additional funding for subsequent years dependent on congressional appropriations.

The projects will span up to three years, fostering long-term collaboration and innovation across research institutions.

The DOE’s initiative highlights the importance of foundation models in advancing AI-driven scientific discovery.

By enhancing the energy efficiency and security of these models, the department is setting the stage for AI to play a pivotal role in future scientific breakthroughs.
