SambaNova delivers next-generation DataScale System to Argonne National Laboratory

New system provides capabilities to advance AI for Science.


SambaNova Systems is delivering its DataScale® system to the U.S. Department of Energy’s Argonne National Laboratory to provide a new resource for accelerating AI for science workloads, including large-scale imaging data and large language models.

 

DataScale is a fully integrated hardware-software system that enables organizations to reimagine their approach to AI.

 

“With the rollout of Argonne’s SambaNova systems, we’re seeing scientists use novel AI architectures for pioneering research in areas ranging from climate predictions to neutrino physics,” said Rick Stevens, Argonne’s associate laboratory director for Computing, Environment and Life Sciences. “The new SambaNova system will help boost AI-driven research including our efforts to use deep learning to predict how tumors respond to various drug combinations and, in general, foundation AI models for science.”

“The new capabilities and increased memory capacity of SambaNova’s DataScale will open the door to a wider range of AI for science applications involving large AI models and massive datasets,” added Venkat Vishwanath, data science team lead at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science user facility. “We look forward to running large-scale AI models on the new system to accelerate insights into the growing deluge of scientific data being produced by simulations and experiments, including the high-resolution imaging data being generated at DOE light sources.”

 

The new DataScale system will be made available to the scientific community via the ALCF AI Testbed, a growing collection of some of the world’s most advanced AI accelerators. The ALCF AI Testbed is designed to enable researchers to explore deep learning and foundation model workloads to advance AI for science. The ALCF’s AI platforms complement the facility’s current and next-generation supercomputers to provide a state-of-the-art environment that supports research at the intersection of AI, big data, and high-performance computing.

 

“The multi-year deal being announced today is an expansion of our current partnership with Argonne National Laboratory,” said Marshall Choy, SVP of Product at SambaNova Systems. “The partnership showcases Argonne’s adoption of a multi-rack SambaNova system and joint efforts on implementing challenging foundation model and deep learning workloads. For scientific research organizations, this means new experiments and discoveries with the potential to change the world.”

 

Researchers at Argonne have been using SambaNova’s previous-generation system for a wide range of studies, including developing surrogate models to improve weather forecasting accuracy, speeding up computational fluid dynamics simulations for engine research, and processing high-resolution image datasets to accelerate experimental discoveries. The science use cases explored to date include:

 

Edge Computing: Using the ALCF’s SambaNova system, researchers demonstrated how specialized AI systems can be used to quickly train machine learning models through a geographically distributed workflow. They then deployed the models on edge computing devices near an experimental data source to enhance researchers’ ability to process and analyze the increasingly large imaging datasets collected from light sources and other experimental facilities.    

Neutrino Physics: To improve neutrino signal efficiency, researchers leveraged the ALCF’s SambaNova system on a classic image segmentation task, establishing a new state-of-the-art accuracy level using images at their original resolution without the need to downsample.

Cancer Prediction: The ALCF’s new SambaNova platform provides capabilities to advance efforts to predict tumor response to single and paired drugs based on the molecular features of tumor cells.

Climate Modeling: Researchers are using the ALCF's SambaNova system and deep learning techniques to develop surrogate models from publicly available weather and climate data. Their goal is to determine if the surrogate models can provide improved forecast accuracy compared to current deep learning deployments built on downscaled datasets. 

Engine Research: As part of an effort to create predictive computational design tools for next-generation engines, researchers are using the ALCF’s SambaNova system to develop scalable machine learning models for fast and accurate predictions of the turbulent flows that occur in automotive engines.

