ISC 2024 Keynote: High-precision Computing Will Be a Foundation for AI Models
Some scientific computing applications cannot sacrifice accuracy and will always require high-precision computing.
Conventional high-performance computing (HPC) will therefore remain essential, even as many workloads shift toward AI and low-precision arithmetic that trades some accuracy for speed and power efficiency.
“We need to ensure that we don’t lose that high-precision arithmetic as we focus on some commercial applications,” said Kathy Yelick, vice chancellor for research and distinguished professor of electrical engineering and computer sciences at the University of California, Berkeley.
“High-precision calculations are vital for generating reliable scientific data, which serves as the foundation for large language models,” Yelick said.
While hyperscalers and chip makers like Nvidia prioritize low-precision arithmetic for AI and power-efficient computing, high-precision arithmetic remains critical for generating much of the scientific content needed to train new scientists and machine-learning models.
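To make the precision tradeoff concrete, here is a minimal sketch (an illustration of the general phenomenon, not an example from the keynote) of how naively accumulating many small values in the 16-bit floating-point formats favored for AI workloads loses accuracy that 64-bit arithmetic preserves:

```python
# Minimal sketch (illustrative): summing 100,000 copies of 0.0001 in 64-bit
# vs. 16-bit floating point. The exact answer is 10.0.
import numpy as np

values = np.full(100_000, 0.0001)

# 64-bit accumulation: the standard precision for scientific computing.
total_fp64 = values.sum(dtype=np.float64)

# Naive 16-bit accumulation: once the running sum grows, 0.0001 falls below
# half the spacing between adjacent representable fp16 values, so further
# additions are rounded away and the sum stops growing.
total_fp16 = np.float16(0.0)
for v in values.astype(np.float16):
    total_fp16 += v

print(f"float64 sum: {total_fp64:.4f}")         # ~10.0000
print(f"float16 sum: {float(total_fp16):.4f}")  # stalls around 0.25
```

Production AI kernels blunt this effect with techniques such as higher-precision accumulators, which is one reason mixed precision, rather than uniformly low precision, dominates in practice.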
Applications that depend on high precision include biomedical research, drug design, medical devices, climate research, and other workloads that require deep simulation and modeling.
“Precision medicine, I think, will continue to be really important in certain areas like cancer,” Yelick said.
Climate modeling is another long-standing traditional HPC application.
“As we look at climate simulations, what adaptations do we need for things like wildfires, sea level rise, and those kinds of problems? [We’re] getting more precise information about where and what extreme climate events we’ll see,” Yelick said.
At the same time, Yelick encouraged the HPC community to look beyond the traditional focus on modeling and simulation and embrace AI and quantum computing.
“We need to think beyond modeling and simulation at HPC. I think it is a little bit of a danger that we get too focused on the traditional application areas. There are incredible opportunities here,” Yelick said.
She cited examples of nontraditional applications that could benefit from HPC.
“For example, people doing computer vision and robotics – that doesn’t fit our kind of traditional model of what HPC applications are,” Yelick said.
However, Yelick also expressed concern about the growing influence of hyperscalers and chip makers on scientific computing.
AI research now happens mostly in the private sector, and cost constraints often leave academics without access to the same level of AI computing capacity in the cloud.
Much of the latest AI hardware is also locked up with the hyperscalers, which do not make their custom chips commercially available.
Yelick also noted that the number of system builders for high-performance computers is dwindling, with many students drawn to AI rather than chip design.
She recognized that the diversity of computing options beyond x86 allows the HPC community to influence system design.
Yelick recommended working closely with hyperscalers “to make sure we’re building systems that are of interest to their market as well as our market.”
Yelick said the HPC community has traditionally punched above its weight in the broader marketplace.
“What we bring as an HPC community to the design of future high-performance systems is a really deep understanding of applications, algorithms, and how they map onto hardware. Using that kind of insight can help drive the industry in an outsized way relative to the total number and value of the systems we will be purchasing in HPC,” Yelick said.
At the same time, researchers should not rely entirely on commercial cloud resources for HPC; they could be priced out by commercial demand.
“I think we must also ensure we have our own path. You’re competing against everybody doing training for commercial applications. Then academics will be unable to afford to purchase those kinds of cloud resources,” Yelick said.