
AI Expands Potential for Discovery in Physics

Carnegie Mellon physicists use AI to analyze large datasets from experiments, predict complex physical phenomena and optimize simulations

Media Inquiries
Heidi Opdyke, Mellon College of Science

The long-standing interplay between artificial intelligence and the evolution of physics played a pivotal role in awarding the 2024 Nobel Prize in Physics to two AI trailblazers.

"AI for physics and physics for AI are concepts you hear," said Matteo Cremonesi, an assistant professor of physics at Carnegie Mellon University. "The fact that the Nobel Prize went to AI pioneers is not surprising. It's a recognition of something that has been going on for some time."

The Royal Swedish Academy of Sciences awarded the prize to John J. Hopfield of Princeton University and Geoffrey E. Hinton of the University of Toronto in recognition of their foundational work in machine learning with artificial neural networks. Hinton served on the faculty at Carnegie Mellon from 1982 to 1987.

"There's been a lot of work in recent years on how we can use neural networks for scientific discovery in physics," said Rachel Mandelbaum, interim head of Carnegie Mellon's Department of Physics. "This is an important recognition of some of the precursor studies that set us along this pathway, and it's an area where Carnegie Mellon is very active."

Carnegie Mellon's Department of Physics has strong research groups working in astrophysics and cosmology, particle physics, biophysics, computational physics and theoretical physics, and their members play key roles in ongoing experiments in these areas.

Astronomical datasets

Rachel Mandelbaum, right, visits the Vera C. Rubin Observatory in northern Chile.

Mandelbaum leads research with the University of Washington to develop software to analyze large datasets generated by the Legacy Survey of Space and Time (LSST), which will be carried out by the Vera C. Rubin Observatory in northern Chile.

"In astrophysics, if we go back, say, 20 years, the datasets were pretty small, and one of the main areas for innovation was computational techniques for creating simulations of the universe or galaxies more efficiently," Mandelbaum said. "Today we have larger and larger datasets. Those new problems need new tools. It's very much the case that we have problems that help the field of machine learning evolve."

Through the LSST, the Rubin Observatory, a joint initiative of the National Science Foundation and the Department of Energy, will collect and process more than 20 terabytes of data each night (up to 10 petabytes each year for 10 years) and will build detailed composite images of the southern sky, including information about changes over time.
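To put those figures in perspective, here is a rough back-of-the-envelope check in Python. The per-night volume and survey length come from the article; the number of observing nights per year is an illustrative assumption.

```python
# Back-of-the-envelope check of the LSST data volume quoted above.
# The 20 TB/night and 10-year figures come from the article; the
# number of observing nights per year is an assumed, illustrative value.
TB_PER_NIGHT = 20
NIGHTS_PER_YEAR = 300    # assumption; weather and maintenance cut into 365
SURVEY_YEARS = 10

pb_per_year = TB_PER_NIGHT * NIGHTS_PER_YEAR / 1000   # ~6 PB of raw images
pb_total = pb_per_year * SURVEY_YEARS                 # ~60 PB over the survey

print(f"~{pb_per_year:.0f} PB/year raw, ~{pb_total:.0f} PB over {SURVEY_YEARS} years")
# Processed data products push the yearly total toward the
# "up to 10 petabytes" figure quoted above.
```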

"Many of the LSST's science objectives share common traits and computational challenges," she said. "If we develop our algorithms and analysis frameworks with forethought, we can use them to enable many of the survey's core science objectives."

Biological laws

Fangwei Si

Fangwei Si, the Cooper-Siegel Assistant Professor of Physics, aims to understand living systems in a quantitatively precise way. With a background in physics, mechanical engineering and biology, he applies physical concepts and tools to study bacterial cells.

"We look at how individual bacterial cells respond to environmental changes, but we need statistics and to track their life for a very long time," Si said. "We track thousands of bacterial cells in environments and need to analyze how the health of each cell develops over time."

Si gathers information on food intake, size, shape, density, reproduction and growth rates, and how those and other factors are connected to internal activities of the subcellular contents, such as proteins, RNA and DNA.

"We have those parameters for tens of thousands or even more cells," Si said. "In the 1960s, microscopy was just developed enough to capture pictures of single bacterial cells. People measured those quantities by hand with rulers on films. It used to be that Ph.D. students could spend years on one experiment. But now, with machine learning-based analysis, we can finish the same work within one day."

Machine learning helps Si and his researchers analyze terabytes of data at faster speeds. The lab develops and adapts tools, takes rigorous measurements and defines new concepts. Using trained machine learning models, Si and his colleagues can automatically detect where cells are in images and analyze them.
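As a rough illustration of what automated cell detection involves, here is a minimal sketch that segments cells in a micrograph and extracts per-cell measurements. It uses classical thresholding and labeling from scikit-image as a simple stand-in for trained machine learning models; the function name and measured quantities are illustrative, not the Si lab's actual pipeline.

```python
# Minimal sketch of automated cell detection in a micrograph, using
# classical thresholding and labeling from scikit-image as a simple
# stand-in for trained ML models. Illustrative only, not the Si lab's
# actual pipeline.
import numpy as np
from skimage import filters, measure, morphology

def measure_cells(image: np.ndarray, min_area: int = 50) -> list[dict]:
    """Segment bright cells on a dark background and return per-cell
    size, shape and intensity measurements."""
    # Global threshold separating cells from background.
    mask = image > filters.threshold_otsu(image)
    # Drop speckle noise smaller than a plausible cell.
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    # Label connected components: one integer label per detected cell.
    labels = measure.label(mask)
    props = measure.regionprops(labels, intensity_image=image)
    return [
        {
            "area": p.area,                      # size in pixels
            "length": p.major_axis_length,       # proxy for cell length
            "width": p.minor_axis_length,        # proxy for cell width
            "mean_intensity": p.mean_intensity,  # proxy for internal content
        }
        for p in props
    ]
```

A time-lapse analysis would apply a step like this to every frame and link detections across frames to follow each cell's growth over time.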

"One of our ongoing projects is to use the cell images captured and combine them with machine learning to understand the physiological state of the cell. We hope to extend this work to other organisms, for example human cells, so that in the future we can tell whether the cells are ready to enter different states, like turning into a cancer cell."

Need for speed

Matteo Cremonesi

Giant particle accelerators like CERN's Large Hadron Collider smash protons together at nearly the speed of light. The Compact Muon Solenoid (CMS) detector captures 3D images of these collisions at the rate of one every 25 nanoseconds, or 40 million images a second. This corresponds to a data rate of 10 terabytes per second.

"We don't have the capability to save all of the information delivered by the collider, so we need an online mechanism to decide if images are interesting or not," Cremonesi said.

Cremonesi is part of a team building software that will analyze these images in real time, contributing to the CMS "trigger." The Next-Generation Triggers project at CERN aims to develop a way to filter the flow of data through a high-performance event-selection system. The trigger is a hardware data filtering system that rapidly decides when an image is interesting enough to save for offline analysis. Out of those 40 million images taken, on average about 1,200 are selected by the trigger, or roughly 0.003%. A team of Carnegie Mellon students and researchers led by Cremonesi is studying the usability of AI to improve the accuracy of this crucial decision-making step.
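As a toy illustration of trigger-style selection (not the actual CMS algorithms), the sketch below scores a batch of simulated events and keeps only those above a threshold chosen so the accepted fraction matches the quoted budget of roughly 1,200 out of 40 million per second. The features and scoring model are placeholders.

```python
# Toy sketch of trigger-style event selection: score simulated events
# and keep only those above a threshold chosen so the accepted fraction
# matches the quoted budget (~1,200 kept out of 40 million per second).
# Features and the scoring model are placeholders, not CMS algorithms.
import numpy as np

INPUT_RATE = 40_000_000   # events per second delivered by the collider
TARGET_KEPT = 1_200       # events per second the trigger may save
N_SIM = 1_000_000         # simulate a smaller batch to keep memory modest

rng = np.random.default_rng(0)
features = rng.random((N_SIM, 3))   # three toy features per event

def score_events(x: np.ndarray) -> np.ndarray:
    """Placeholder 'interestingness' score; a real trigger would run
    fast, hardware-friendly ML models here."""
    return x @ np.array([0.7, 0.2, 0.1])

scores = score_events(features)

# Set the threshold so only the target fraction of events is accepted.
keep_fraction = TARGET_KEPT / INPUT_RATE            # 0.003%
threshold = np.quantile(scores, 1.0 - keep_fraction)
kept = scores >= threshold

print(f"kept {kept.sum()} of {N_SIM:,} simulated events "
      f"({kept.mean():.5%} accept rate)")
```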

"This is very challenging real-time processing that we do. It's one of the most challenging real-time AI applications on Earth," Cremonesi said. To put it in perspective, self-driving cars that use machine learning algorithms operate on timescales of microseconds, which are 1,000 times longer than nanoseconds.

Cremonesi started working with the CMS experiment in 2015 while he was a postdoctoral researcher at Fermi National Accelerator Laboratory near Chicago. He specialized in computing operations, software development and applied AI.

"Typically researchers in my field develop technical expertise in hardware, but I was always more fascinated by the software aspect of physics," said Cremonesi, who joined Carnegie Mellon in 2020.
