Brookhaven National Laboratory Supports International DUNE Collaboration

Brookhaven National Laboratory is playing a key role in an international effort to understand neutrinos, as evidenced by its support of the recent DUNE Software and Computing Week hosted at Argonne National Laboratory. More than 50 scientists from the DUNE collaboration, which spans over 200 institutions worldwide, convened to refine software and computing infrastructure for the Deep Underground Neutrino Experiment, or DUNE, which aims to investigate the imbalance between matter and antimatter in the universe. The experiment, slated to begin data collection in 2029, will use detectors at Fermilab in Illinois and a larger facility a mile underground in South Dakota to study these elusive particles. “The DUNE Core Software and Computing Consortium is tackling some of the most complex challenges in modern physics,” said Michael Kirby of Brookhaven, leader of the consortium. “By bringing together experts from around the world, we are not only advancing the tools and technologies needed for DUNE but also fostering collaboration that will drive innovation and discovery for years to come.”

DUNE Experiment Goals: Matter-Antimatter Asymmetry & Neutrino Studies

The prevalence of matter over antimatter remains one of physics’ most enduring mysteries, and the Deep Underground Neutrino Experiment (DUNE) is uniquely positioned to address it. Neutrinos, the elusive and notoriously hard-to-detect particles the experiment will study, hold clues to why matter dominates the cosmos despite theoretical predictions that the Big Bang created matter and antimatter in equal amounts. DUNE aims to begin collecting data without a neutrino beam in 2029, and with a beam in 2031, processing data sets far exceeding those of current collider experiments. Fermilab’s Kyle Knoepfel described this as handling “data-rich movie files that consist of gigabytes of data instead of megabyte-resolution photographs.” This necessitates a new software framework, Phlex, designed to manage the scale and complexity of the incoming data. Argonne’s expertise in high energy physics and computing is central to this effort; Peter van Gemmeren, a principal computational scientist at Argonne and event organizer, stated, “With deep expertise in high energy physics and some of the most powerful facilities and tools, Argonne is uniquely positioned to support DUNE’s mission.”

Phlex Software & FORM Infrastructure for DUNE Data Handling

Existing data handling systems, designed for experiments generating megabyte-resolution data, are insufficient for DUNE’s expected gigabyte-scale records. This is similar to processing data-rich movie files rather than still photographs. To address this, the collaboration has developed Phlex, a new software framework intended to optimize data processing for the experiment. Complementing Phlex is FORM, a data input/output and storage infrastructure currently under development at Argonne, designed to support the framework’s processing demands. Scientists are also exploring the integration of compute accelerators, leveraging Argonne’s expertise in this area to further enhance processing speeds.
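To make the scale concrete, the sketch below shows one generic way to process a gigabyte-scale record in bounded memory by streaming it in fixed-size chunks rather than loading it whole. This is an illustration of the general streaming approach only; the function names and chunk size are hypothetical, and nothing here reflects the actual Phlex or FORM interfaces, which this article does not document.

```python
# A minimal sketch, assuming a raw binary record on disk. Function and
# parameter names are hypothetical; this is NOT the Phlex or FORM API.
from typing import Callable, Iterator

CHUNK_BYTES = 64 * 1024 * 1024  # 64 MiB per read keeps memory use bounded


def stream_chunks(path: str, chunk_bytes: int = CHUNK_BYTES) -> Iterator[bytes]:
    """Yield a large detector record piece by piece instead of loading it whole."""
    with open(path, "rb") as f:
        while chunk := f.read(chunk_bytes):
            yield chunk


def process_record(path: str, analyze: Callable[[bytes], int]) -> int:
    """Fold an analysis function over the chunks of one record."""
    total = 0
    for chunk in stream_chunks(path):
        total += analyze(chunk)  # e.g., count samples above some threshold
    return total


if __name__ == "__main__":
    import sys

    # Toy usage: count non-zero bytes in a (possibly multi-gigabyte) record.
    if len(sys.argv) > 1:
        hits = process_record(sys.argv[1], lambda c: sum(b != 0 for b in c))
        print(f"non-zero bytes: {hits}")
```

The point of the pattern is that memory use is set by the chunk size rather than the record size, which is what lets a framework scale from megabyte-resolution “photographs” to gigabyte, movie-like records.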


2031 DUNE Data Collection & Computing Resource Management

Andrew Olivier, Barnali Chowdhury, Peter van Gemmeren, Esteban “Steve” Rangel, and Xiaoyong Jin represented Argonne, contributing to both the physics program and the substantial computing challenges inherent in the experiment. Kyle Knoepfel from Fermilab presented Phlex, a new software framework designed to handle the gigabyte-sized data records DUNE will generate, a significant increase over the megabyte-resolution data typical of collider experiments. Participants also explored strategies for efficient computing resource allocation, including quotas and prioritization, in sessions led by Steven Timm from Fermilab and Andrew McNab from the University of Manchester. The Aurora supercomputer at the Argonne Leadership Computing Facility will be instrumental in processing the incoming data, offering the necessary scale for this ambitious undertaking.
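To illustrate what quota-plus-priority allocation can look like, here is a minimal sketch of a scheduler that accepts jobs in priority order while enforcing per-group CPU-hour quotas. The job names, groups, and policy choices are hypothetical; this is not the actual DUNE or Fermilab tooling discussed at the workshop, only one simple way such a policy could be expressed.

```python
# A minimal sketch, assuming each job carries a priority and a CPU-hour
# cost. All names and policy details are hypothetical illustrations.
import heapq
from dataclasses import dataclass, field


@dataclass(order=True)
class Job:
    priority: int                      # lower value = scheduled sooner
    name: str = field(compare=False)
    group: str = field(compare=False)
    cpu_hours: float = field(compare=False)


def schedule(jobs: list[Job], quota: dict[str, float]) -> list[Job]:
    """Accept jobs in priority order, skipping any job that would push
    its group past that group's CPU-hour quota."""
    heap = list(jobs)
    heapq.heapify(heap)
    used = {group: 0.0 for group in quota}
    accepted = []
    while heap:
        job = heapq.heappop(heap)
        if used[job.group] + job.cpu_hours <= quota[job.group]:
            used[job.group] += job.cpu_hours
            accepted.append(job)
    return accepted


# Toy usage: reconstruction outranks simulation; quotas cap each group,
# so the second simulation batch is deferred.
jobs = [
    Job(1, "reco-run-42", "reconstruction", 800.0),
    Job(2, "sim-batch-7", "simulation", 600.0),
    Job(2, "sim-batch-8", "simulation", 600.0),
]
print(schedule(jobs, {"reconstruction": 1000.0, "simulation": 1000.0}))
```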

Workshops like this break silos and promote active exchange and networking between different collaborations and areas of research.

ProtoDUNE Validation at CERN & Aurora Supercomputer Integration

The success of the Deep Underground Neutrino Experiment (DUNE) relies on meticulous validation of its core technologies, as evidenced by recent activity at CERN and Argonne National Laboratory. Scientists are leveraging the 770-ton ProtoDUNE prototype, operated at CERN, the European Organization for Nuclear Research, to refine detector responses ahead of the full-scale experiment in South Dakota. This prototype is not simply a scaled-down version; it is a crucial testing ground for understanding how the detector will interact with particles like pions and protons, essential for calibrating its sensitivity to elusive neutrinos. Researchers are specifically studying particle behavior in liquid argon, a key component of the DUNE detectors, to ensure accurate data capture. Complementing the ProtoDUNE validation is the integration of the Aurora supercomputer, housed at the Argonne Leadership Computing Facility, into the DUNE data processing pipeline.

This powerful machine will be instrumental in handling the massive data streams anticipated from the experiment, which will deal with gigabyte-sized data records, a significant leap beyond the megabyte-resolution data typical of current collider experiments. The effort, which brought together more than 50 scientists representing a collaboration of over 200 institutions, underscores the complexity of modern physics research.
