Introducing JANC: A High-Performance Python-Based Solver for Compressible Reacting Flows

On April 18, 2025, researchers Haocheng Wen, Faxuan Luo, Sheng Xu, and Bing Wang introduced "JANC: A cost-effective, differentiable compressible reacting flow solver featured with JAX-based adaptive mesh refinement", detailing a Python-based solver that leverages automatic differentiation for enhanced efficiency. Their work also presents the JAX-AMR framework; together, the solver and framework reduce computational costs to 1-2% of OpenFOAM's when running on GPUs, and both are available under the MIT license.

Researchers developed JAX-AMR, a JAX-based block-structured adaptive mesh refinement framework, and JANC, a fully differentiable solver for compressible reacting flows. JANC leverages automatic differentiation, XLA JIT compilation, GPU/TPU computing, parallelization, and AMR to achieve significant computational efficiency improvements.
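
To illustrate the ingredients listed above, here is a minimal JAX sketch (not JANC's actual API; the toy 1D advection scheme and all function names are assumptions for illustration) showing how a JIT-compiled finite-volume update can be differentiated end-to-end through the whole time loop:

```python
import jax
import jax.numpy as jnp

@jax.jit
def upwind_step(u, dt, dx, a=1.0):
    # First-order upwind update for du/dt + a*du/dx = 0 with periodic boundaries.
    flux = a * u
    return u - dt / dx * (flux - jnp.roll(flux, 1))

def run(u0, dt, dx, n_steps=100):
    # The whole time loop stays traceable, so gradients can flow through it.
    step = lambda u, _: (upwind_step(u, dt, dx), None)
    u_final, _ = jax.lax.scan(step, u0, None, length=n_steps)
    return u_final

# Reverse-mode autodiff through every time step: gradient of a scalar
# objective with respect to the initial condition.
loss = lambda u0: jnp.sum(run(u0, dt=1e-3, dx=1e-2) ** 2)
grad_fn = jax.jit(jax.grad(loss))

u0 = jnp.sin(jnp.linspace(0.0, 2.0 * jnp.pi, 200))
du0 = grad_fn(u0)   # same shape as u0
```

Because the time integration is written with `jax.lax.scan`, reverse-mode differentiation traverses every step, which is the general mechanism that lets a solver of this kind be "fully differentiable" without hand-coded adjoints.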

In tests on a single A100 GPU, JANC’s core solver reduced computational costs to about 1% of those of OpenFOAM running on 384 CPU cores; with AMR enabled, JANC’s cost was 1-2% of OpenFOAM’s. The solver also demonstrated efficient parallel computing on four A100 GPUs and strong compatibility with adjoint optimisation through dynamic trajectory differentiation. JANC thus offers a high-performance, cost-effective framework for large-scale combustion and energy research simulations, and its source code is available under the MIT license.
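
The multi-GPU result is reported only at a high level, so the following is just a hedged sketch of how block-wise domain decomposition can look in JAX, using `jax.pmap` with `jax.lax.ppermute` for halo exchange; the stencil, block layout, and names are hypothetical and not taken from JANC:

```python
import functools
import jax
import jax.numpy as jnp

N_DEV = jax.local_device_count()        # e.g. 4 A100s on one node
DT, DX, A = 1e-3, 1e-2, 1.0             # toy time step, cell size, wave speed

@functools.partial(jax.pmap, axis_name="dev")
def parallel_step(block):
    # Each device owns one contiguous block of cells. Fetch one halo cell from
    # the neighbouring device (periodic ring), then apply an upwind stencil.
    perm = [(i, (i + 1) % N_DEV) for i in range(N_DEV)]
    halo = jax.lax.ppermute(block[-1:], axis_name="dev", perm=perm)
    padded = jnp.concatenate([halo, block])
    flux = A * padded
    return block - DT / DX * (flux[1:] - flux[:-1])

# Global field split into (devices, cells_per_device) blocks.
cells_per_device = 256
x = jnp.linspace(0.0, 2.0 * jnp.pi, N_DEV * cells_per_device)
u = jnp.sin(x).reshape(N_DEV, cells_per_device)
u = parallel_step(u)                    # one distributed time step
```

On a node with four GPUs, `jax.local_device_count()` returns 4 and each device advances its own block, exchanging only a one-cell halo per step.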

Computational fluid dynamics (CFD) has long been a cornerstone of engineering and scientific research, enabling the simulation of complex fluid flows and heat transfer phenomena. However, traditional CFD methods rely on solving partial differential equations (PDEs) using numerical discretisation techniques, which can be computationally intensive and limited in their ability to handle highly nonlinear or multiscale problems. Recent advancements in machine learning (ML) have opened new avenues for addressing these challenges. By integrating ML algorithms with CFD simulations, researchers are developing innovative approaches that promise faster, more accurate, and more scalable solutions to fluid dynamics problems.

At the heart of this innovation lies the application of neural networks to solve PDEs directly. Unlike conventional methods that discretise equations over a grid, machine learning models can approximate solutions using continuous functions, potentially reducing computational costs while maintaining high accuracy. This approach is especially promising for problems involving turbulence, combustion, and detonation waves: phenomena that are notoriously difficult to model due to their chaotic and multiscale nature.
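
As a concrete (and deliberately simplified) example of this idea, the sketch below sets up a small physics-informed network in JAX that represents u(x, t) as a continuous function and penalises the residual of a 1D advection equation. The network architecture, collocation sampling, and omission of initial- and boundary-condition terms are all illustrative assumptions, not a method from the paper:

```python
import jax
import jax.numpy as jnp

def init_params(key, sizes=(2, 32, 32, 1)):
    # Small fully connected network with inputs (x, t) and a scalar output.
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def u_net(params, x, t):
    # Continuous approximation u_theta(x, t).
    h = jnp.stack([x, t])
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b)[0]

def pde_residual(params, x, t, a=1.0):
    # Enforce u_t + a * u_x = 0 via automatic differentiation of the network.
    u_t = jax.grad(u_net, argnums=2)(params, x, t)
    u_x = jax.grad(u_net, argnums=1)(params, x, t)
    return u_t + a * u_x

def loss(params, xs, ts):
    # Mean squared PDE residual at collocation points (IC/BC terms omitted).
    res = jax.vmap(pde_residual, in_axes=(None, 0, 0))(params, xs, ts)
    return jnp.mean(res ** 2)

key = jax.random.PRNGKey(0)
params = init_params(key)
xs = jax.random.uniform(key, (1024,))
ts = jax.random.uniform(key, (1024,))
grads = jax.jit(jax.grad(loss))(params, xs, ts)
```

In practice the loss would also include initial- and boundary-condition terms and be minimised with a stochastic optimiser; the point here is only that the solution is represented as a differentiable function rather than values on a grid.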

One notable example of this integration comes from recent studies where researchers applied machine learning techniques to simulate the interaction between gaseous detonation waves and water droplets. By training neural networks on high-fidelity simulation data, they could predict the behaviour of these complex systems with remarkable accuracy. This not only demonstrates the potential of ML in handling nonlinear dynamics but also highlights its ability to generalise across different scenarios, making it a versatile tool for fluid dynamics research.
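
For the purely data-driven side of that workflow, a generic sketch (not the cited study's model; the network, data shapes, and plain gradient-descent update are placeholders) might train a surrogate to map one high-fidelity snapshot to the next:

```python
import jax
import jax.numpy as jnp

def init_params(key, sizes=(128, 64, 64, 128)):
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) * 0.01, jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def surrogate(params, state):
    # Map a flattened flow snapshot at time t to the snapshot at t + dt.
    h = state
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return h @ W + b

def loss(params, states_now, states_next):
    preds = jax.vmap(lambda s: surrogate(params, s))(states_now)
    return jnp.mean((preds - states_next) ** 2)

@jax.jit
def sgd_step(params, batch_now, batch_next, lr=1e-3):
    # One gradient-descent update on pairs of consecutive simulation snapshots.
    grads = jax.grad(loss)(params, batch_now, batch_next)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
```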

The implications of this innovation are far-reaching. In industries such as aerospace, automotive, and energy, where efficient fluid flow management is critical, machine learning-enhanced CFD could lead to the design of more aerodynamic vehicles, optimised combustion systems, and safer industrial processes. For instance, in the context of rotating detonation engines—a technology with potential applications in propulsion—ML-driven simulations have shown promise in improving efficiency and performance.

Additionally, this integration can enhance predictive maintenance in industries like energy production, where understanding fluid dynamics is crucial for optimising operations and reducing downtime. The ability to model complex systems more accurately could also lead to advancements in weather forecasting and climate modeling, providing insights into large-scale fluid behavior with greater precision.

👉 More information
🗞 JANC: A cost-effective, differentiable compressible reacting flow solver featured with JAX-based adaptive mesh refinement
🧠 DOI: https://doi.org/10.48550/arXiv.2504.13750

