Researchers are tackling the computational challenges of solving partial differential equations (PDEs), an essential task across numerous scientific and engineering disciplines. Cindy Xiangrui Kong, Yueqi Wang, Haoyang Zheng, and Guang Lin of Purdue University, together with Weijian Luo of hi-Lab, Xiaohongshu Inc, present a new framework, Phys-Instruct, that significantly accelerates PDE solving with diffusion-based models. The work addresses two limitations of existing methods, slow sampling and inconsistency with the underlying physical laws, by compressing the solving process into a few steps and explicitly incorporating PDE constraints. Demonstrating substantial improvements across five benchmark problems, Phys-Instruct achieves dramatically faster inference and reduces PDE error by more than 8× compared to current state-of-the-art techniques, offering a pathway towards efficient and physically plausible solutions for a wide range of applications.
Accelerating partial differential equation solving via diffusion model distillation and physics guidance enables efficient and accurate simulations
Diffusion-based models are rapidly advancing the field of partial differential equation (PDE) solving, demonstrating both accuracy and broad applicability. However, two significant hurdles remain: the substantial computational cost of their iterative sampling process and a frequent lack of inherent physical consistency.
To overcome these limitations, researchers have developed Phys-Instruct, a novel physics-guided framework designed to dramatically accelerate PDE solving while simultaneously enhancing solution fidelity. This work introduces a distillation process that compresses a pre-trained diffusion PDE solver into a generator requiring only a few sampling steps, enabling ultra-fast inference.
Phys-Instruct achieves this compression by matching the distributions of the generator and the original diffusion model, facilitating rapid sampling without sacrificing accuracy. Crucially, the framework also explicitly incorporates PDE knowledge through a novel PDE distillation guidance mechanism, ensuring solutions adhere to fundamental physical principles.
Built upon a robust theoretical foundation, Phys-Instruct introduces a practical, physics-constrained training objective with easily calculated gradients. Evaluations across five distinct PDE benchmarks reveal that Phys-Instruct achieves inference speeds orders of magnitude faster than state-of-the-art diffusion baselines, while simultaneously reducing PDE error by more than 8×.
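To make the physics-constrained objective more concrete, the sketch below adds a squared PDE-residual penalty to a generic distillation loss, using the 1D heat equation as a stand-in. The residual computed via autograd, the names `heat_residual` and `physics_constrained_loss`, and the weighting `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import torch

def heat_residual(u, x, t, alpha=0.1):
    """Residual of the 1D heat equation u_t - alpha * u_xx = 0.

    u must be computed from x and t with autograd enabled, e.g. the output
    of a differentiable generator evaluated at coordinates (x, t).
    """
    ones = torch.ones_like(u)
    u_t = torch.autograd.grad(u, t, grad_outputs=ones, create_graph=True)[0]
    u_x = torch.autograd.grad(u, x, grad_outputs=ones, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, grad_outputs=torch.ones_like(u_x),
                               create_graph=True)[0]
    return u_t - alpha * u_xx

def physics_constrained_loss(distill_loss, u, x, t, lam=1.0):
    """Distillation loss plus a squared PDE-residual penalty (illustrative weighting)."""
    residual = heat_residual(u, x, t)
    return distill_loss + lam * residual.pow(2).mean()
```

Because the residual is differentiable, its gradient with respect to the generator's parameters comes for free from autograd, which is one way a physics term can yield the "easily calculated gradients" the authors describe.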
Beyond speed and accuracy, the resulting unconditional student model functions as a compact prior, offering a versatile tool for efficient and physically consistent inference in a variety of downstream conditional tasks. This innovation unlocks the potential for real-time PDE solving and opens new avenues for applying deep generative models to complex scientific simulations. The research demonstrates that Phys-Instruct is a novel, effective, and efficient framework for ultra-fast PDE solving, representing a significant step towards practical diffusion-based PDE solutions.
Distilling physics-informed diffusion models for efficient partial differential equation solving requires careful consideration of model size and training data
Diffusion-based models are employed to solve partial differential equations, and this work introduces Phys-Instruct, a framework designed to improve both sampling efficiency and physical consistency. The research begins by compressing a pre-trained diffusion PDE solver into a few-step generator, matching the generator's distribution to that of the pre-trained diffusion prior to enable rapid sampling.
Simultaneously, physics consistency is enhanced by explicitly injecting the PDE through a PDE guidance mechanism during the distillation process. Phys-Instruct is grounded in a theoretical foundation that yields a practical, physics-constrained training objective with tractable gradients. The methodology centres on distilling a multi-step teacher model into a student generator capable of producing high-quality solutions with fewer function evaluations.
This distillation process aligns the time-indexed marginals of the teacher and student models along the diffusion process, encouraging the student to replicate the teacher’s generative behaviour over time. Specifically, the student model learns to approximate the teacher’s score, ∇_{x_t} log p_t(x_t), the gradient of the log of the time-dependent data density at each diffusion step.
The forward diffusion process is defined by the Itô stochastic differential equation dx_t = F(x_t, t) dt + G(t) dw_t, where F and G represent the drift and diffusion coefficients and w_t is a Wiener process. Training is performed by minimizing a weighted denoising score matching objective, L_DSM(φ) = ∫_0^T w(t) E_{x_0, x_t|x_0}[g_1] dt, where w(t) balances contributions from different noise levels and g_1 represents the squared difference between the student’s score estimate and the teacher’s score.
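A minimal PyTorch sketch of such a weighted score-matching objective is given below, assuming a simple variance-preserving forward process and a frozen teacher. The noise schedule, the weighting w(t), and the `student_score`/`teacher_score` callables are assumptions for illustration, not the paper's exact choices for F, G, or w.

```python
import torch

def score_distillation_loss(student_score, teacher_score, x0, T=1.0):
    """Weighted score matching between a few-step student and a frozen teacher.

    x0: batch of clean PDE solutions with shape (B, ...). The forward process
    below is a simple variance-preserving choice; the paper's drift F and
    diffusion G may differ.
    """
    B = x0.shape[0]
    t = torch.rand(B, device=x0.device) * T                 # sample diffusion times t in (0, T)
    sigma = t.view(B, *([1] * (x0.dim() - 1)))              # illustrative noise level sigma(t) = t
    noise = torch.randn_like(x0)
    xt = torch.sqrt(1.0 - sigma ** 2) * x0 + sigma * noise  # draw x_t from p_t(x_t | x_0)

    w = 1.0 / (t ** 2 + 1e-4)                               # weighting w(t), a common heuristic
    with torch.no_grad():
        target = teacher_score(xt, t)                       # frozen teacher score estimate
    diff = (student_score(xt, t) - target).flatten(1).pow(2).sum(dim=1)
    return (w * diff).mean()                                # Monte Carlo estimate of the time integral
```

Sampling t uniformly and averaging turns the integral over the diffusion horizon into a stochastic estimate whose gradients flow only into the student.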
This approach yields a compact prior model that supports efficient and physically consistent inference for both unconditional and conditional downstream tasks. Across five PDE benchmarks, Phys-Instruct achieves orders-of-magnitude faster inference and reduces PDE error by more than 8× compared to existing diffusion baselines.
Physics-guided distillation accelerates PDE solving and enhances unconditional generation of physically plausible solutions
Across five PDE benchmarks, the Phys-Instruct framework reduces PDE error by more than 8× compared to current state-of-the-art diffusion baselines, while inference is orders of magnitude faster. This work introduces a physics-guided distillation framework that compresses a multi-step diffusion teacher PDE solver into a generator requiring only a few steps.
The resulting student model operates with a fixed, few-step procedure at inference time, eliminating the need for PDE-based guidance, correction, or refinement during sampling. The research focuses on both sampling efficiency and physical consistency within diffusion models used for solving partial differential equations.
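As a rough illustration, a fixed few-step sampling loop of this kind might look as follows; the `student` interface, the number of steps, and the time grid are hypothetical rather than the paper's sampler.

```python
import torch

@torch.no_grad()
def few_step_sample(student, shape, steps=4, device="cpu"):
    """Draw PDE solutions from a distilled few-step generator.

    No PDE-based guidance, correction, or refinement is applied here;
    the student runs a fixed number of refinement steps.
    """
    x = torch.randn(shape, device=device)                      # start from pure noise
    times = torch.linspace(1.0, 0.0, steps + 1, device=device)
    for i in range(steps):
        t = times[i].expand(shape[0])                          # current time, broadcast over the batch
        x = student(x, t)                                      # one deterministic refinement step
    return x
```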
Experiments demonstrate that incorporating physics guidance during distillation improves the quality of unconditional generation, establishing a favourable balance between accuracy and latency. Specifically, the unconditional student model functions as a compact prior, enabling efficient and physically consistent inference for a variety of downstream conditional tasks.
Evaluations were conducted on a broad range of PDE benchmarks with varied formulations, confirming the effectiveness of the approach. The framework’s ability to distill a multi-step diffusion teacher into a few-step generator is a key achievement, and the distilled generator serves as a reusable diffusion prior for conditional solvers, facilitating non-iterative sampling across diverse PDE observation scenarios. The study highlights a three-fold contribution: the Phys-Instruct framework itself, the demonstration that PDE error constraints improve distillation, and the establishment of the distilled generator as a versatile diffusion prior.
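One generic way to picture reusing a frozen few-step generator as a prior is latent optimisation against sparse observations, sketched below. This conditioning scheme, including `condition_on_observations` and the observation mask, is only an illustration of prior reuse; the paper's own conditional mechanism supports non-iterative sampling and is not specified here.

```python
import torch

def condition_on_observations(student, obs_values, obs_mask, shape,
                              steps=4, iters=50, lr=0.05, device="cpu"):
    """Fit the generator's input noise so its output matches sparse observations.

    obs_values / obs_mask: observed entries of the solution field and a 0/1
    mask of the same shape. The few-step student acts as a fixed prior; only
    the latent noise z is updated by the optimiser.
    """
    z = torch.randn(shape, device=device, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    times = torch.linspace(1.0, 0.0, steps + 1, device=device)

    def generate(latent):
        x = latent
        for i in range(steps):
            t = times[i].expand(shape[0])
            x = student(x, t)                                  # few-step generator as prior
        return x

    for _ in range(iters):
        loss = ((generate(z) - obs_values) * obs_mask).pow(2).mean()  # fit observed entries only
        opt.zero_grad()
        loss.backward()
        opt.step()

    with torch.no_grad():
        return generate(z)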
Diffusion compression via physics-informed generative modelling delivers rapid and accurate PDE solutions for complex systems
Phys-Instruct, a novel physics-guided framework, efficiently solves partial differential equations by compressing a diffusion-based solver into a few-step generator. This compression is achieved by matching the generator's distribution to that of the diffusion prior, enabling rapid sampling while maintaining accuracy.
Furthermore, the framework enhances physical consistency by explicitly incorporating the partial differential equation itself as a guiding mechanism during distillation rather than during sampling. Across five benchmark problems, Phys-Instruct demonstrates significantly faster inference and reduces errors in solving the equations by more than eightfold compared to existing diffusion methods.
The resulting model also functions effectively as a compact prior, facilitating efficient and physically plausible solutions for various conditional tasks. The authors acknowledge that the framework currently focuses on specific types of equations and does not address resolution robustness for practical applications.
Future research will explore combining Phys-Instruct with more advanced distillation techniques to improve knowledge transfer and sample diversity. Lightweight adaptation mechanisms could also be implemented for conditional problems, and extending the framework to higher-dimensional equations represents a crucial step towards wider scientific use.
These results establish a foundation for building efficient, physics-aware generative models for solving partial differential equations. The framework’s ability to achieve fast inference without relying on post-hoc physics corrections is particularly noteworthy, offering a pathway towards practical applications in scientific computing and engineering.
👉 More information
🗞 Ultra Fast PDE Solving via Physics Guided Few-step Diffusion
🧠 ArXiv: https://arxiv.org/abs/2602.03627
