The Physics-Informed Diffusion Model (PIDM) is a surrogate model that uses diffusion models to generate numerical simulation results under physical-law constraints. When abundant numerical simulation data are available, PIDM shows promise for accurately representing complex phenomena such as eddies. Video generation with diffusion models, however, faces several challenges, including (a) substantial memory requirements that demand high-performance computing resources and (b) the need for specialized training mechanisms, such as guidance, to produce results that align with observations.
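As an illustration of what a physical-law constraint can look like in practice, the sketch below penalizes the divergence of a 2-D velocity field on a regular grid, an incompressibility-style residual. The function name, tensor layout, and finite-difference scheme are illustrative assumptions only, not the specific loss or mesh discretization used by PIDM or DREAMS.

```python
import torch

def divergence_penalty(uv, dx, dy):
    """Squared divergence of a 2-D velocity field as a physical-law penalty.

    Approximates the incompressibility condition du/dx + dv/dy = 0 with
    central finite differences on a regular grid.
    uv: tensor of shape (batch, 2, H, W) holding the u and v components.
    """
    u, v = uv[:, 0], uv[:, 1]
    du_dx = (u[:, :, 2:] - u[:, :, :-2]) / (2.0 * dx)   # d/dx along the W axis
    dv_dy = (v[:, 2:, :] - v[:, :-2, :]) / (2.0 * dy)   # d/dy along the H axis
    div = du_dx[:, 1:-1, :] + dv_dy[:, :, 1:-1]         # shared interior points
    return (div ** 2).mean()
```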
In this study, we propose "Physics-Informed Repaint," a novel method that learns and generates videos aligned with observed data without relying on any pre-trained foundation model. This is achieved by (i) integrating "Repaint," a technique that enables generation consistent with observations without guidance-based training, into a diffusion model, and (ii) developing a method that imposes the Physics-Informed loss constraint only during the generation phase. Notably, our method requires only the training of an unconditional diffusion model, provided that the computational results are available.
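To make the procedure concrete, the following sketch shows one way a RePaint-style reverse diffusion loop can merge observed and generated regions while applying a physics-informed gradient correction only at generation time. The function signatures, noise-schedule handling, and the guidance_scale parameter are assumptions for illustration, not the exact implementation of the proposed method.

```python
import torch

def physics_informed_repaint(model, x_obs, mask, betas, physics_loss, guidance_scale=1.0):
    """RePaint-style sampling with a generation-time physics correction.

    model(x, t)  -> predicted noise from an unconditional diffusion model
    x_obs        -> observed values (valid only where mask == 1)
    mask         -> 1 where observations exist, 0 where data are missing
    betas        -> 1-D tensor holding the diffusion noise schedule
    physics_loss -> differentiable scalar residual of the governing equations
    """
    alphas = 1.0 - betas
    alphas_bar = torch.cumprod(alphas, dim=0)
    T = len(betas)

    x = torch.randn_like(x_obs)                      # start from pure noise
    for t in reversed(range(T)):
        a_t, ab_t = alphas[t], alphas_bar[t]

        # 1) Unconditional reverse step (DDPM posterior mean plus noise)
        with torch.no_grad():
            eps = model(x, t)
            mean = (x - (1.0 - a_t) / torch.sqrt(1.0 - ab_t) * eps) / torch.sqrt(a_t)
            noise = torch.randn_like(x) if t > 0 else torch.zeros_like(x)
            x_unknown = mean + torch.sqrt(betas[t]) * noise

        # 2) Physics-informed correction, applied only while sampling:
        #    move the sample down the gradient of the PDE residual
        x_unknown = x_unknown.detach().requires_grad_(True)
        grad = torch.autograd.grad(physics_loss(x_unknown), x_unknown)[0]
        x_unknown = (x_unknown - guidance_scale * grad).detach()

        # 3) RePaint merge: forward-diffuse the observation to level t-1
        #    and overwrite the observed region with it
        ab_prev = alphas_bar[t - 1] if t > 0 else torch.tensor(1.0)
        x_known = torch.sqrt(ab_prev) * x_obs + torch.sqrt(1.0 - ab_prev) * torch.randn_like(x_obs)

        x = mask * x_known + (1.0 - mask) * x_unknown
    return x
```

Because the physics term enters only through the sampling loop, the underlying diffusion model can remain unconditional, which matches the training requirement stated above.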
We validated the effectiveness of the proposed method by interpolating an 8-day gap in observations within the highly nonlinear region of the Kuroshio-Oyashio Current Extension, which is characterized by numerous eddies, using computational results from a hydrodynamic model (DREAMS). Demonstrating the feasibility of our approach, we performed current prediction on a consumer-grade machine with a single GPU board for a sequence of computational results with approximately 100,000 mesh elements. Furthermore, we analyzed the role of the Physics-Informed loss in our model.