EddyFormer: Accelerated Neural Simulations of Three-Dimensional Turbulence at Scale

Yiheng Du (UC Berkeley) and Aditi S. Krishnapriyan (UC Berkeley, LBNL)

Simulating turbulence is a central computational challenge in fluid dynamics. Traditional methods like Direct Numerical Simulation (DNS) are highly accurate but prohibitively expensive, limiting their use in large-scale or real-time applications.

We develop EddyFormer, a neural network architecture that accelerates turbulence simulation by up to 30x while maintaining the accuracy of high-resolution DNS. EddyFormer combines the high accuracy of spectral methods with the scalability of the Transformer architecture.

Our model learns to reproduce complex turbulent dynamics across a wide range of physical conditions, even generalizing to domains four times larger than what it was trained on. This work opens up new possibilities for using machine learning in critical scientific and engineering applications, from weather forecasting to aerospace design.

Architecture

EddyFormer's architecture is designed to efficiently capture the complex, multi-scale nature of turbulence. It integrates the high accuracy of spectral methods with the scalability and long-range dependency modeling of the Transformer architecture.

A core innovation is our Spectral Element Method (SEM) tokenization. Instead of treating the simulation domain as a uniform grid of pixels, we partition it into a smaller number of coarse elements. Each element, or 'token', contains a rich, high-order polynomial representation of the flow within it. Because attention operates over this small set of element tokens rather than over every grid point, the model captures long-range interactions between distant parts of the flow at a fraction of the cost a standard Vision Transformer would incur at the same resolution.
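To make the tokenization concrete, here is a minimal NumPy sketch of the partitioning step. It splits a 3D velocity field into coarse elements and flattens each element's local degrees of freedom into one token vector. The function name `sem_tokenize` and the use of raw nodal values (rather than the high-order polynomial coefficients the actual model uses) are illustrative simplifications, not the paper's implementation.

```python
import numpy as np

def sem_tokenize(u, elements=4):
    """Partition a 3D velocity field into coarse spectral-element tokens.

    u: velocity field of shape (3, N, N, N); N must be divisible by `elements`.
    Returns tokens of shape (elements**3, 3 * p**3) with p = N // elements:
    one token per element, holding that element's local degrees of freedom.
    (Illustrative sketch: the real model stores a polynomial basis per element.)
    """
    c, N = u.shape[0], u.shape[1]
    p = N // elements
    # Split each spatial axis into (element index, local index) pairs.
    x = u.reshape(c, elements, p, elements, p, elements, p)
    # Group the three element indices together, then the local indices.
    x = x.transpose(1, 3, 5, 0, 2, 4, 6)        # (E, E, E, c, p, p, p)
    return x.reshape(elements**3, c * p**3)     # one flat token per element

u = np.random.default_rng(0).standard_normal((3, 64, 64, 64))
tokens = sem_tokenize(u, elements=4)
print(tokens.shape)  # (64, 12288)
```

A 64³ grid thus becomes just 64 tokens, which is why attention over tokens stays cheap even at resolutions where pixel-level attention would be intractable.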

The architecture further employs a separation of scales, inspired by classical Large Eddy Simulation methods. The model processes the flow through two specialized streams: a Large Eddy Scale (LES) stream that uses attention on the SEM tokens to model global, coherent structures, and a Sub-Grid Scale (SGS) stream that uses localized spectral convolutions to capture the fine-scale, local eddy dynamics. This two-stream design allows each component to specialize, leading to a more accurate and efficient simulation.
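The two-stream split above can be sketched in a few lines of NumPy. This toy version models the LES stream as single-head self-attention over the element tokens and the SGS stream as a per-token low-pass Fourier multiplier; the function names, shapes, and the simple sum used to combine the streams are assumptions for illustration, not the paper's architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def les_stream(tokens, Wq, Wk, Wv):
    """Self-attention over SEM tokens: global, long-range eddy interactions."""
    q, k, v = tokens @ Wq, tokens @ Wk, tokens @ Wv
    scores = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return scores @ v

def sgs_stream(tokens, n_modes_keep):
    """Per-token spectral convolution: a low-pass Fourier multiplier applied
    independently to each element's local features (toy sub-grid model)."""
    f = np.fft.rfft(tokens, axis=-1)
    f[:, n_modes_keep:] = 0.0                  # keep only the lowest modes
    return np.fft.irfft(f, n=tokens.shape[-1], axis=-1)

rng = np.random.default_rng(0)
T, d = 64, 96                                  # 64 element tokens, 96 features
tokens = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))

out = les_stream(tokens, Wq, Wk, Wv) + sgs_stream(tokens, n_modes_keep=16)
print(out.shape)  # (64, 96)
```

The key design point survives even in this sketch: global attention only ever sees the coarse tokens, while the fine-scale path works locally within each token and never pays for cross-element communication.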

Results on Isotropic Turbulence

We validate EddyFormer on three-dimensional homogeneous isotropic turbulence (Re ≈ 94), a key benchmark for capturing the core physics of the turbulent energy cascade. The Q-criterion visualization below, which highlights vortex structures, shows that EddyFormer accurately captures the vortical features of the flow, while the baseline F-FNO model fails to resolve these fine structures.

The model also preserves key physical invariants, such as the energy spectrum and structure functions, over long simulation rollouts. These results indicate that EddyFormer can achieve a level of accuracy comparable to a 256³ DNS. For a detailed analysis of all error metrics and physical statistics, please refer to the full paper.
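The energy spectrum mentioned above is a standard turbulence diagnostic, and a minimal version is easy to compute from a periodic velocity field. The sketch below radially bins per-mode kinetic energy into wavenumber shells; it is a generic diagnostic written for this post, not code from the paper.

```python
import numpy as np

def energy_spectrum(u):
    """Shell-averaged kinetic-energy spectrum E(k) of a periodic velocity
    field u with shape (3, N, N, N). Generic diagnostic, illustrative only."""
    N = u.shape[-1]
    uh = np.fft.fftn(u, axes=(1, 2, 3)) / N**3
    e = 0.5 * np.sum(np.abs(uh) ** 2, axis=0)   # energy density per mode
    k = np.fft.fftfreq(N) * N                   # integer wavenumbers
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    kmag = np.sqrt(kx**2 + ky**2 + kz**2)
    edges = np.arange(0.5, N // 2 + 1)          # shell edges 0.5, 1.5, ...
    shell = np.digitize(kmag.ravel(), edges)
    E = np.bincount(shell, weights=e.ravel(), minlength=len(edges) + 1)
    return E[1:len(edges)]                      # shells k = 1 .. N // 2

u = np.random.default_rng(1).standard_normal((3, 32, 32, 32))
E = energy_spectrum(u)
print(E.shape)  # (16,)
```

Comparing such a spectrum between a learned surrogate's rollout and the reference DNS, shell by shell, is one of the checks behind the accuracy claim above.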

Domain Generalization

A key feature of EddyFormer is its ability to generalize to larger domains than it was trained on. We trained the model on a 2D Kolmogorov flow and tested it on domains up to 4x larger. The results show that EddyFormer maintains its accuracy and correctly captures the energy spectrum, unlike baseline models which fail to generalize.

BibTeX

@inproceedings{du2025eddyformer,
  title     = {EddyFormer: Accelerated Neural Simulations of Three-Dimensional Turbulence at Scale},
  author    = {Yiheng Du and Aditi S. Krishnapriyan},
  booktitle = {The Thirty-Ninth Annual Conference on Neural Information Processing Systems},
  year      = {2025},
  url       = {https://arxiv.org/abs/2510.24173}
}