Abstract
In recent years, neural operators have emerged as effective alternatives to traditional numerical solvers, valued for their computational efficiency, strong generalization, and high solution accuracy, and their design and application have attracted broad research interest. This paper provides a comprehensive summary and analysis of neural operators. We categorize them into three types based on architecture: deep operator networks (DeepONets), integral kernel operators, and transformer-based neural operators, and discuss the basic structure and properties of each type. We then survey the variants and extensions of these three types along three directions: (1) operator-basis-based neural operator variants; (2) physics-informed neural operator variants; and (3) applications of neural operator variants to complex systems. We further analyze the characteristics and performance of the different operator methods through numerical experiments. Building on these discussions and analyses, we offer perspectives on the challenges facing different neural operators and suggest potential enhancements, providing practical guidance for their application and development.
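Of the three architecture types named above, the DeepONet has the simplest structure: a branch net encodes the input function sampled at fixed sensor points, a trunk net encodes the query coordinate, and the operator output is their inner product. The following is a minimal illustrative sketch of that forward pass, not the paper's implementation; the layer sizes, sensor count, and untrained random weights are all assumptions for demonstration.

```python
import numpy as np

def mlp(x, weights, biases):
    """Simple MLP: tanh hidden activations, linear output layer."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.tanh(x @ W + b)
    return x @ weights[-1] + biases[-1]

def init_mlp(sizes, rng):
    """Randomly initialized (untrained) weights, for illustration only."""
    weights = [rng.standard_normal((a, b)) * 0.1
               for a, b in zip(sizes[:-1], sizes[1:])]
    biases = [np.zeros(b) for b in sizes[1:]]
    return weights, biases

rng = np.random.default_rng(0)
m, p = 50, 32                        # sensor count, latent basis dimension (assumed)
branch = init_mlp([m, 64, p], rng)   # encodes u(x_1), ..., u(x_m)
trunk = init_mlp([1, 64, p], rng)    # encodes a 1-D query location y

def deeponet(u_sensors, y):
    """G(u)(y) ~ <branch(u), trunk(y)>: the basic DeepONet forward pass."""
    b = mlp(u_sensors, *branch)        # shape (p,)
    t = mlp(np.atleast_2d(y), *trunk)  # shape (n_queries, p)
    return t @ b                       # shape (n_queries,)

u = np.sin(np.linspace(0, np.pi, m))            # an example input function
out = deeponet(u, np.array([[0.25], [0.5], [0.75]]))
print(out.shape)  # one operator output per query point
```

The inner product over the `p` latent components is what gives the trunk outputs their interpretation as learned basis functions, with the branch outputs as input-dependent coefficients.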
| Original language | English |
|---|---|
| Article number | 130518 |
| Journal | Neurocomputing |
| Volume | 648 |
| DOIs | |
| State | Published - 1 Oct 2025 |
Keywords
- AI for science
- Complex systems
- Neural operators
- Operator basis
- Partial differential equations
- Physical information
Fingerprint
Dive into the research topics of 'Architectures, variants, and performance of neural operators: A comparative review'. Together they form a unique fingerprint.