Generalizability of local neural operator: Example for elastodynamic problems

  • Hongyu Li
  • Ximeng Ye
  • Lei He
  • Weiqi Qian
  • Peng Jiang
  • Tiejun Wang
Research output: Contribution to journal › Article › peer-review

Abstract

The local neural operator (LNO) concept provides a feasible approach to scientific computing. An LNO learns transient partial differential equations from random field samples, and the pre-trained LNO then solves practical problems on specific computational domains. For applications, two questions arise: Are the training samples rich enough? To what extent can we trust the solutions obtained from a pre-trained LNO model on unseen cases? The generalizability of the LNO answers these questions. Here, we propose two plain scalar features, the amplitude and wavenumber of the input functions, to indicate the richness of the training samples and to evaluate the generalization error of a pre-trained LNO. In elastodynamic practice, we find that isolated evolving wavenumber modes of the Lamé–Navier equation cause the training dataset to lack mode diversity. Through data supplementation and model fine-tuning targeting the discovered missing modes, the pre-trained and fine-tuned LNO model solves the Lamb problem correctly and efficiently. These results and the proposed generalization criteria provide a paradigm for LNO applications.
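To make the two scalar features concrete: the abstract does not spell out how amplitude and wavenumber are computed, so the sketch below is one plausible choice, taking the amplitude as the peak magnitude of a sampled 1D input field and the wavenumber as the dominant Fourier mode. The function name `field_features` and these exact definitions are assumptions for illustration, not the paper's method.

```python
import numpy as np

def field_features(u, dx=1.0):
    """Two scalar features of a sampled 1D input field (assumed definitions):
    amplitude  -- peak absolute value of the field,
    wavenumber -- angular wavenumber of the largest Fourier mode."""
    amplitude = np.max(np.abs(u))
    # FFT of the zero-mean field; the strongest mode gives the dominant frequency.
    spectrum = np.abs(np.fft.rfft(u - np.mean(u)))
    freqs = np.fft.rfftfreq(len(u), d=dx)
    k_dominant = 2.0 * np.pi * freqs[np.argmax(spectrum)]
    return amplitude, k_dominant

# Example: u(x) = 0.5 * sin(3x) on [0, 2*pi) should yield roughly (0.5, 3.0).
x = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
amp, k = field_features(0.5 * np.sin(3.0 * x), dx=x[1] - x[0])
```

Under this reading, a training set is "rich" for a target problem if the (amplitude, wavenumber) pairs of the problem's inputs fall inside the range spanned by the training samples' pairs.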

Original language: English
Article number: 118151
Journal: Computer Methods in Applied Mechanics and Engineering
Volume: 445
State: Published - 1 Oct 2025

Keywords

  • Deep learning
  • Elastodynamics
  • Generalizability
  • Local neural operator (LNO)

