Abstract
Monitoring key parameters in large-scale complex systems such as liquid propellant rocket engines is challenging because physical sensors often fail under extreme conditions. To address this issue, this work proposes Multi-TSformer, a Transformer-based operator-learning framework for constructing virtual sensors that accurately estimate unmeasurable parameters by capturing nonlinear spatial-temporal dependencies in multivariate time series. The method decomposes input sequences into temporal groups, forms spatial-temporal tokens, and leverages attention mechanisms to learn global dependencies, enabling an efficient mapping from multivariate inputs to outputs while preserving dynamic patterns. Multi-TSformer was validated on rocket-engine simulation datasets and fine-tuned with limited real engine test data. Results show that the proposed approach reduces mean absolute error by up to 60% compared with baseline neural-operator methods, effectively enhancing monitoring accuracy. Furthermore, the model is feasible in real time, with inference requiring only 0.16 s per second of data on an RTX 3090 GPU. The method also generalizes to benchmark PDE problems such as the Darcy equation, outperforming several state-of-the-art Transformer solvers. These findings confirm the potential of operator learning for virtual-sensor construction in aerospace systems; broader applicability to other engines and complex industrial systems remains a promising direction for future work.
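The pipeline described in the abstract (temporal grouping, spatial-temporal tokenization, attention over tokens, projection to the target signal) can be illustrated with a minimal NumPy sketch. This is not the authors' Multi-TSformer implementation: the function name, group length, embedding width, and the random placeholder weights are all illustrative assumptions, standing in for learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def virtual_sensor_sketch(series, group_len=8, d_model=16, seed=0):
    """Map a multivariate input series (T, C) to a 1-D estimated target series.

    Steps mirror the abstract's description:
      (1) decompose the sequence into temporal groups,
      (2) flatten each group into a spatial-temporal token,
      (3) apply self-attention to capture global dependencies,
      (4) decode each token back to per-time-step estimates.
    All weight matrices are random placeholders for learned parameters.
    """
    rng = np.random.default_rng(seed)
    T, C = series.shape
    n_groups = T // group_len
    # (1)-(2) temporal grouping: one token per group of consecutive time steps
    tokens = series[: n_groups * group_len].reshape(n_groups, group_len * C)
    W_embed = rng.normal(size=(group_len * C, d_model)) / np.sqrt(group_len * C)
    X = tokens @ W_embed                          # (n_groups, d_model)
    # (3) single-head scaled dot-product self-attention over the tokens
    Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)
                  for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    H = softmax(Q @ K.T / np.sqrt(d_model)) @ V   # (n_groups, d_model)
    # (4) project each token to group_len output samples and flatten
    W_out = rng.normal(size=(d_model, group_len)) / np.sqrt(d_model)
    return (H @ W_out).reshape(-1)                # (n_groups * group_len,)

# Usage: 64 time steps of a 4-channel input -> 64 estimated target samples
x = np.random.default_rng(1).normal(size=(64, 4))
y = virtual_sensor_sketch(x)
print(y.shape)  # (64,)
```

A trained model would learn the embedding, attention, and output projections from simulation data and fine-tune them on real engine tests; the sketch only shows how grouping and attention shape the data flow.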
| Original language | English |
|---|---|
| Article number | 09544100251381406 |
| Journal | Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering |
| DOIs | |
| State | Accepted/In press - 2025 |
Keywords
- LPRE
- operator learning
- surrogate model
- transformer
- virtual sensors