Enhancing Symbolic Regression and Universal Physics-Informed Neural Networks with Dimensional Analysis
Abstract
We present a new method for enhancing symbolic regression for differential equations via dimensional analysis, specifically Ipsen's method and the Buckingham pi theorem. Symbolic regression often suffers from high computational costs and overfitting; non-dimensionalizing datasets mitigates this by reducing the number of input variables, simplifying the search space, and ensuring that derived equations are physically meaningful. As our main contribution, we integrate Ipsen's method of dimensional analysis with Universal Physics-Informed Neural Networks (UPINNs). We also combine dimensional analysis with the AI Feynman symbolic regression algorithm and show that it substantially improves the accuracy of the recovered equation. The results demonstrate that transforming data into a dimensionless form significantly decreases computation time and improves the accuracy of the recovered hidden term. For algebraic equations, applying the Buckingham pi theorem reduced complexity, allowing the AI Feynman model to converge faster with fewer data points and lower error rates. For differential equations, Ipsen's method was combined with UPINNs to identify hidden terms more effectively. These findings suggest that integrating dimensional analysis with symbolic regression can lower computational costs, enhance model interpretability, and increase accuracy, providing a robust framework for the automated discovery of governing equations in complex systems when data are limited.
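To illustrate the kind of reduction the abstract refers to, the sketch below shows a standard Buckingham pi computation: the dimensionless groups are obtained from the null space of the dimensional matrix. The pendulum example, variable names, and use of sympy are illustrative assumptions, not the paper's own pipeline (which couples dimensional analysis with AI Feynman and UPINNs).

```python
# Minimal sketch of Buckingham pi non-dimensionalization (assumed example:
# pendulum period T, length L, gravitational acceleration g, mass m).
import sympy as sp

variables = sp.symbols("T L g m", positive=True)

# Dimensional matrix: rows = base dimensions (mass, length, time),
# columns = variables; entry (i, j) is the exponent of dimension i in variable j.
D = sp.Matrix([
    [0, 0, 0, 1],   # mass:   only m carries mass
    [0, 1, 1, 0],   # length: L and g
    [1, 0, -2, 0],  # time:   T and g (g ~ L T^-2)
])

# Buckingham pi: number of independent dimensionless groups =
# (number of variables) - rank(D); each null-space vector of D
# gives the exponents of one pi group.
pi_groups = []
for vec in D.nullspace():
    group = sp.prod(v**e for v, e in zip(variables, vec))
    pi_groups.append(sp.simplify(group))

print(pi_groups)  # expected: [T**2*g/L] -- four variables collapse to one group
```

Here four dimensional inputs reduce to a single dimensionless group (mass drops out entirely), which is exactly the kind of search-space reduction that lets a symbolic regression algorithm converge with fewer data points.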
- Publication: arXiv e-prints
- Pub Date: November 2024
- DOI: 10.48550/arXiv.2411.15919
- arXiv: arXiv:2411.15919
- Bibcode: 2024arXiv241115919P
- Keywords: Computer Science - Machine Learning