This page was last updated on 2024-07-17 08:45:16 UTC

Recommendations for the article Ensemble-SINDy: Robust sparse model discovery in the low-data, high-noise limit, with active learning and control

$\dot{\boldsymbol{x}} = \boldsymbol{f}(\boldsymbol{x})$. First, we propose, for use in high-noise settings, an extensive toolkit of critically enabling extensions for the SINDy regression method, which progressively cull functionals from an over-complete library and yield a set of sparse equations that regress to the derivative $\dot{\boldsymbol{x}}$. This toolkit includes: (regression step) weighting timepoints based on estimated noise, using ensembles to estimate coefficients, and regressing using FFTs; (culling step) leveraging linear dependence among functionals, and restoring and protecting culled functionals based on Figures of Merit (FoMs). In a novel assessment step, we define FoMs that compare model predictions to the original time-series (i.e., $\boldsymbol{x}(t)$ rather than $\dot{\boldsymbol{x}}(t)$). These innovations can extract sparse governing equations and coefficients from high-noise time-series data (e.g., 300% added noise). For example, the toolkit discovers the correct sparse libraries in the Lorenz system, with median coefficient estimate errors of 1%–3% (for 50% noise), 6%–8% (for 100% noise), and 23%–25% (for 300% noise). The enabling modules in the toolkit are combined into a single method, but the individual modules can be tactically applied in other equation discovery methods (SINDy or not) to improve results on high-noise data.

Second, we propose a technique, applicable to any model discovery method based on $\dot{\boldsymbol{x}} = \boldsymbol{f}(\boldsymbol{x})$, to assess the accuracy of a discovered model in the context of non-unique solutions due to noisy data. Currently, this non-uniqueness can obscure a discovered model's accuracy and thus a discovery method's effectiveness. We describe a technique that uses linear dependencies among functionals to transform a discovered model into an equivalent form that is closest to the true model, enabling more accurate assessment of a discovered model's correctness.
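The core ideas the abstract builds on, namely sparse regression of $\dot{\boldsymbol{x}}$ onto an over-complete polynomial library with sequential thresholding, plus ensembled (bootstrap-aggregated) coefficient estimates, can be illustrated with a minimal sketch. The script below is an illustration of those generic ingredients, not the authors' toolkit: the library choice, threshold, ensemble size, and all function names are assumptions, and derivatives are taken noiselessly for clarity.

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz system."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

# Simulate a trajectory with a hand-rolled RK4 integrator.
dt, n = 0.002, 5000
X = np.empty((n, 3))
X[0] = [-8.0, 8.0, 27.0]
for i in range(n - 1):
    k1 = lorenz(X[i])
    k2 = lorenz(X[i] + 0.5 * dt * k1)
    k3 = lorenz(X[i] + 0.5 * dt * k2)
    k4 = lorenz(X[i] + dt * k3)
    X[i + 1] = X[i] + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Exact derivatives; a real pipeline must estimate these from noisy data.
dX = np.array([lorenz(s) for s in X])

# Over-complete library of monomials up to degree 2:
# [1, x, y, z, x^2, xy, xz, y^2, yz, z^2]
x, y, z = X.T
Theta = np.column_stack(
    [np.ones(n), x, y, z, x * x, x * y, x * z, y * y, y * z, z * z]
)

def stlsq(theta, targets, thresh=0.1, iters=10):
    """Sequentially thresholded least squares: zero out small
    coefficients, then re-fit on the surviving library terms."""
    Xi = np.linalg.lstsq(theta, targets, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(Xi) < thresh
        Xi[small] = 0.0
        for k in range(targets.shape[1]):
            keep = ~small[:, k]
            if keep.any():
                Xi[keep, k] = np.linalg.lstsq(
                    theta[:, keep], targets[:, k], rcond=None
                )[0]
    return Xi

# Ensemble: fit on bootstrap resamples of the timepoints, then take
# the elementwise median of the coefficient matrices.
rng = np.random.default_rng(0)
ensemble = []
for _ in range(20):
    idx = rng.integers(0, n, size=n)
    ensemble.append(stlsq(Theta[idx], dX[idx]))
Xi = np.median(ensemble, axis=0)
print(Xi.round(2))  # rows = library terms, columns = (dx, dy, dz)
```

With clean derivatives the recovered matrix is sparse and matches the Lorenz coefficients (e.g., $-10$ and $10$ on the $x$ and $y$ terms of $\dot{x}$); the value of noise-weighting, culling, and FoM-based protection described in the abstract only becomes apparent once the derivatives are corrupted by noise.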
| Title | Authors | Publication Date | Journal/Conference | Citation count | Highest h-index |
| --- | --- | --- | --- | --- | --- |
| Sparsifying priors for Bayesian uncertainty quantification in model discovery | Seth M. Hirsh, D. Barajas-Solano, J. Kutz | 2021-07-05 | Royal Society Open Science | 51 | 31 |
| Convergence of uncertainty estimates in Ensemble and Bayesian sparse model discovery | Liyao (Mars) Gao, Urban Fasel, S. Brunton, J. Kutz | 2023-01-30 | ArXiv | 11 | 63 |
| Automatically discovering ordinary differential equations from data with sparse regression | Kevin Egan, Weizhen Li, Rui Carvalho | 2024-01-09 | Communications Physics | 7 | 1 |
| SINDy-PI: a robust algorithm for parallel implicit sparse identification of nonlinear dynamics | Kadierdan Kaheman, J. Kutz, S. Brunton | 2020-04-05 | Proceedings. Mathematical, Physical, and Engineering Sciences | 184 | 63 |
| Rapid Bayesian identification of sparse nonlinear dynamics from scarce and noisy data | Lloyd Fung, Urban Fasel, M. Juniper | 2024-02-23 | ArXiv | 0 | 37 |
| Discovering governing equations from data by sparse identification of nonlinear dynamical systems | S. Brunton, J. Proctor, J. Kutz | 2015-09-11 | Proceedings of the National Academy of Sciences | 3114 | 63 |
| Sparse identification of nonlinear dynamics in the presence of library and system uncertainty | Andrew O'Brien | 2024-01-23 | ArXiv | 0 | 0 |
| A Toolkit for Data-Driven Discovery of Governing Equations in High-Noise Regimes | Charles B. Delahunt, J. Kutz | 2021-11-08 | IEEE Access | 15 | 31 |