  • Anderson, P. W. More is different. Science 177, 393–396 (1972).

  • Thompson, J. M. T. & Stewart, H. B. Nonlinear Dynamics and Chaos (Wiley, 2002).

  • Hirsch, M. W., Smale, S. & Devaney, R. L. Differential Equations, Dynamical Systems, and an Introduction to Chaos (Academic, 2012).

  • Kutz, J. N., Brunton, S. L., Brunton, B. W. & Proctor, J. L. Dynamic Mode Decomposition: Data-Driven Modeling of Complex Systems (SIAM, 2016).

  • Evans, J. & Rzhetsky, A. Machine science. Science 329, 399–400 (2010).

  • Fortunato, S. et al. Science of science. Science 359, eaao0185 (2018).

  • Bongard, J. & Lipson, H. Automated reverse engineering of nonlinear dynamical systems. Proc. Natl Acad. Sci. USA 104, 9943–9948 (2007).

  • Schmidt, M. & Lipson, H. Distilling free-form natural laws from experimental data. Science 324, 81–85 (2009).

  • King, R. D., Muggleton, S. H., Srinivasan, A. & Sternberg, M. Structure–activity relationships derived by machine learning: the use of atoms and their bond connectivities to predict mutagenicity by inductive logic programming. Proc. Natl Acad. Sci. USA 93, 438–442 (1996).

  • Waltz, D. & Buchanan, B. G. Automating science. Science 324, 43–44 (2009).

  • King, R. D. et al. The robot scientist Adam. Computer 42, 46–54 (2009).

  • Langley, P. BACON: a production system that discovers empirical laws. In Proc. Fifth International Joint Conference on Artificial Intelligence Vol. 1, 344 (Morgan Kaufmann, 1977).

  • Langley, P. Rediscovering physics with BACON.3. In Proc. Sixth International Joint Conference on Artificial Intelligence Vol. 1, 505–507 (Morgan Kaufmann, 1979).

  • Crutchfield, J. P. & McNamara, B. Equations of motion from a data series. Complex Syst. 1, 417–452 (1987).

  • Kevrekidis, I. G. et al. Equation-free, coarse-grained multiscale computation: enabling microscopic simulators to perform system-level analysis. Commun. Math. Sci. 1, 715–762 (2003).

  • Yao, C. & Bollt, E. M. Modeling and nonlinear parameter estimation with Kronecker product representation for coupled oscillators and spatiotemporal systems. Physica D 227, 78–99 (2007).

  • Rowley, C. W., Mezić, I., Bagheri, S., Schlatter, P. & Henningson, D. S. Spectral analysis of nonlinear flows. J. Fluid Mech. 641, 115–127 (2009).

  • Schmidt, M. D. et al. Automated refinement and inference of analytical models for metabolic networks. Phys. Biol. 8, 055011 (2011).

  • Sugihara, G. et al. Detecting causality in complex ecosystems. Science 338, 496–500 (2012).

  • Ye, H. et al. Equation-free mechanistic ecosystem forecasting using empirical dynamic modeling. Proc. Natl Acad. Sci. USA 112, E1569–E1576 (2015).

  • Daniels, B. C. & Nemenman, I. Automated adaptive inference of phenomenological dynamical models. Nat. Commun. 6, 8133 (2015).

  • Daniels, B. C. & Nemenman, I. Efficient inference of parsimonious phenomenological models of cellular dynamics using S-systems and alternating regression. PLoS ONE 10, e0119821 (2015).

  • Benner, P., Gugercin, S. & Willcox, K. A survey of projection-based model reduction methods for parametric dynamical systems. SIAM Rev. 57, 483–531 (2015).

  • Brunton, S. L., Proctor, J. L. & Kutz, J. N. Discovering governing equations from data by sparse identification of nonlinear dynamical systems. Proc. Natl Acad. Sci. USA 113, 3932–3937 (2016).

  • Rudy, S. H., Brunton, S. L., Proctor, J. L. & Kutz, J. N. Data-driven discovery of partial differential equations. Sci. Adv. 3, e1602614 (2017).

  • Udrescu, S.-M. & Tegmark, M. AI Feynman: a physics-inspired method for symbolic regression. Sci. Adv. 6, eaay2631 (2020).

  • Mrowca, D. et al. Flexible neural representation for physics prediction. In Advances in Neural Information Processing Systems Vol. 31 (eds Bengio, S. et al.) (Curran Associates, 2018).

  • Champion, K., Lusch, B., Kutz, J. N. & Brunton, S. L. Data-driven discovery of coordinates and governing equations. Proc. Natl Acad. Sci. USA 116, 22445–22451 (2019).

  • Baldi, P. & Hornik, K. Neural networks and principal component analysis: learning from examples without local minima. Neural Netw. 2, 53–58 (1989).

  • Hinton, G. E. & Zemel, R. S. Autoencoders, minimum description length and Helmholtz free energy. Adv. Neural Inf. Process. Syst. 6, 3 (1994).

  • Masci, J., Meier, U., Cireşan, D. & Schmidhuber, J. Stacked convolutional auto-encoders for hierarchical feature extraction. In International Conference on Artificial Neural Networks 52–59 (Springer, 2011).

  • Bishop, C. M. Neural Networks for Pattern Recognition (Oxford Univ. Press, 1995).

  • Camastra, F. & Staiano, A. Intrinsic dimension estimation: advances and open problems. Inf. Sci. 328, 26–41 (2016).

  • Campadelli, P., Casiraghi, E., Ceruti, C. & Rozza, A. Intrinsic dimension estimation: relevant techniques and a benchmark framework. Math. Probl. Eng. 2015, 759567 (2015).

  • Levina, E. & Bickel, P. J. Maximum likelihood estimation of intrinsic dimension. In Proc. 17th International Conference on Neural Information Processing Systems 777–784 (MIT Press, 2005).

  • Rozza, A., Lombardi, G., Ceruti, C., Casiraghi, E. & Campadelli, P. Novel high intrinsic dimensionality estimators. Mach. Learn. 89, 37–65 (2012).

  • Ceruti, C. et al. DANCo: an intrinsic dimensionality estimator exploiting angle and norm concentration. Pattern Recognit. 47, 2569–2581 (2014).

  • Hein, M. & Audibert, J.-Y. Intrinsic dimensionality estimation of submanifolds in R^d. In Proc. 22nd International Conference on Machine Learning 289–296 (Association for Computing Machinery, 2005).

  • Grassberger, P. & Procaccia, I. In The Theory of Chaotic Attractors 170–189 (Springer, 2004).

  • Pukrittayakamee, A. et al. Simultaneous fitting of a potential-energy surface and its corresponding force fields using feedforward neural networks. J. Chem. Phys. 130, 134101 (2009).

  • Wu, J., Lim, J. J., Zhang, H., Tenenbaum, J. B. & Freeman, W. T. Physics 101: learning physical object properties from unlabeled videos. In Proc. British Machine Vision Conference (BMVC) (eds Wilson, R. C. et al.) 39.1–39.12 (BMVA Press, 2016).

  • Chmiela, S. et al. Machine learning of accurate energy-conserving molecular force fields. Sci. Adv. 3, e1603015 (2017).

  • Schütt, K. T., Arbabzadah, F., Chmiela, S., Müller, K. R. & Tkatchenko, A. Quantum-chemical insights from deep tensor neural networks. Nat. Commun. 8, 13890 (2017).

  • Smith, J. S., Isayev, O. & Roitberg, A. E. ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chem. Sci. 8, 3192–3203 (2017).

  • Lutter, M., Ritter, C. & Peters, J. Deep Lagrangian networks: using physics as model prior for deep learning. In International Conference on Learning Representations (2019).

  • Bondesan, R. & Lamacraft, A. Learning symmetries of classical integrable systems. Preprint at https://arxiv.org/abs/1906.04645 (2019).

  • Greydanus, S. J., Dzamba, M. & Yosinski, J. Hamiltonian neural networks. Preprint at https://arxiv.org/abs/1906.01563 (2019).

  • Swischuk, R., Kramer, B., Huang, C. & Willcox, K. Learning physics-based reduced-order models for a single-injector combustion process. AIAA J. 58, 2658–2672 (2020).

  • Lange, H., Brunton, S. L. & Kutz, J. N. From Fourier to Koopman: spectral methods for long-term time series prediction. J. Mach. Learn. Res. 22, 1–38 (2021).

  • Mallen, A., Lange, H. & Kutz, J. N. Deep probabilistic Koopman: long-term time-series forecasting under periodic uncertainties. Preprint at https://arxiv.org/abs/2106.06033 (2021).

  • Chen, B. et al. Dataset for the paper titled Discovering State Variables Hidden in Experimental Data (1.0). Zenodo https://doi.org/10.5281/zenodo.6653856 (2022).

  • Chen, B. et al. BoyuanChen/neural state variables: (v1.0). Zenodo https://doi.org/10.5281/zenodo.6629185 (2022).