Data-driven computational methods : parameter and operator estimations / John Harlim.
Modern scientific computational methods are undergoing a transformative change: big data and statistical learning methods now have the potential to outperform the classical first-principles modeling paradigm. This book bridges this transition, connecting the theory of probability, stochastic processes, functional analysis, numerical analysis, and differential geometry. It describes two classes of computational methods that leverage data for modeling dynamical systems. The first concerns data-fitting algorithms that estimate parameters in parametric models postulated on the basis of physical or dynamical laws. The second concerns operator estimation, which uses data to nonparametrically approximate the operator generated by the transition function of the underlying dynamical system. This self-contained book is suitable for graduate studies in applied mathematics, statistics, and engineering. Carefully chosen elementary examples with supplementary MATLAB® codes and appendices covering the relevant prerequisite materials are provided, making it suitable for self-study.
Person | Harlim, John |
---|---|
Edition | First edition. |
Place, publisher, year | Cambridge, England : Cambridge University Press, 2018 |
Extent | 1 online resource (xi, 158 pages) : digital, PDF file(s). |
ISBN | 1-108-61513-9; 1-108-56246-9 |
Language | English |
Notes | Title from publisher's bibliographic system (viewed on 25 Jul 2018). |
Contents | Cover -- Half-title -- Title page -- Copyright information -- Dedication -- Table of contents -- Preface -- 1 Introduction -- 1.1 The Role of Data in Parametric Modeling -- 1.1.1 Markov-Chain Monte Carlo -- 1.1.2 The Ensemble Kalman Filter -- 1.2 Nonparametric Modeling -- 1.2.1 Stochastic Spectral Method -- 1.2.2 Karhunen-Loève Expansion -- 1.2.3 Diffusion Forecasting -- 2 Markov-Chain Monte Carlo -- 2.1 A Brief Review of Markov Processes -- 2.2 The Metropolis-Hastings Method -- 2.3 Parameter Estimation Problems -- 2.4 Parameter Estimation with the Metropolis Scheme -- 2.4.1 Specifying Parameters in the Proposal -- 2.4.2 Specifying the Measurement Noise Variance -- 2.4.3 Pseudo-algorithm -- 2.5 MCMC with a Surrogate Model -- 3 Ensemble Kalman Filters -- 3.1 A Review of Ensemble Kalman Filters -- 3.1.1 The Ensemble Transform Kalman Filter -- 3.1.2 Remarks on the Practical Implementation and the Convergence -- 3.2 Parameter Estimation Methods -- 3.2.1 Adaptive Covariance Estimation Methods -- 3.3 Parameter Estimation of Reduced-Order Dynamics -- 4 Stochastic Spectral Methods -- 4.1 A Quick Review on Orthogonal Polynomials -- 4.2 Polynomial Chaos Expansion -- 4.2.1 Multidimensional Random Variables -- 4.2.2 Representing General Random Variables -- 4.3 The Weak Polynomial Chaos Approximation -- 4.4 The Stochastic Galerkin Method -- 4.5 The Stochastic Collocation Method -- 4.5.1 Sparse-Grid Quadrature Rules -- 5 Karhunen-Loève Expansion -- 5.1 Mercer's Theorem -- 5.2 KL Expansion of Random Processes -- 5.2.1 Numerical Approximation -- 5.2.2 An Uncertainty Quantification Application -- 5.3 Connection to POD -- 5.3.1 Discrete Data -- 6 Diffusion Forecast -- 6.1 Diffusion Maps -- 6.1.1 Estimation of the Laplacian -- 6.1.2 Discrete Approximation -- 6.2 Generalization with Variable-Bandwidth Kernels -- 6.2.1 Automatic Specification of ε and d -- 6.3 Nonparametric Probabilistic Modeling -- 6.4 Estimation of Initial Densities -- 6.4.1 Nyström Extension -- 6.4.2 Bayesian Filtering -- Appendix A Elementary Probability Theory -- Appendix B Stochastic Processes -- Appendix C Elementary Differential Geometry -- References -- Index. |
Online access | Cambridge ebooks EBS 2024 |