BERRU Predictive Modeling: Best Estimate Results with Reduced Uncertainties

Author: Dan Gabriel Cacuci
Language: English | Hardback – 12 Jan 2019
This book addresses the experimental calibration of best-estimate numerical simulation models. The results of measurements and computations are never exact. Therefore, knowing only the nominal values of experimentally measured or computed quantities is insufficient for applications, particularly since the respective experimental and computed nominal values seldom coincide. In the author’s view, the objective of predictive modeling is to extract “best estimate” values for model parameters and predicted results, together with “best estimate” uncertainties for these parameters and results. To achieve this goal, predictive modeling combines imprecisely known experimental and computational data, which calls for reasoning on the basis of incomplete, error-rich, and occasionally discrepant information.
The customary methods used for data assimilation combine experimental and computational information by minimizing an a priori, user-chosen “cost functional” (usually a quadratic functional representing the weighted errors between measured and computed responses). In contrast to these user-influenced methods, the BERRU (Best Estimate Results with Reduced Uncertainties) Predictive Modeling methodology developed by the author relies on the thermodynamics-based maximum entropy principle, eliminating the need to minimize user-chosen functionals and thus generalizing the “data adjustment” and “4D-VAR” data assimilation procedures used in the geophysical sciences. The BERRU predictive modeling methodology also provides a “model validation metric” which quantifies the consistency (agreement/disagreement) between measurements and computations. This “model validation metric” (or “consistency indicator”) is constructed from parameter covariance matrices, response covariance matrices (measured and computed), and response sensitivities to model parameters.
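For orientation only, and in notation chosen here rather than taken from the book: a typical user-chosen quadratic cost functional of the kind minimized in variational data assimilation, and a consistency indicator built from covariance matrices and response sensitivities, can be written as

J(\boldsymbol{\alpha}) = \tfrac{1}{2}\,(\boldsymbol{\alpha}-\boldsymbol{\alpha}^{0})^{\top}\mathbf{C}_{\alpha}^{-1}(\boldsymbol{\alpha}-\boldsymbol{\alpha}^{0}) + \tfrac{1}{2}\,[\mathbf{r}(\boldsymbol{\alpha})-\mathbf{r}_{m}]^{\top}\mathbf{C}_{m}^{-1}[\mathbf{r}(\boldsymbol{\alpha})-\mathbf{r}_{m}],

\chi^{2} = \mathbf{d}^{\top}\left(\mathbf{C}_{m}+\mathbf{S}\,\mathbf{C}_{\alpha}\,\mathbf{S}^{\top}\right)^{-1}\mathbf{d}, \qquad \mathbf{d} \equiv \mathbf{r}_{c}(\boldsymbol{\alpha}^{0})-\mathbf{r}_{m},

where \boldsymbol{\alpha}^{0} and \mathbf{C}_{\alpha} denote the nominal parameter values and the parameter covariance matrix, \mathbf{r}_{m} and \mathbf{C}_{m} the measured responses and their covariance matrix, \mathbf{r}_{c} the computed responses, and \mathbf{S} the matrix of response sensitivities to the model parameters.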
Traditional methods for computing response sensitivities are hampered by the “curse of dimensionality,” which makes them impractical for large-scale systems involving many imprecisely known parameters. Reducing the computational effort required to calculate the response sensitivities accurately is therefore paramount, and the comprehensive adjoint sensitivity analysis methodology developed by the author shows great promise in this regard, as demonstrated in this book. After discarding inconsistent data (if any) using the consistency indicator, the BERRU predictive modeling methodology provides best-estimate values for predicted parameters and responses, along with best-estimate reduced uncertainties (i.e., smaller predicted standard deviations) for the predicted quantities. Applying the BERRU methodology yields optimal, experimentally validated, “best estimate” predictive modeling tools for designing new technologies and facilities, while also improving existing ones.
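As a purely illustrative sketch rather than the book’s algorithm, the following Python snippet (all variable names and numerical values are hypothetical) performs a linear-Gaussian adjustment of the generalized-least-squares type, showing how combining computed and measured responses yields adjusted parameters and responses with smaller standard deviations:

# Illustrative sketch only: a linear-Gaussian, generalized-least-squares-type
# adjustment of computed responses toward measurements. This is NOT the book's
# maximum-entropy derivation; it merely illustrates the qualitative outcome of
# reduced predicted uncertainties. All names and numbers are hypothetical.
import numpy as np

alpha0  = np.array([1.0, 2.0])            # nominal parameter values
C_alpha = np.diag([0.04, 0.09])           # prior parameter covariance
S       = np.array([[1.0, 0.5],           # response sensitivities dr/dalpha
                    [0.2, 1.5]])
r_comp  = S @ alpha0                      # computed responses (linear toy model)
r_meas  = np.array([2.1, 3.0])            # measured responses
C_meas  = np.diag([0.01, 0.04])           # measurement covariance

d    = r_comp - r_meas                    # computed-minus-measured deviations
C_rc = S @ C_alpha @ S.T                  # covariance of computed responses
K    = np.linalg.inv(C_meas + C_rc)       # inverse of the combined covariance

chi2       = d @ K @ d                                   # consistency indicator
alpha_best = alpha0 - C_alpha @ S.T @ K @ d              # adjusted parameters
C_alpha_be = C_alpha - C_alpha @ S.T @ K @ S @ C_alpha   # reduced parameter covariance
r_best     = r_meas + C_meas @ K @ d                     # adjusted responses
C_r_be     = C_meas - C_meas @ K @ C_meas                # reduced response covariance

print("consistency indicator:", chi2)
print("parameter std devs, prior vs adjusted:",
      np.sqrt(np.diag(C_alpha)), np.sqrt(np.diag(C_alpha_be)))
print("response std devs, measured vs adjusted:",
      np.sqrt(np.diag(C_meas)), np.sqrt(np.diag(C_r_be)))

In this toy setting the adjusted standard deviations come out smaller than both the prior parameter uncertainties and the measurement uncertainties, which is the qualitative behavior referred to in the book’s title; the BERRU methodology itself obtains its best-estimate results from the maximum entropy principle rather than from a user-chosen quadratic update such as the one above.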

Price: 1,099.36 lei

Old price: 1,340.68 lei
-18% New

Express Points: 1649

Estimated price in foreign currency:
210.39€  218.55$  174.76£

Printed on demand

Economy delivery: 03-17 February 25

Phone orders: 021 569.72.76

Specifications

ISBN-13: 9783662583937
ISBN-10: 3662583933
Pages: 392
Illustrations: XIV, 451 p. 1 illus.
Dimensions: 155 x 235 mm
Weight: 0.83 kg
Edition: 1st ed. 2019
Publisher: Springer Berlin, Heidelberg
Series: Springer
Place of publication: Berlin, Heidelberg, Germany

Contents

Basics of predictive best-estimate model calibration.- Predictive best-estimate model-validation, model-calibration and model-verification concerning open and chaotic systems.- Differences from traditional statistical evaluation methods.- Examples.

Biographical note

Dan Gabriel Cacuci received his Master of Science, Master of Philosophy, and Doctor of Philosophy degrees in applied physics and nuclear engineering from Columbia University in New York City. His scientific expertise encompasses the following areas: predictive best-estimate analysis of large-scale physical and engineering systems, large-scale scientific computations, and nuclear engineering (reactor multi-physics, dynamics, and safety). He currently holds the South Carolina SmartState Endowed Chair and Directorship of the Center of Economic Excellence in Nuclear Science and Energy at the University of South Carolina in Columbia, USA.
Professor Cacuci’s career spans extensive work both in academia and at large-scale multidisciplinary research centers. His teaching and research experience as a full professor at leading academic institutions includes appointments at the University of Tennessee, University of California at Santa Barbara, University of Illinois at Urbana-Champaign, University of Virginia, University of Michigan, University of California at Berkeley, Royal Institute of Technology Stockholm (Sweden), the National Institute for Nuclear Sciences and Technologies (France), and the University of Karlsruhe and Karlsruhe Institute of Technology (Germany). Professor Cacuci’s research and management experience at leading national research centers includes serving as Senior Section Head at Oak Ridge National Laboratory, Institute Director at the Nuclear Research Center Karlsruhe (Germany), and Scientific Director of the Nuclear Energy Directorate, Commissariat a l’Energie Atomique (CEA, France).

Since 1984, Prof. Cacuci has been the Editor of “Nuclear Science and Engineering, The Research Journal of the American Nuclear Society” (ANS). He is a member of the European Academy of Arts and Sciences (2004), Honorary Member of the Romanian Academy (1994), an ANS Fellow (1986), and has received many prestigious awards, including four titles of Doctor Honoris Causa, the E. O. Lawrence Award and Gold Medal (US DOE, 1998), the Arthur Holly Compton Award (ANS 2011), the Eugene P. Wigner Reactor Physics Award (ANS, 2003), the Glenn Seaborg Medal (ANS, 2002), and the Alexander von Humboldt Prize for Senior Scholars (Germany, 1990).

Professor Cacuci has served on numerous international committees, including as the founding coordinator of the EURATOM-Integrated Project NURESIM (European Platform for Nuclear Reactor Simulation, 2004–2008), and founding coordinator (2004–2007) of the Coordinated Action to establish a Sustainable Nuclear Fission Technology Platform in Europe. He has made over 600 presentations worldwide, authored 4 books, 250 articles, and has edited the comprehensive Handbook of Nuclear Engineering (5 volumes, 3580 pages, Springer, 2010).  


Features

- The first-ever book on the experimental calibration of best-estimate numerical simulation models
- Demonstrates the model’s implementation using various examples, e.g. from chemistry, geophysics, space and aeronautics, or nuclear science
- Shares the objective of predictive modeling, which is to extract “best estimate” values for model parameters and predicted results, together with “best estimate” uncertainties for these parameters and results