
Molecular Networking: Statistical Mechanics in the Age of AI and Machine Learning

Authors: Caroline Desgranges, Jerome Delhommelle
Language: English | Hardback – 29 January 2024
The book builds on the analogy between social groups and assemblies of molecules to introduce the concepts of statistical mechanics, machine learning, and data science. Applying a data analytics approach to molecular systems, the authors show how individual (molecular) features and interactions between molecules, or "communication" processes, allow for the prediction of properties and collective behavior of molecular systems - just as polling and social networking shed light on the behavior of social groups. Applications to systems at the cutting edge of biological, environmental, and energy research are also presented.
Key features:
  • Draws on a data analytics approach to molecular systems
  • Covers hot topics such as artificial intelligence and machine learning of molecular trends
  • Contains applications to systems at the cutting edge of biological, environmental, and energy research
  • Discusses molecular simulation and its links to other important, emerging techniques and trends in computational sciences and society
  • Authors have a well-established track record and reputation in the field

Price: 652.00 lei

Old price: 878.37 lei
-26% New

Estimated price in foreign currency:
124.77€ 131.56$ 103.87£

Printed on demand

Economy delivery: 10-24 January 25


Specifications

ISBN-13: 9780367438937
ISBN-10: 0367438933
Pages: 248
Illustrations: 59 Line drawings, color; 18 Line drawings, black and white; 7 Halftones, color; 15 Halftones, black and white; 66 Illustrations, color; 33 Illustrations, black and white
Dimensions: 178 x 254 x 16 mm
Weight: 0.62 kg
Edition: 1
Publisher: CRC Press
Series: CRC Press

Target audience

Academic, Postgraduate, Professional Practice & Development, and Undergraduate Advanced

Table of Contents

Section I Molecular networking analytics

Chapter 1 Probabilities, distributions and statistics
1.1 MECHANICS
1.1.1 Newton, Lagrange, and Hamilton
1.1.2 Wave function and uncertainty
1.1.3 Quantum Energy and Density of States
1.2 THERMODYNAMICS
1.2.1 Processes, Work, and Heat
1.2.2 First, Second and Third Laws
1.2.3 Changing Conditions: Legendre Transformations
1.3 STATISTICS AND DISTRIBUTIONS
1.3.1 Maxwell-Boltzmann distribution
1.3.2 Phase space and probability distribution
1.3.3 Micro-Macro Connection
Chapter 2 Communication Rules in Molecular Systems
2.1 COMMUNICATION AND INTERACTIONS
2.1.1 Interactions in a Quantum World
2.1.2 Coarse-graining: Tight-Binding
2.1.3 Further coarse-graining: a classical world
2.2 INTERACTIONS BETWEEN MOLECULES
2.2.1 Molecular Properties and Interactions
2.2.2 2-Body vs. Many-Body Potentials
2.2.3 Towards Macro- and Bio-molecules
2.3 BEYOND INTERACTIONS
2.3.1 Signaling
2.3.2 Phoresis and Active Matter
2.3.3 Chemotaxis
Chapter 3 An Ensemble Approach: Finding descriptors and reducing dimensions
3.1 COLLECTIONS AND ENSEMBLES
3.1.1 Making Sense of the Microscopic Big Data
3.1.2 Defining Ensembles
3.1.3 The Concept of the Most Probable Distribution
3.2 INDIVIDUALS IN AN ISOTHERMAL WORLD: THE CANONICAL ENSEMBLE
3.2.1 Key Parameters and Multipliers
3.2.2 The Central Partition Function
3.2.3 Partition Function and Thermodynamics
3.3 INDIVIDUALS IN ISOLATION: THE MICROCANONICAL ENSEMBLE
3.3.1 Number and Density of States
3.3.2 Boltzmann’s Entropy
3.3.3 Thermodynamic Functions
Chapter 4 Accounting for Individual Features and Changes
4.1 MOLECULES IN A CANONICAL WORLD
4.1.1 Features and Consequences
4.1.2 The Case of Diatomic Molecules
4.1.3 Molecular Symmetry and Polyatomic Molecules
4.2 CONNECTING WITH THE MACROSCOPIC WORLD
4.2.1 Are all Features Essential?
4.2.2 Model-Partition Function Interplay
4.2.3 Thermodynamic properties and Ideality
4.3 CHANGING IDENTITIES: CHEMICAL REACTIONS
4.3.1 Reaction properties and Parameters
4.3.2 Partition Functions and Equilibrium Constants
4.3.3 The Activated Complex
Chapter 5 Machine Learning and Molecular Systems
5.1 DISTINGUISHING FROM THE MOLECULAR CROWD
5.1.1 Labels and Classes
5.1.2 Identifying and Handling Patterns
5.1.3 Learning under supervision
5.2 QUANTITATIVE MODELS FOR MOLECULAR GROUPS
5.2.1 Training regression models
5.2.2 Mapping numbers: Artificial Neural Networks
5.2.3 Optimization through back-propagation
5.3 BEYOND ARTIFICIAL NEURAL NETWORKS
5.3.1 Learning by watching: Convolutional Neural Networks
5.3.2 Time sequences and Recurrent Neural Networks
5.3.3 Understanding policies: the Advent of Reinforcement Learning
Section II Static trends: equilibrium statistics
Chapter 6 Polling a molecular population: Monte Carlo and Wang-Landau simulations
6.1 THE BIRTH OF THE MONTE CARLO METHOD
6.1.1 Randomness and Integration
6.1.2 Sample Mean Approach
6.1.3 The Concept of Importance Sampling
6.2 THE METROPOLIS METHOD
6.2.1 Markov Chain and Stochastic Matrix
6.2.2 Randomness and Acceptance
6.2.3 Implementation and Testing
6.3 WANG-LANDAU SAMPLING
6.3.1 A Paradigm Shift: Evaluating the Density of States
6.3.2 The Biased Distribution
6.3.3 A Twist in the Monte Carlo plot
Chapter 7 Molecular networking in insulation: adiabatic ensembles
7.1 ADIABATIC PROCESSES AND ENSEMBLES
7.1.1 Adiabatic vs. Isothermal
7.1.2 The Concept of Heat Function
7.1.3 Eight Statistical Ensembles
7.2 MECHANICS OF ADIABATIC ENSEMBLES
7.2.1 Microcanonical distribution and thermodynamic equations
7.2.2 The (μ, P, R) Ensemble
7.2.3 A Full Picture for the Four Adiabatic Ensembles
7.3 MONTE CARLO EXPLORATION OF ADIABATIC ENSEMBLES
7.3.1 Exploring the Microcanonical Ensemble
7.3.2 Musing in the (N, P, H) Ensemble
7.3.3 Direct Entropy Evaluations in the (μ, P, R) Ensemble
Chapter 8 Networking under one (or more) cues: isothermal ensembles
8.1 THERMAL AND CHEMICAL CUES
8.1.1 The Grand-Canonical Ensemble
8.1.2 Monte Carlo Exploration
8.1.3 Grand Partition Function Determination
8.2 THERMAL AND MECHANICAL CUES
8.2.1 The Isothermal-Isobaric Ensemble
8.2.2 Properties Calculations
8.2.3 Partition Function Computation
8.3 VARIATIONS AND APPLICATIONS
8.3.1 Multi-Component Systems and Semi-Grand Approach
8.3.2 A First Step towards Coexistence: Gibbs Ensemble Monte Carlo Method
8.3.3 Recycling and Reweighting
Chapter 9 Collective properties from partition functions
9.1 GENERATING DATA ON PARTITION FUNCTIONS
9.1.1 Starting from A
9.1.2 From dilute to condensed phases
9.1.3 Direct determination of partition functions
9.2 THE CASE OF PHASE TRANSITIONS
9.2.1 Matching Probabilities
9.2.2 Features of coexistence
9.2.3 Extension to Multi-Component Systems
9.3 GAS STORAGE AND SEPARATION APPLICATIONS
9.3.1 Partition Functions for Adsorbed Fluids
9.3.2 Thermodynamic Properties of Adsorption
9.3.3 Environmental and Energy Applications
Chapter 10 Machine Learning Molecular Trends
10.1 LEARNING INTERMOLECULAR INTERACTIONS
10.1.1 Starting from empirical datasets
10.1.2 Training on tight-binding data
10.1.3 Neural network potentials
10.2 LEARNING PARTITION FUNCTIONS
10.2.1 Single-component systems
10.2.2 Multicomponent mixtures
10.2.3 Adsorbed Phases
10.3 LEARNING TRANSITIONS
10.3.1 Spanning Pathways
10.3.2 From Partition Functions to Reaction Coordinates
10.3.3 On-The-Fly Learning of Collective Variables

Section III Dynamic trends: motion statistics
Chapter 11 Molecular evolution and fluctuations: time-resolved statistics
11.1 COMPUTING MOLECULAR TRAJECTORIES
11.1.1 Ensemble and Time Averages Equivalency
11.1.2 Molecular Equations of Motion
11.1.3 Integration Schemes
11.2 MOLECULAR TRAJECTORIES
11.2.1 Gauss’ principle of least constraint
11.2.2 Keeping the temperature in check
11.2.3 Nosé-Hoover Thermostat
11.3 MULTIPLE-TIME STEPS AND HYBRID SCHEMES
11.3.1 Time-splitting
11.3.2 Controlling pressure
11.3.3 Hybrid schemes
Chapter 12 Noise and information: correlation functions
12.1 MOTION AND TRANSPORT
12.1.1 Brownian Motion
12.1.2 Langevin Equation & Fluctuation-Dissipation
12.1.3 Einstein Diffusion Equation
12.2 TRANSPORT FROM CORRELATION
12.2.1 D from a Correlation Function
12.2.2 The Mori-Zwanzig approach
12.2.3 Evaluation of Transport Coefficients
12.3 RESPONSE THEORY
12.3.1 Linear response theory
12.3.2 Time-Dependent Linear Response
12.3.3 Nonlinear Response, Dynamical Stability, and Chaos
Chapter 13 External fields and agents: new communication paradigms
13.1 NONEQUILIBRIUM MOLECULAR TRAJECTORIES
13.1.1 Boundary-Driven and Synthetic Setups
13.1.2 Accounting for Heat Dissipation
13.1.3 Extracting Transport Coefficients
13.2 COMPUTING NONEQUILIBRIUM TRAJECTORIES
13.2.1 Physical Boundaries vs Periodic Boundaries
13.2.2 Nonequilibrium Definitions for Temperature
13.2.3 Transport in the Steady-State
13.3 TRANSIENT-TIME CORRELATION FUNCTION
13.3.1 Formalism
13.3.2 Bridging between Equilibrium and Nonequilibrium
13.3.3 Transport close(r) to Equilibrium
Chapter 14 Fluctuation Theorems, Molecular Machines and Emergent Behavior in Active Matter
14.1 FLUCTUATION THEOREMS
14.1.1 Formalism
14.1.2 Negative Entropy Production Trajectories
14.1.3 Free Energy Differences
14.2 TOWARDS A NEW PHYSICS OF LIVING SYSTEMS
14.2.1 Work Relations and RNA Folding
14.2.2 Mutating, stretching, binding, and unbinding
14.2.3 Free energy calculations via steered MD
14.3 EMERGENCE IN ACTIVE MATTER
14.3.1 Dry Active Matter
14.3.2 Active Brownian Matter & MIPS
14.3.3 Entropy Production: from Active Matter to Molecular Machines
Chapter 15 Learning evolution and transport
15.1 LEARNING TRANSPORT
15.1.1 Rationale for Diffusion Learning
15.1.2 RNNs and LSTMs in Action
15.1.3 Classifying Diffusion Behaviors
15.2 LEARNING DYNAMICS
15.2.1 Learning Equations of Motion for Mesoscopic and Structured Systems
15.2.2 Learning Differential Equations
15.2.3 Data-Driven Identification of Governing Equations
15.3 LEARNING NAVIGATION
15.3.1 Adapting to the Environment
15.3.2 Identifying Navigation Strategies
15.3.3 Learning Collective Motion

Biographical Note

Dr. Caroline Desgranges received a DEA in Physics in 2005 from the University Paul Sabatier-Toulouse III (France) and a PhD in Chemical Engineering from the University of South Carolina (USA) in 2008. She is currently a Research Assistant Professor in Physics & Applied Physics at the University of Massachusetts Lowell.
Dr. Jerome Delhommelle did his undergraduate studies at the Ecole Normale Superieure Paris-Saclay and received his PhD in Chemistry from the University of Paris-Saclay (France) in 2000. He is currently an Associate Professor in Chemistry at the University of Massachusetts Lowell.

Description

The book builds on an analogy between social groups and assemblies of molecules to introduce the concepts of statistical mechanics, machine learning, and data science. Applications to systems at the cutting edge of research, e.g. environmental and energy applications, are also presented.