Materials Genomics
Unit 4: Thermodynamics, Statistical Mechanics & Classical Atomistic Simulation

Prof. Dr. Philipp Pelz

FAU Erlangen-Nürnberg


Where We Stand

Recap of Units 1-3

  • Unit 1: postulates of QM, hydrogen atom, Born interpretation
  • Unit 2: multi-electron atoms, Born-Oppenheimer, Slater determinants, LCAO
  • Unit 3: Hartree-Fock, post-HF (MP, CC), density functional theory
  • All previous units have been about the electronic structure at \(T=0\)
  • Output: a potential energy surface \(V(\mathbf{x}_1,\dots,\mathbf{x}_N)\) for given nuclear positions

What Unit 4 Adds — The Pivot

  • Real materials live at finite temperature — atoms vibrate, diffuse, react
  • We need thermodynamics (state variables, free energies) and statistical mechanics (ensembles, \(Z\))
  • We pivot from quantum to classical atomistic simulation — millions of atoms instead of dozens of electrons
  • Molecular dynamics as the primary sampler of the Boltzmann distribution; Monte Carlo will follow in Unit 5
  • Sets up Unit 5+ on representations, ML force fields, and materials genomes

Lecture Roadmap

Part I — Thermodynamics & statistical mechanics

  • Macroscopic thermodynamics, four laws, free energies
  • The ideal gas — empirical and kinetic-theory derivations
  • Maxwell-Boltzmann velocity distribution
  • Microstates, Boltzmann entropy
  • Canonical partition function, ensembles

Part II — Classical atomistic simulation

  • Interatomic potentials: pair, EAM, bond-order, ML force fields
  • Static simulations: minimisation, equation of state
  • Molecular dynamics: Verlet, thermostats, barostats
  • Observables: RDF, MSD, diffusion, stress
  • Outlook: Monte Carlo + continuum methods (Unit 5)

Macroscopic Thermodynamics

State Variables and Equations of State

A thermodynamic system is described by a small set of state variables:

  • Mechanical: pressure \(P\), volume \(V\)
  • Thermal: temperature \(T\)
  • Composition: number of particles \(N\) (or chemical potential \(\mu\))
  • Energetic: internal energy \(U\), entropy \(S\)

An equation of state relates them, e.g. for the ideal gas:

\[f(P,V,N,T) = 0 \quad\Longleftrightarrow\quad PV = N k_B T\]

Equilibrium and the Zeroth Law

To define change, we must first define its absence: equilibrium.

Two systems in contact are in:

  • Thermal equilibrium if there is no net heat flow between them
  • Mechanical equilibrium if there is no net work exchanged
  • Chemical equilibrium if there is no net particle exchange

Zeroth law: if \(A\sim B\) and \(B\sim C\) thermally, then \(A\sim C\).

\(\rightarrow\) Justifies temperature as an empirical state variable.

First Law — Energy Conservation

Internal energy \(U\) (“innere Energie”) = total energy stored in the system.

\[\Delta U = Q + W\]

  • \(Q\): heat transferred to the system
  • \(W\): work done on the system
  • Energy is neither created nor destroyed
  • \(U\) is a thermodynamic potential: knowing \(U\) and its derivatives describes the system

Including volumetric work gives the enthalpy:

\[H = U + PV\]

Second Law — Entropy

Heat conduction, friction, fracture, explosions — all are irreversible.

A new state variable, entropy \(S\), is needed:

\[dS \geq \frac{\delta Q}{T}\]

For an isolated system,

\[\Delta S \geq 0\]

Equality holds only in the (idealised) reversible limit.

\(\rightarrow\) Lossless engines do not exist. Time has a direction.

Free Energies — Which Potential When?

For the most common laboratory conditions — constant \(T\) and \(P\) — the relevant thermodynamic potential is the Gibbs free energy:

\[G = H - TS = U + PV - TS\]

  • \(U\) — internal energy (closed, isolated system, \(S,V,N\) fixed)
  • \(H = U + PV\) — enthalpy (constant \(P\))
  • \(F = U - TS\) — Helmholtz free energy (constant \(T,V\))
  • \(G = U + PV - TS\) — Gibbs free energy (constant \(T,P\))

Choose the potential whose natural variables match the experimental constraints.

Heat Capacity and the Third Law

Heat capacity = ability to store energy as \(T\) changes:

\[C_V = \left(\frac{\partial U}{\partial T}\right)_V, \qquad C_P = \left(\frac{\partial H}{\partial T}\right)_P\]

Third law (Nernst): \(S \to 0\) as \(T \to 0\) for a perfect crystal.

\(\rightarrow\) Sets an absolute zero of entropy.

Macroscopic thermodynamics is powerful but incomplete — it does not explain how \(T\), \(S\), \(P\) emerge from atomic motion. That is the job of statistical mechanics.

Phase Equilibria — A Preview

At a phase boundary (e.g. solid-liquid), the Gibbs free energies of two phases are equal:

\[G_1(T,P) = G_2(T,P)\]

Differentiating along the coexistence curve gives the Clausius-Clapeyron relation:

\[\frac{dP}{dT} = \frac{\Delta S}{\Delta V} = \frac{L}{T \Delta V}\]

with latent heat \(L = T \Delta S\).

\(\rightarrow\) Foundation for phase diagrams, the focus of later MG units.
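The Clausius-Clapeyron slope can be evaluated numerically. A minimal sketch for the ice/water boundary, using textbook approximations for the latent heat and specific volumes (the numbers below are illustrative, not precise constants):

```python
# Hedged sketch: Clausius-Clapeyron slope dP/dT = L / (T * dV) for melting ice.
L = 334e3             # latent heat of fusion, J/kg (approximate)
T = 273.15            # melting temperature, K
v_ice = 1 / 917.0     # specific volume of ice, m^3/kg
v_water = 1 / 1000.0  # specific volume of liquid water, m^3/kg

dP_dT = L / (T * (v_water - v_ice))      # negative: ice is LESS dense than water
print(f"dP/dT = {dP_dT/1e5:.0f} bar/K")  # roughly -135 bar/K
```

The negative slope is the famous anomaly of water: pressure lowers the melting point, which is why the solid-liquid line in the water phase diagram tilts backwards.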

The Ideal Gas

Why the Ideal Gas?

  • Simplest non-trivial many-particle system
  • \(N\) point particles, mass \(m\), only elastic collisions
  • First atomistic model where macroscopic thermodynamics can be derived from mechanics
  • Standard test case for any new statistical-mechanics method
  • We will see: it explains \(P\), \(T\), the Maxwell-Boltzmann distribution

Empirical Gas Laws

Four empirical laws were combined into one:

  • Boyle (\(T,N\) fixed): \(\quad PV = \text{const}\)
  • Charles (\(P,N\) fixed): \(\quad V \propto T\)
  • Gay-Lussac (\(V,N\) fixed): \(\quad P \propto T\)
  • Avogadro (\(P,T\) fixed): \(\quad V \propto N\)

Combining all four:

\[PV \propto NT \quad\Longrightarrow\quad PV = N k_B T = n R T\]

with Boltzmann constant \(k_B\), gas constant \(R = N_A k_B\), and \(n\) in moles.
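As a quick sanity check, \(PV = nRT\) for one mole at standard conditions recovers atmospheric pressure (molar volume at STP taken as the textbook value):

```python
# Hedged check of PV = N k_B T = nRT for one mole at standard conditions.
k_B = 1.380649e-23    # J/K (exact SI value)
N_A = 6.02214076e23   # 1/mol (exact SI value)
R = k_B * N_A         # ~ 8.314 J/(mol K)

n, T, V = 1.0, 273.15, 22.414e-3  # mol, K, m^3 (molar volume at STP)
P = n * R * T / V
print(f"P = {P/1e3:.1f} kPa")     # ~ 101.3 kPa, i.e. 1 atm
```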

Kinetic Theory — Setup

A cubic box, side \(L\), volume \(V=L^3\), with \(N\) particles obeying

\[m_i \ddot{\mathbf{x}}_i = \mathbf{f}_i\]

  • Particles only undergo elastic wall collisions
  • On average: uniform spatial distribution and isotropic velocity distribution
  • Pressure = average force per unit wall area:

\[P = \frac{\langle F\rangle}{L^2}\]

Kinetic Theory — Pressure from Collisions

A particle hitting a wall (normal \(\hat{x}\)) reverses \(v_x \to -v_x\), transferring momentum

\[\Delta p = 2 m v_x\]

Time between successive hits on the same wall: \(\tau = 2L/v_x\).

Average force per particle and total force:

\[F_i = \frac{2 m v_x}{2L/v_x} = \frac{m v_x^2}{L}, \qquad \langle F\rangle = \frac{N m}{L}\langle v_x^2\rangle\]

\[\boxed{\,PV = N m \langle v_x^2\rangle\,}\]

Equipartition and the Ideal Gas Law

Isotropy: \(\langle v_x^2\rangle = \langle v_y^2\rangle = \langle v_z^2\rangle\), so \(\langle v^2\rangle = 3\langle v_x^2\rangle\).

\[PV = \tfrac{1}{3} N m \langle v^2\rangle = \tfrac{2}{3} N \langle E_\text{kin}\rangle\]

Comparing with \(PV = N k_B T\) gives

\[\langle E_\text{kin}\rangle = \tfrac{3}{2} k_B T\]

— the equipartition theorem: \(\tfrac{1}{2}k_B T\) per quadratic degree of freedom.
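Equipartition can be verified numerically: each Cartesian velocity component of a Maxwell-Boltzmann gas is Gaussian with variance \(k_B T/m\), so sampling components and averaging the kinetic energy should return \(\tfrac{3}{2}k_B T\). A minimal sketch (the choice of argon and \(T = 300\) K is arbitrary):

```python
import random, math

# Hedged numerical check of equipartition: sample Maxwell-Boltzmann
# velocities (a Gaussian per component) and verify <E_kin> ~ (3/2) k_B T.
random.seed(0)
k_B = 1.380649e-23
T = 300.0
m = 39.948 * 1.66053906660e-27   # argon mass in kg (illustrative choice)

sigma = math.sqrt(k_B * T / m)   # std of each velocity component
n = 200_000
E = 0.0
for _ in range(n):
    vx, vy, vz = (random.gauss(0, sigma) for _ in range(3))
    E += 0.5 * m * (vx * vx + vy * vy + vz * vz)
E_mean = E / n
print(E_mean / (1.5 * k_B * T))  # ~ 1.0
```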

Maxwell-Boltzmann — Construction

Goal: probability density \(f(\mathbf{v})\) of velocities at temperature \(T\).

  • Isotropy: \(f\) depends only on \(|\mathbf{v}|^2 = v_x^2 + v_y^2 + v_z^2\)
  • Independence of axes: \(f(v_x,v_y,v_z) = g(v_x)\,g(v_y)\,g(v_z)\)
  • Both conditions \(\Rightarrow\) \(g\) must be Gaussian: only the exponential turns a sum into a product

Maxwell-Boltzmann — The Distribution

The unique solution (Maxwell 1860, Boltzmann 1872):

\[f(\mathbf{v}) = \left(\frac{m}{2\pi k_B T}\right)^{3/2} \exp\!\left[-\frac{m\,|\mathbf{v}|^2}{2 k_B T}\right]\]

Equivalently, with kinetic energy \(E_\text{kin} = \tfrac{1}{2}m|\mathbf{v}|^2\), each velocity state carries the weight

\[f(\mathbf{v}) = \left(\frac{m}{2\pi k_B T}\right)^{3/2} \exp\!\left[-\frac{E_\text{kin}}{k_B T}\right]\]

(the distribution over \(E_\text{kin}\) itself acquires an extra density-of-states factor \(\propto\sqrt{E_\text{kin}}\)).

The exponential weight \(\exp(-E/k_B T)\) is our first hint of the Boltzmann distribution.

Maxwell-Boltzmann — Characteristic Speeds

Speed distribution (multiply \(f(\mathbf{v})\) by \(4\pi v^2\)):

\[F(v) = 4\pi v^2 \left(\frac{m}{2\pi k_B T}\right)^{3/2} e^{-mv^2/2k_B T}\]

Three speed scales:

  • Most probable: \(v_p = \sqrt{2 k_B T/m}\)
  • Mean: \(\langle v\rangle = \sqrt{8 k_B T/(\pi m)}\)
  • RMS: \(v_\text{rms} = \sqrt{3 k_B T/m}\)

Higher \(T\) broadens and shifts the curve to higher speeds.
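The three speed scales are easy to compute; a sketch for N\(_2\) at 300 K (the molecular mass is taken from standard tables, purely for illustration):

```python
import math

# Hedged sketch: characteristic Maxwell-Boltzmann speeds for N2 at 300 K.
k_B = 1.380649e-23
T = 300.0
m = 28.014 * 1.66053906660e-27   # N2 mass in kg (illustrative)

v_p   = math.sqrt(2 * k_B * T / m)            # most probable
v_avg = math.sqrt(8 * k_B * T / (math.pi * m))  # mean
v_rms = math.sqrt(3 * k_B * T / m)            # root-mean-square
print(f"{v_p:.0f} < {v_avg:.0f} < {v_rms:.0f} m/s")  # ordering v_p < <v> < v_rms
```

The fixed ordering \(v_p < \langle v\rangle < v_\text{rms}\) follows directly from the factors 2, \(8/\pi\), 3 under the square root.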

Statistical Mechanics

From Mechanics to Statistics

Macroscopic system: \(N\sim N_A \sim 10^{23}\) — tracking trajectories is hopeless.

  • Replace exact dynamics by probability distributions over microstates
  • A microstate = a full specification \((\mathbf{x}_1,\dots,\mathbf{x}_N,\mathbf{v}_1,\dots,\mathbf{v}_N)\)
  • A macrostate = a small set of state variables (e.g. \(N,V,T\))
  • Many microstates realise the same macrostate
  • Macroscopic observables are ensemble averages over compatible microstates

The Classical Hamiltonian

Particles obey Newton’s equations under a potential \(V(\mathbf{x}_1,\dots,\mathbf{x}_N)\).

The total energy is the classical Hamiltonian:

\[\mathcal{H} = E_\text{kin} + E_\text{pot} = \sum_{i=1}^{N} \frac{m_i}{2}\, \mathbf{v}_i^{\,2} + V(\mathbf{x}_1,\dots,\mathbf{x}_N)\]

  • Macroscopically, \(U\) is a state function
  • Microscopically, \(U = \langle \mathcal{H}\rangle\) — average over microstates
  • The link between the two is statistical mechanics

The Ergodic Hypothesis

We replace impossible time averages by ensemble averages:

  • Ergodic hypothesis: a single trajectory eventually visits all accessible microstates with the correct frequency
  • \(\Rightarrow\) \(\;\overline{A}_\text{time} = \langle A\rangle_\text{ensemble}\)
  • This is the philosophical bridge between MD (time average) and MC (ensemble average)
  • Microscopic trajectories may momentarily decrease entropy — the second law is statistical (Jarzynski, Crooks)

Boltzmann Entropy

Boltzmann’s tombstone equation:

\[\boxed{\,S = k_B \ln \Omega\,}\]

  • \(\Omega\) = number of microstates compatible with the macrostate
  • More accessible microstates \(\Rightarrow\) higher entropy
  • Connects information (microstate counting) to thermodynamics (entropy)
  • The second law \(\Delta S \geq 0\) becomes “systems evolve towards more probable macrostates”
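Microstate counting can be made concrete with a toy model: \(N\) two-state spins, where a macrostate is the number of "up" spins and \(\Omega\) is the binomial coefficient. A minimal sketch (the system and \(N = 100\) are illustrative assumptions):

```python
import math

# Hedged toy model: N two-state spins; macrostate = number of "up" spins,
# Omega = C(N, n_up).  Boltzmann entropy S = k_B ln(Omega) peaks at the
# most mixed (most probable) macrostate.
k_B = 1.380649e-23
N = 100
S = [k_B * math.log(math.comb(N, n)) for n in range(N + 1)]
n_max = S.index(max(S))
print(n_max)   # 50: the half-up macrostate has the most microstates
```

This is the second law in miniature: the system drifts towards \(n = N/2\) simply because overwhelmingly more microstates live there.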

Statistical Ensembles

An ensemble = collection of hypothetical copies of a system, all sharing the same macroscopic constraints.

  • Microcanonical (NVE): isolated, fixed energy
  • Canonical (NVT): in contact with heat bath
  • Isothermal-isobaric (NPT): heat bath + piston
  • Grand canonical (\(\mu\)VT): exchanges heat and particles
  • Isoenthalpic-isobaric (NPH): pressure bath, no heat exchange
  • Choice mirrors the experimental boundary conditions

The Boltzmann Distribution

For a system in the canonical ensemble (fixed \(N,V,T\), in contact with a heat bath), the probability of a microstate of energy \(E_i\) is

\[\boxed{\,p_i = \frac{1}{Z}\exp\!\left[-\frac{E_i}{k_B T}\right]\,}\]

  • Lower energy \(\Rightarrow\) exponentially more probable
  • Generalisation of the Maxwell-Boltzmann velocity distribution to any Hamiltonian, including interactions
  • Foundation for all of equilibrium statistical mechanics
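The exponential suppression is dramatic even for modest energy gaps. A sketch with an illustrative gap of 0.1 eV at room temperature:

```python
import math

# Hedged example: relative Boltzmann populations of two microstates
# separated by dE = 0.1 eV at 300 K (values chosen for illustration).
k_B_eV = 8.617333262e-5   # Boltzmann constant in eV/K
dE, T = 0.10, 300.0
ratio = math.exp(-dE / (k_B_eV * T))
print(f"{ratio:.3f}")     # ~ 0.021: the higher state is only ~2% as likely
```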

The Canonical Partition Function

Normalisation gives the canonical partition function \(Z\):

\[Z = \sum_i e^{-E_i/k_B T} = \sum_E g(E)\,e^{-E/k_B T} = \int_0^\infty g(E)\, e^{-E/k_B T}\, dE\]

  • \(g(E)\) = density of states — how many microstates at energy \(E\)
  • \(Z\) encodes all equilibrium thermodynamics of the system
  • Discrete or continuous — same logic

From \(Z\) to Thermodynamic Potentials

Internal energy as a \(Z\)-derivative:

\[U = \langle E\rangle = \frac{1}{Z}\sum_i g(E_i)\,E_i\,e^{-E_i/k_B T} = -\frac{\partial \ln Z}{\partial \beta}, \qquad \beta = \frac{1}{k_B T}\]

Comparing with \(F = U - TS\) at fixed \(N,V\) gives the central identity:

\[\boxed{\,F = -k_B T \ln Z\,}\]

  • \(S = -\left(\partial F/\partial T\right)_{V,N}\)
  • \(P = -\left(\partial F/\partial V\right)_{T,N}\)
  • All thermodynamics from one quantity: \(\ln Z\)
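The identity \(U = -\partial\ln Z/\partial\beta\) can be checked numerically on the simplest possible system, a two-level system with gap \(\varepsilon\) (reduced units and all parameter values below are arbitrary illustrative choices):

```python
import math

# Hedged sketch: two-level system, E0 = 0 and E1 = eps, in reduced units.
# Check U = -d ln Z / d beta by central finite differences, and F = -kT ln Z.
k_B, eps, T = 1.0, 1.0, 0.8
beta = 1 / (k_B * T)

def lnZ(b):
    return math.log(1 + math.exp(-b * eps))

U_exact = eps * math.exp(-beta * eps) / (1 + math.exp(-beta * eps))
h = 1e-6
U_num = -(lnZ(beta + h) - lnZ(beta - h)) / (2 * h)  # numerical -d lnZ/d beta
F = -k_B * T * lnZ(beta)
print(abs(U_exact - U_num) < 1e-8, F < U_exact)     # True True (F = U - TS <= U)
```

Everything on the slide — \(U\), \(F\), and via derivatives also \(S\) — indeed comes out of the single function \(\ln Z\).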

Ensembles — Side by Side

Ensemble — fixed variables — thermodynamic potential:

  • Microcanonical — \(N,V,E\) — \(S = k_B\ln\Omega\)
  • Canonical — \(N,V,T\) — \(F = -k_BT\ln Z\)
  • Grand canonical — \(\mu,V,T\) — grand potential \(\Omega_\text{gp}\)
  • Isothermal-isobaric — \(N,P,T\) — \(G\)
  • Pick the ensemble whose fixed variables match the experiment
  • Different ensembles give the same averages in the thermodynamic limit (\(N\to\infty\))
  • For fluctuations the choice matters
  • MD/MC simulations explicitly target one ensemble at a time

Classical Atomistic Simulation

Why Classical Atomistic Simulation?

  • DFT scales as \(\mathcal{O}(N_e^3)\) and is feasible for \(\sim 10^2\) atoms, ps timescales
  • Many materials questions need \(10^4\)-\(10^9\) atoms and ns-\(\mu\)s timescales: defects, grain boundaries, plasticity, glasses
  • Solution: freeze out electronic degrees of freedom into an effective potential
  • Atoms move under classical forces \(\mathbf{F}_i = -\nabla_i V\)
  • Quality of the simulation = quality of the potential

Interatomic Potentials — General Form

A general potential expands in \(n\)-body terms:

\[V = \sum_i V_1(i) + \tfrac{1}{2}\sum_{i\neq j}V_2(i,j) + \tfrac{1}{6}\sum_{i\neq j\neq k}V_3(i,j,k) + \cdots\]

  • \(V_1\): external field (often a constant for identical atoms — dropped)
  • \(V_2\): pair potential — depends only on \(r_{ij} = |\mathbf{x}_i - \mathbf{x}_j|\)
  • \(V_3, V_4, \dots\): angular and higher many-body terms
  • Truncation level controls cost vs. accuracy

Pair Potentials I — Harmonic and Lennard-Jones

Harmonic spring (near equilibrium):

\[\phi_\text{spring}(r) = \tfrac{1}{2}k(r-r_0)^2\]

  • Taylor expansion of any smooth minimum
  • Useful only near equilibrium

Lennard-Jones (van der Waals):

\[\phi_\text{LJ}(r) = 4\epsilon\!\left[\!\left(\frac{\sigma}{r}\right)^{\!12}\!\!-\!\left(\frac{\sigma}{r}\right)^{\!6}\right]\]

  • \(r^{-12}\): Pauli repulsion
  • \(r^{-6}\): dispersion (London)
  • \(\epsilon\) depth, \(\sigma\) length scale
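The Lennard-Jones minimum sits at \(r = 2^{1/6}\sigma\) with depth \(-\epsilon\), which a few lines confirm (reduced units \(\epsilon = \sigma = 1\) are an arbitrary choice):

```python
# Hedged sketch: evaluate the Lennard-Jones potential at its analytic
# minimum r = 2^(1/6) sigma; the well depth equals -epsilon.
def lj(r, eps=1.0, sigma=1.0):
    sr6 = (sigma / r) ** 6
    return 4 * eps * (sr6 * sr6 - sr6)

r_min = 2 ** (1 / 6)
print(round(lj(r_min), 10))   # -1.0, i.e. -epsilon in reduced units
```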

Pair Potentials II — Coulomb

For charged particles, Coulomb interaction:

\[\phi_C(r_{ij}) = \frac{1}{4\pi\epsilon_0}\frac{q_i q_j}{r_{ij}}\]

  • Decays as \(1/r\) — long-ranged
  • Naive truncation produces large artifacts
  • Periodic systems use Ewald summation, particle-mesh Ewald (PME), fast multipole methods
  • Splits short-range (real space) and long-range (reciprocal space) parts

Many-Body Potentials — EAM for Metals

Pair potentials fail for metals — bonding depends on local density, not just pair distances.

Embedded-atom method (Daw & Baskes 1984):

\[V_\text{EAM} = \sum_i F\!\left(\bar\rho_i\right) + \tfrac{1}{2}\sum_{i\neq j}\phi(r_{ij}), \quad \bar\rho_i = \sum_{j\neq i}\rho(r_{ij})\]

  • \(F\) = embedding energy of atom \(i\) in the local electron density \(\bar\rho_i\)
  • Captures Cauchy pressure, surface relaxations, vacancy formation
  • Workhorse for Cu, Al, Ni, Fe and related alloys

Many-Body Potentials — Bond-Order and Reactive

Covalent solids need explicit angles and breakable bonds:

  • Stillinger-Weber: explicit two- and three-body terms for Si
  • Tersoff (1988), Brenner/REBO: bond-order formalism — strength depends on local coordination
  • ReaxFF (van Duin 2001): bond-order + dynamic charges; can break/form bonds
  • AIREBO: REBO + dispersion + torsion for hydrocarbons

\[V_\text{SW} = \sum_{i<j}\phi_2(r_{ij}) + \sum_{i<j<k}\phi_3(r_{ij},r_{ik},\theta_{jik})\]

Modern ML Force Fields — Preview

Machine-learning potentials promise DFT accuracy at force-field cost:

  • Behler-Parrinello NN (2007): symmetry functions \(\to\) neural network
  • GAP (Bartók 2010): Gaussian process regression on SOAP descriptors
  • SchNet, NequIP, MACE: equivariant message-passing on atomic graphs
  • Trained on DFT energies and forces; transferable across compositions
  • Treated in depth in MG Unit 6+

Cutoffs and Periodic Boundaries

Pair-distance evaluation costs \(\mathcal{O}(N^2)\) — too much for large \(N\).

  • All physical potentials decay as \(r\to\infty\): introduce a cutoff radius \(r_c\)
  • \(V(r_{ij}\geq r_c)\approx 0\) \(\Rightarrow\) with neighbour (cell) lists the cost drops to \(\mathcal{O}(N)\), each atom seeing only the \(\sim \rho\, r_c^3\) atoms inside its cutoff sphere
  • Shift the potential, \(V_\text{shift}(r)=V(r)-V(r_c)\), or use a smooth switching function
  • Periodic boundary conditions: \(\mathbf{x}_n = \mathbf{x} + n_1 \mathbf{L}_1 + n_2 \mathbf{L}_2 + n_3 \mathbf{L}_3\)
  • Mimic an infinite bulk system with a finite simulation cell
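For an orthorhombic cell, periodic boundaries are usually applied through the minimum-image convention: each displacement component is wrapped to the nearest periodic image. A minimal sketch (box length and test values are arbitrary):

```python
# Hedged sketch of the minimum-image convention in an orthorhombic box:
# wrap each displacement component into [-L/2, L/2].
def minimum_image(dx, L):
    return dx - L * round(dx / L)

L = 10.0
print(minimum_image(9.0, L))   # -1.0: the nearest image, not the raw 9.0
```

Without this wrapping, two atoms near opposite faces of the cell would appear almost a full box length apart even though their periodic images nearly touch.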

Static Simulations — \(T = 0\) Mechanics

Set all velocities to zero: \(\mathbf{v}_i = \mathbf{0}\). The Hamiltonian collapses to

\[\mathcal{H} = V(\mathbf{x}_1,\dots,\mathbf{x}_N)\]

Stable structures are local minima of the PES:

\[\nabla_i V = \mathbf{0} \quad\Longleftrightarrow\quad \mathbf{f}_i = \mathbf{0}\;\;\forall i\]

  • No entropy, no temperature, no kinetics
  • Useful when these can be neglected or added perturbatively
  • Outcome depends strongly on the initial configuration

Energy Minimisation Algorithms

  • Steepest descent: \(\mathbf{x}\leftarrow\mathbf{x} - \alpha \nabla V\) — robust, slow near minima
  • Conjugate gradient: combines current gradient with previous direction — much faster
  • L-BFGS: quasi-Newton, builds an approximate Hessian from gradients
  • FIRE (Bitzek 2006): MD-style with adaptive damping — favourite for solids

All find a local minimum — global optimisation needs additional strategies (basin hopping, simulated annealing, genetic algorithms).
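Steepest descent is simple enough to show in full on a Lennard-Jones dimer: step along the force until it vanishes. A sketch in reduced units (the step size \(\alpha\) and iteration count are conservative illustrative choices):

```python
# Hedged sketch: steepest descent on a Lennard-Jones dimer (eps = sigma = 1).
# The force along the separation coordinate is f = -dV/dr.
def lj_force(r):                 # -d/dr of 4 (r^-12 - r^-6)
    return 24 * (2 * r**-13 - r**-7)

r, alpha = 1.5, 1e-3             # start stretched; small fixed step size
for _ in range(20000):
    r += alpha * lj_force(r)     # move along the force = downhill in V
print(round(r, 4))               # ~ 1.1225 = 2^(1/6), the LJ minimum
```

Note the characteristic weakness: with a fixed step size the convergence is only geometric near the minimum, which is exactly why conjugate gradient, L-BFGS, and FIRE dominate in practice.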

Equation-of-State Fits

Uniformly scale the cell volume, relax atoms at each \(V\), record \(E(V)\).

Fit a third-order Birch-Murnaghan EoS:

\[E(V) = E_0 + \tfrac{9 V_0 B_0}{16}\!\left[\!\left((V_0/V)^{2/3}\!-\!1\right)^{\!3}\! B_0' + \left((V_0/V)^{2/3}\!-\!1\right)^{\!2}\!\!\left(6 - 4 (V_0/V)^{2/3}\right)\right]\]

  • \(V_0\): equilibrium volume
  • \(E_0\): cohesive (minimum) energy
  • \(B_0\): bulk modulus from curvature
  • \(B_0'\): pressure derivative of \(B_0\)
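A useful consistency check on the Birch-Murnaghan form: at \(V = V_0\) the curvature must reproduce the bulk modulus via \(B_0 = V_0\,\partial^2 E/\partial V^2\big|_{V_0}\). A sketch with arbitrary illustrative parameters (not fitted to any material):

```python
# Hedged check: third-order Birch-Murnaghan energy; numerical curvature at
# V0 should return B0 through B0 = V0 * d^2E/dV^2.  Parameters illustrative.
def E_BM(V, E0=0.0, V0=16.0, B0=1.0, B0p=4.0):
    x = (V0 / V) ** (2.0 / 3.0)
    return E0 + 9 * V0 * B0 / 16 * ((x - 1) ** 3 * B0p
                                    + (x - 1) ** 2 * (6 - 4 * x))

V0, h = 16.0, 1e-3
curv = (E_BM(V0 + h) - 2 * E_BM(V0) + E_BM(V0 - h)) / h**2  # central difference
print(round(V0 * curv, 4))   # ~ 1.0 = B0
```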

What Static Simulations Give You

  • Lattice constants, cohesive energies of crystals
  • Point-defect energies (vacancies, interstitials, substitutionals)
  • Surface and grain-boundary energies
  • Elastic constants via quasistatic deformation: \(L_x' = (1+\Delta\epsilon)L_x\), relax, repeat
  • Dislocation cores, fracture nucleation — without thermal fluctuations

Limitation: \(T=0\) only — no entropy, no diffusion, no phase transitions.

Molecular Dynamics

Newton’s Equations for \(N\) Atoms

MD = direct numerical integration of Newton’s equations:

\[m_i \ddot{\mathbf{x}}_i = \mathbf{F}_i = -\nabla_i V(\mathbf{x}_1,\dots,\mathbf{x}_N)\]

  • Trajectory-level realisation of classical statistical mechanics
  • Time average \(\to\) ensemble average via the ergodic hypothesis
  • Generates a Boltzmann-distributed sample if integrated correctly
  • Cost dominated by force evaluations

The Velocity Verlet Integrator

The standard time integrator (Allen & Tildesley 1987, Frenkel & Smit 2002):

\[\mathbf{x}_i(t+\Delta t) = \mathbf{x}_i(t) + \mathbf{v}_i(t)\Delta t + \tfrac{\mathbf{F}_i(t)}{2m_i}\Delta t^2\]

\[\mathbf{F}_i(t+\Delta t) = -\nabla_i V\!\left(\mathbf{x}_1(t+\Delta t),\dots\right)\]

\[\mathbf{v}_i(t+\Delta t) = \mathbf{v}_i(t) + \frac{\mathbf{F}_i(t)+\mathbf{F}_i(t+\Delta t)}{2m_i}\Delta t\]
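The three update equations translate directly into code. A minimal sketch on a 1D harmonic oscillator in reduced units (\(m = k = 1\); time step and run length are illustrative choices), showing the bounded energy error typical of a symplectic integrator:

```python
# Hedged sketch: velocity Verlet for a 1D harmonic oscillator, m = k = 1.
# Energy stays bounded over many thousands of periods -- no systematic drift.
m, k = 1.0, 1.0
x, v = 1.0, 0.0
dt = 0.05
E0 = 0.5 * m * v * v + 0.5 * k * x * x
f = -k * x                                # initial force
for _ in range(100_000):
    x += v * dt + f / (2 * m) * dt * dt   # position update
    f_new = -k * x                        # one force evaluation per step
    v += (f + f_new) / (2 * m) * dt       # velocity update with both forces
    f = f_new
E = 0.5 * m * v * v + 0.5 * k * x * x
print(abs(E - E0) / E0 < 1e-2)            # True: energy error stays bounded
```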

Why Velocity Verlet?

  • Symplectic: preserves phase-space volume, like the exact dynamics (Liouville)
  • Time-reversible: \({\Delta t}\to -{\Delta t}\) retraces the trajectory
  • Second-order accurate in \(\Delta t\)
  • Energy bounded over very long simulations — no systematic drift
  • One force evaluation per step

\(\rightarrow\) The right tool for sampling NVE.

Time Step and Energy Conservation

  • \(\Delta t\) must resolve the fastest motion in the system (typically a bond vibration)
  • Rule of thumb: \(\Delta t \sim T_\text{vib}/20\)
  • Typical values: 1 fs (organic / biomolecular), 2-5 fs (metals), 0.5 fs (with H)
  • Too large \(\Rightarrow\) systematic energy drift — the integrator no longer conserves energy
  • Constrain bond lengths (SHAKE, RATTLE, LINCS) to allow larger \(\Delta t\)

Sampling NVT — Thermostats

Plain Verlet samples NVE. To target NVT, modify velocities or add fictitious dynamics:

  • Velocity rescaling (Berendsen): trivial, but does not generate the canonical distribution
  • Andersen: stochastic velocity reassignment — disrupts dynamics
  • Langevin: adds friction \(-\gamma m \mathbf{v}\) + random force — a heat bath
  • Nosé-Hoover: extended Lagrangian with a fictitious thermostat coordinate — deterministic and canonical
  • Nosé-Hoover chains: chains of thermostats fix ergodicity issues
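The stochastic core of a Langevin thermostat can be shown on a single free particle. The sketch below uses the exact Ornstein-Uhlenbeck velocity update (the "O" step of splitting schemes such as BAOAB) in reduced units \(k_B = m = 1\); all parameter values are illustrative:

```python
import random, math

# Hedged sketch: Ornstein-Uhlenbeck velocity update of a Langevin thermostat
# on a free particle; <v^2> should relax to k_B T / m (reduced units).
random.seed(1)
T, gamma, dt = 2.0, 1.0, 0.01
c = math.exp(-gamma * dt)            # deterministic velocity decay per step
noise = math.sqrt(T * (1 - c * c))   # fluctuation amplitude (balances friction)
v, samples = 0.0, []
for step in range(200_000):
    v = c * v + noise * random.gauss(0, 1)
    if step > 10_000:                # discard equilibration transient
        samples.append(v * v)
v2_mean = sum(samples) / len(samples)
print(round(v2_mean, 1))             # ~ 2.0 = k_B T / m, the canonical value
```

The friction/noise balance encoded in `noise` is the fluctuation-dissipation theorem in action: it is exactly what makes the stationary distribution canonical.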

Sampling NPT — Barostats

To target constant pressure, allow the simulation cell to fluctuate:

  • Berendsen barostat: simple isotropic rescaling of cell + coordinates — fast equilibration, wrong fluctuations
  • Parrinello-Rahman (1981): full anisotropic cell dynamics — needed for solid-solid phase transitions
  • MTK (Martyna-Tobias-Klein): combines Nosé-Hoover thermostat + Parrinello-Rahman barostat
  • Cell shape changes capture lattice strains and martensitic transitions

Common Observables — Structure

Radial distribution function \(g(r)\):

\[g(r) = \frac{V}{4\pi r^2 N^2}\!\left\langle\sum_{i\neq j}\delta(r - r_{ij})\right\rangle\]

  • \(g(r)\to 1\) at large \(r\)
  • Sharp peaks at neighbour shells in solids
  • Broad peaks in liquids; smooth Gaussian-like in dense fluids
  • First peak gives nearest-neighbour distance and coordination
  • Connects to scattering experiments via the structure factor \(S(q)\)

Common Observables — Dynamics and Stress

Mean-square displacement:

\[\text{MSD}(t)=\langle|\mathbf{r}_i(t)-\mathbf{r}_i(0)|^2\rangle\]

Long-time slope \(\to\) diffusion constant

\[D = \lim_{t\to\infty}\frac{\text{MSD}(t)}{6t}\]
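The linear growth of the MSD is easy to reproduce with an unbiased 3D lattice random walk standing in for diffusive atomic motion (walker count, step count, and unit step length are illustrative assumptions):

```python
import random

# Hedged sketch: MSD of an unbiased 3D lattice random walk grows linearly
# in time; with unit steps, MSD = t, so D = MSD/(6t) = 1/6 in these units.
random.seed(2)
steps, walkers = 400, 2000
msd = 0.0
for _ in range(walkers):
    x = y = z = 0
    for _ in range(steps):
        axis = random.randrange(3)       # pick a random axis
        d = random.choice((-1, 1))       # step +1 or -1 along it
        if axis == 0:   x += d
        elif axis == 1: y += d
        else:           z += d
    msd += x * x + y * y + z * z
msd /= walkers                           # ensemble average over walkers
print(round(msd / steps, 2))             # ~ 1.0: MSD grows as t
```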

Virial pressure:

\[P = \frac{N k_B T}{V_\text{box}} + \frac{1}{3 V_\text{box}}\!\left\langle\sum_{i<j}\mathbf{r}_{ij}\!\cdot\!\mathbf{F}_{ij}\right\rangle\]

  • Kinetic + configurational contributions
  • Generalises to a stress tensor for solids

What MD Gives You — and Limits

  • Full dynamical information: diffusion, viscosity, time-correlation functions
  • Non-equilibrium relaxation, shock loading, friction
  • Thermal expansion, melting points, free-energy differences (with extra tools)
  • Limitation: real time, step by step
  • Rare events (chemical reactions, dislocation nucleation) with barriers \(\gg k_B T\) are essentially invisible at accessible timescales
  • Mitigations: metadynamics, accelerated MD, transition path sampling — beyond this lecture

Closing

Unit 4 — Key Takeaways

  • Macroscopic thermodynamics: state variables, four laws, free energies \(U,H,F,G\)
  • Statistical mechanics bridges microstates and macrostates: \(S = k_B \ln \Omega\), \(p_i \propto e^{-E_i/k_B T}\)
  • The canonical partition function \(Z\) encodes all equilibrium thermodynamics: \(F = -k_B T \ln Z\)
  • Ensembles mirror experimental constraints (NVE, NVT, NPT, \(\mu\)VT)
  • Classical atomistic simulation trades electronic detail for huge \(N\) via interatomic potentials
  • MD integrates Newton; thermostats and barostats target the desired ensemble
  • The Boltzmann weight \(e^{-E/k_BT}\) is the link between every ensemble and every simulation method
  • Next: a different way to sample the same Boltzmann distribution — Monte Carlo

Outlook — Unit 5 and Beyond

  • Unit 5: Monte Carlo sampling and continuum mechanics — the rest of the simulation toolkit
  • Unit 6: local atomic environments + universal MLIPs (MACE, M3GNet, CHGNet) — descriptors and the force fields that plug directly back into the MD machinery we built today
  • Unit 7: graph-based crystal representations and GNNs — encoding structures for ML
  • Later units: representation learning, generative models, uncertainty-aware discovery
  • The thermodynamic and statistical-mechanical language of this unit is everywhere in what follows


References