
MD Seminar 2019 Lecture Note (M. Matsumoto): p. 76

5.8 Open Source Codes for Molecular Simulation

You can, of course, write your own simulation code and tools yourself. But several packages are available, either as open-source or commercial software. Some examples are given below.

5.8.1 LAMMPS

For details, see https://lammps.sandia.gov/

This code (well, actually a group of many codes) is a classical molecular dynamics code with a focus on materials modeling. It’s an acronym for Large-scale Atomic/Molecular Massively Parallel Simulator.

1.1. Overview of LAMMPS

LAMMPS is a classical molecular dynamics (MD) code that models ensembles of particles in a liquid, solid, or gaseous state. It can model atomic, polymeric, biological, solid-state (metals, ceramics, oxides), granular, coarse-grained, or macroscopic systems using a variety of interatomic potentials (force fields) and boundary conditions. It can model 2d or 3d systems with only a few particles up to millions or billions.

LAMMPS can be built and run on a laptop or desktop machine, but is designed for parallel computers. It will run on any parallel machine that supports the MPI message-passing library. This includes shared-memory boxes and distributed-memory clusters and supercomputers. LAMMPS is written in C++. Earlier versions were written in F77 and F90. See the History page of the website for details. All versions can be downloaded from the LAMMPS website.

LAMMPS is designed to be easy to modify or extend with new capabilities, such as new force fields, atom types, boundary conditions, or diagnostics. See the Modify doc page for more details.

In the most general sense, LAMMPS integrates Newton’s equations of motion for a collection of interacting particles. A single particle can be an atom or molecule or electron, a coarse-grained cluster of atoms, or a mesoscopic or macroscopic clump of material. The interaction models that LAMMPS includes are mostly short-range in nature; some long-range models are included as well. On parallel machines, LAMMPS uses spatial-decomposition techniques to partition the simulation domain into small sub-domains of equal computational cost, one of which is assigned to each processor. Processors communicate and store “ghost” atom information for atoms that border their sub-domain.


5.8.2 GROMACS

For details, see http://www.gromacs.org/

Another popular package for molecular simulations is GROMACS.

About Gromacs

GROMACS is a versatile package to perform molecular dynamics, i.e. simulate the Newtonian equations of motion for systems with hundreds to millions of particles. It is primarily designed for biochemical molecules like proteins, lipids and nucleic acids that have a lot of complicated bonded interactions, but since GROMACS is extremely fast at calculating the nonbonded interactions (that usually dominate simulations) many groups are also using it for research on non-biological systems, e.g. polymers.

GROMACS supports all the usual algorithms you expect from a modern molecular dynamics implementation (check the online reference or manual for details), but there are also quite a few features that make it stand out from the competition: GROMACS provides extremely high performance compared to all other programs. A lot of algorithmic optimizations have been introduced in the code; we have for instance extracted the calculation of the virial from the innermost loops over pairwise interactions, and we use our own software routines to calculate the inverse square root. In GROMACS 4.6 and up, on almost all common computing platforms, the innermost loops are written in C using intrinsic functions that the compiler transforms to SIMD machine instructions, to utilize the available instruction-level parallelism. These kernels are available in either single or double precision, and support all the different kinds of SIMD instructions found in x86-family (and other) processors.

5.8.3 Visualization tools: VMD and Jmol

For details on VMD, see https://www.ks.uiuc.edu/Research/vmd/

For details on Jmol, see http://jmol.sourceforge.net/

Example of a VMD image. Example of a Jmol image.


6 Various Methods for Particle Simulations

The method described so far is called “molecular dynamics simulation”. It belongs to the category of “particle simulations50)”, which includes various other types. In this section, several examples of particle methods are introduced. Refer to textbooks51) for more details.

6.1 BD method

Consider a colloidal system, in which organic or inorganic particles of typical size 0.1−10 µm are dispersed in an aqueous solution. When you observe the particles, they show random motions due to the thermal fluctuations of the small molecules surrounding them; this is Brownian motion. If you are interested only in the dynamics of the colloidal particles, you can assume a simpler form of the equation of motion (often called a “Langevin equation”) for each particle, such as

m d²r/dt² = −∇U − γv + Fr    (6-55)

where −∇U is the external force acting on the particle, −γv is the friction force proportional to the particle velocity, and Fr is the random force from the surrounding small molecules. Motions of colloidal particles can be traced by numerically integrating Eq. (6-55), which is known as Brownian dynamics (BD) simulation52). The basic concept is “separation of scales”: small molecules exhibit very fast motions and short-ranged interactions, with typical scales of ps and nm, while macromolecules have slower (e.g., µs) dynamics with longer-ranged (e.g., µm) interactions. Thus we can smear out the dynamics of the small molecules.

50) In some fields of numerical simulation, “particle simulations” are a special type of “mesh-free” (or meshless) scheme.
51) On the MPS method, see e.g. 越塚誠一 (S. Koshizuka), 粒子法 (Particle Methods, Computational Mechanics Lecture Series), Maruzen, 2005. On the BD and DPD methods, see e.g. 佐藤明 (A. Satoh), HOW TO 分子シミュレーション (How to Molecular Simulation: Molecular Dynamics, Monte Carlo, Brownian Dynamics, and Dissipative Particle Dynamics), Kyoritsu Shuppan, 2004.

52) Some details are described in 熱物理工学 (Thermal Science and Engineering).


6.2 DPD method

The dissipative particle dynamics (DPD, 散逸粒子動力学) method53) is another type of coarse-grained (粗視化) scheme; we roughly trace the motion of “particle clusters” instead of each molecule. The governing equation of motion is similar to Eq. (6-55), but each “particle” essentially represents a cluster of molecules. The degree of coarse graining varies from several molecules per particle to ∼10⁴ molecules per particle. DPD simulation has recently become popular in the fields of polymer science and bio/medical science.

Reference: from the COMTEC page of Daikin Industries, Electronic Systems Division (translated): http://www.comtec.daikin.co.jp/SC/prd/ms/dpd.html

DPD is a simulation tool for complex-fluid research related to the development of paints, pharmaceuticals, cosmetics, and controlled-release drugs. It makes it possible to predict equilibrium states, as well as the structural and dynamic properties of shear and narrow-gap flows, at time and length scales beyond the reach of conventional atomistic simulation.

DPD simulates fluid phases whose density is uniform but whose composition varies. The composition dependence comes from pairwise repulsive potentials between the beads that constitute the system. The repulsive force depends on the nature of the colliding beads, and even small differences in the forces between beads can sometimes produce systems with very complex morphologies. Because thermal noise is included, the system reaches equilibrium very quickly. Polymer species are treated as chains of beads connected by harmonic springs; a chain can contain more than one kind of bead (e.g., a block copolymer), and its architecture can have branches and complex connectivity. Since all forces are short-ranged, the algorithm is fast and users can compute larger systems.

(Oil-water-surfactant system) The blue spheres are DPD water particles, the red chains are high-molecular-weight alkanes, and pink and green represent the surfactant tails and heads. The left image shows the initial configuration, with oil added to a water-surfactant system. At first the alkane separates strongly from the water (high interfacial tension), and the surfactant exists as loose cluster-like micelles within the water phase: the polar heads (green) point toward the water, and the lipophilic tails (pink) sit inside the micelles. The right image shows the same system after time has elapsed; the surfactant has migrated to the oil-water interface. The system finds its lowest free energy by lowering the interfacial tension: the hydrophobic surfactant tails point toward the oil, while the hydrophilic groups prefer to remain in the water phase.

Computed properties:
• phase morphology • distribution of chain end-to-end distances • distribution of chain bond lengths • stress tensor • surface tension • critical micelle concentration
• aggregation and coagulation • effect of confinement on miscibility • effect of shear on morphology • density profiles • diffusivities of multiple species

53) R. D. Groot, P. B. Warren, “Dissipative Particle Dynamics: Bridging the Gap between Atomistic and Mesoscopic Simulation,” J. Chem. Phys., 107 (1997) 4423–4435.


6.3 Particle Hydrodynamic Simulation: SPH vs. MPS

When you numerically treat complicated shapes or boundaries in continuum mechanics, it is very time-consuming to generate appropriate meshes or lattices; thus mesh-free schemes are promising. The smoothed particle hydrodynamics (SPH) method54) is based on the idea that a material (solid, liquid, or sometimes gas) is represented as an assembly of many “fictitious particles”. Each particle is assumed to obey a Newtonian equation of motion derived from the original continuum mechanics. Properties (e.g., density, temperature, and pressure/stress) at each spatial point are evaluated as a “smoothed” average over the particles. For details, refer to the references55).

The concept of the moving particle simulation (MPS) method is similar to that of SPH, but it uses a different type of weighting function56).

Figure 6-44: Example of an SPH simulation. From http://www.sph-flow.com/validation fluid-impact.html

54) J. J. Monaghan, “Simulating Free Surface Flows with SPH,” J. Computational Phys., 110 (1994) 399–406.
55) For an introductory explanation, see e.g. http://www.ccs.tsukuba.ac.jp/Astro/Members/takashi/pub/sph.pdf ; a consortium also exists: http://www.sph-flow.com/index.html
56) 越塚誠一 (S. Koshizuka), 粒子法 (Particle Methods, Computational Mechanics Lecture Series), Maruzen, 2005.


6.4 Stochastic Approach: Monte Carlo methods

All of the above schemes are a “deterministic (決定論的)” approach57), which means that the particles obey some sort of equations of motion. In some situations, however, we are mainly concerned with the particle distribution rather than the instantaneous position of each particle. For that purpose, various types of Monte Carlo (MC) methods have been developed as a “stochastic (確率論的)” approach, where we are not interested in the trajectory of each particle; the probability distribution is the main output.

1. Metropolis MC58): Particles are randomly moved, and each move is accepted with a probability proportional to the Boltzmann factor exp[−ΔE/kBT], where ΔE is the energy difference between the configurations before and after the move. The system approaches a macroscopic equilibrium state with temperature T after a sufficient number of moves.

2. DSMC (Direct Simulation MC): Originally developed for rarefied gas dynamics (希薄気体力学); the Boltzmann transport equation for the distribution function is solved stochastically.

57) In the BD and DPD schemes too, the random forces play an important role; in this sense, BD and DPD may be categorized as stochastic approaches.
58) Some details are given in 熱物理工学 (Thermal Science and Engineering) and also in 熱物性論 (Thermophysics for Thermal Engineering).


7 MD Simulation in Research: Thermo-Fluidal Systems

7.1 Why nano scale?

• Spatial reasons

• Temporal reasons

7.2 Method

• Quantum systems

– Monte Carlo method

– Molecular dynamics method (ab initio MD, first-principles MD)

– Various hybrid schemes (e.g., QM/MM)

• Classical systems

– Monte Carlo method

– Molecular dynamics method

– Mesoscale simulation

• Multi-scale, multi-physics simulations

7.3 Examples from my research fields

1. Phase equilibria

2. Phase change: Boiling, Condensation, Cavitation

3. Bubbles and Droplets

4. Surface instability

5. Thermal resistance

6. and many others...

7.4 Limitation of molecular simulations

• System size and time

• Simulation cost


Contents

1 Equations of Motion: How to Integrate Them Numerically  4
   1.1 Equations of motion  4
   1.2 Numerical integration – Naïve method  5
   1.3 Example: Harmonic oscillator  6
   1.4 Elaborated algorithm  8

2 Prototype MD Program  13
   2.1 Atomic Interaction Models  13
   2.2 Pair-wise Interaction  16
   2.3 Essence of MD Program  19
   2.4 MD Code, ver. 1  19
   2.5 Let’s Try: Execution of the Code on Windows  24
      2.5.1 Executing C programs  24
      2.5.2 Visualizing data with gnuplot  24

3 Statistical Mechanics for MD Simulation  29
   3.1 Temperature  29
      3.1.1 Evaluating T  29
      3.1.2 Controlling T  30
   3.2 Boundary Conditions  33
   3.3 Pressure  35
      3.3.1 Thermodynamic expression of P  36

4 Data Analysis: Basics  42
   4.1 Thermodynamic States  42
   4.2 Static Analysis  43
      4.2.1 Thermophysical properties  43
      4.2.2 Fluctuations  44
      4.2.3 Structure  44
   4.3 Dynamic Analysis  47
      4.3.1 Transport properties: direct evaluation  48
      4.3.2 Autocorrelation functions  48

5 Advanced Topics  53
   5.1 Bottleneck is the FORCE  53
   5.2 From physical viewpoints  54
      5.2.1 Potential cut-off  54
      5.2.2 Book-keeping method  54
      5.2.3 Cell-division method  55
   5.3 From mathematical viewpoints  56
      5.3.1 Systematic time integration based on symplectic transformation  56
      5.3.2 Multi time-step MD  59
   5.4 From computer’s viewpoints  60
   5.5 Parallel computing 1: Hardware side  61
      5.5.1 Examples of distributed memory systems  61
      5.5.2 Examples of shared memory system  62
      5.5.3 Homogeneous vs. heterogeneous  64
   5.6 Parallel computing 2: Software side  65
      5.6.1 Let’s try OpenMP —Brief introduction for parallel computing beginners—  66
   5.7 Parallel Code of MD Simulation  69
      5.7.1 Amdahl’s law  69
      5.7.2 Particle parallel vs. Region parallel  70
      5.7.3 Sample code  70
   5.8 Open Source Codes for Molecular Simulation  76
      5.8.1 LAMMPS  76
      5.8.2 GROMACS  77
      5.8.3 Visualization tools: VMD and Jmol  77

6 Various Methods for Particle Simulations  78
   6.1 BD method  78
   6.2 DPD method  79
   6.3 Particle Hydrodynamic Simulation: SPH vs. MPS  80
   6.4 Stochastic Approach: Monte Carlo methods  81

7 MD Simulation in Research: Thermo-Fluidal Systems  82
   7.1 Why nano scale?  82
   7.2 Method  82
   7.3 Examples from my research fields  82
   7.4 Limitation of molecular simulations  82