Usage#

As with any other simulation software, getting from a question to an answer requires

  1. Pre-Processing

  2. Running the Simulation

  3. Post-Processing

Since the materialpoint model of DAMASK can be combined with a number of different solvers for initial and boundary value problems, an additional step is necessary beforehand:

  0. Solver Selection

0. Solver Selection#

The table below compares the capabilities and specifics of the three available solvers to help you select the most suitable one for the problem at hand.

| Feature             | Grid Solver           | Mesh Solver       | MSC Marc          |
|---------------------|-----------------------|-------------------|-------------------|
| included in DAMASK  | yes                   | yes               | no                |
| open source         | yes                   | yes               | no                |
| solution method     | FEM or spectral (FFT) | FEM               | FEM               |
| geometry            | regular grid          | unstructured mesh | unstructured mesh |
| boundary conditions | mixed periodic        | tbd               | complex           |

Warning

The mesh solver is under development and not yet feature complete.

1. Pre-Processing#

DAMASK Materialpoint Model#

The materialpoint model of DAMASK is configured through a configuration file in YAML format. Its format is documented in material.yaml. Additional information is given in the video tutorial “Configure a DAMASK Simulation” and the Jupyter notebook on how to “Create Texture for a Dual-Phase Microstructure”.

A set of curated configuration files is available as config.tar.xz.
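For orientation, the following is a minimal sketch of the overall file structure, not a curated configuration: the homogenization and phase names, the elastic constants, and the orientation are illustrative placeholders, and real simulations typically also define a plastic model.

homogenization:
  SX:                                   # single-crystal (pass-through) homogenization
    N_constituents: 1
    mechanical: {type: pass}

phase:
  Aluminum:                             # placeholder phase name
    lattice: cF
    mechanical:
      output: [F, P]
      elastic: {type: Hooke, C_11: 106.75e+9, C_12: 60.41e+9, C_44: 28.34e+9}   # illustrative values in Pa

material:
  - homogenization: SX
    constituents:
      - phase: Aluminum
        v: 1.0                          # volume fraction
        O: [1.0, 0.0, 0.0, 0.0]         # orientation as unit quaternion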

Geometry and Load#

The procedure for generating the geometry and setting boundary and initial conditions depends on the selected solver.

Grid Solver#

A grid solver simulation is set up with a Geometry and a Load Case file. The video tutorials “Define a Grain Structure on a Regular Grid” and “Boundary Conditions on an Infinite Body” explain the underlying ideas of these file formats. Examples are given in the Jupyter notebooks “Generate a Three-Step Load Case for the Grid Solver”, “Create a Polycrystal with Voronoi Tessellation for the Grid Solver”, and “Rescale a Grid Solver Geometry File”.

A complete simulation setup is available as grid.tar.xz.
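As a point of reference, the following is a minimal sketch of a uniaxial tension load case in the YAML load case format of the grid solver; the deformation rate, step duration, and number of increments are illustrative values only.

solver:
  mechanical: spectral_basic            # basic spectral solver for the mechanical field

loadstep:
  - boundary_conditions:
      mechanical:
        dot_F: [[1.0e-3, 0, 0],         # rate of deformation gradient; x: component prescribed via P instead
                [0,      x, 0],
                [0,      0, x]]
        P:     [[x, x, x],              # complementary first Piola-Kirchhoff stress components
                [x, 0, x],
                [x, x, 0]]
    discretization:
      t: 60                             # duration of the load step in s
      N: 120                            # number of increments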

Mesh Solver#

A mesh solver simulation is set up with a Geometry and a Load Case file.

A complete simulation setup is available as mesh.tar.xz.

MSC Marc#

An MSC Marc input deck (*.dat) can be generated manually using a text editor or with Marc Mentat, where *.proc files can be used for automation. DAMASK is interfaced to MSC Marc through a hypela2 user subroutine. The link between the geometry in the input deck and the material ID in material.yaml is provided via the StateVariable 2 field.

Note

Material IDs in DAMASK are zero-based, i.e. the first entry in the material list of material.yaml has ID 0.

A complete simulation setup is available as Marc.tar.xz.

2. Running the Simulation#

Each solver is an executable file and can therefore be invoked directly from the command line. The specifics of how to start a simulation, such as the command line arguments or the way to parallelize, depend on the selected solver.

Grid Solver#

The basic call to launch the grid solver requires as mandatory arguments the simulated Geometry, corresponding Load Case, and material configuration.

DAMASK_grid --load {load}.yaml --geom {grid}.vti --material {material}.yaml

Optional arguments are --numerics, --jobname, --workingdirectory, and --restart.

To parallelize the grid solver with n_threads OpenMP threads, set the environment variable OMP_NUM_THREADS accordingly.

export OMP_NUM_THREADS={n_threads}
DAMASK_grid --load {load}.yaml --geom {grid}.vti --material {material}.yaml

Parallelization over n_proc MPI processes can be accomplished with

mpiexec -n {n_proc} DAMASK_grid --load {load}.yaml --geom {grid}.vti --material {material}.yaml

and will decompose the simulation domain along the z-axis into n_proc layers of approximately equal height.

Note

MPI and OpenMP parallelization can be used concurrently, as sketched below.
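For example, a hybrid run could look as follows; note that, depending on the MPI implementation, OMP_NUM_THREADS may need to be forwarded explicitly to the MPI processes (e.g. via an option of mpiexec).

export OMP_NUM_THREADS={n_threads}
mpiexec -n {n_proc} DAMASK_grid --load {load}.yaml --geom {grid}.vti --material {material}.yaml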

Mesh Solver#

The basic call to launch the mesh solver requires as arguments the simulated Geometry, corresponding Load Case, and material configuration.

DAMASK_mesh --load {load}.load --geom {mesh}.msh --material {material}.yaml

Optional arguments are --numerics, --jobname, --workingdirectory, and --restart.

To parallelize the mesh solver with n_threads OpenMP threads, set the environment variable OMP_NUM_THREADS accordingly.

export OMP_NUM_THREADS={n_threads}
DAMASK_mesh --load {load}.load --geom {mesh}.msh --material {material}.yaml

Parallelization over n_proc MPI processes can be accomplished with

mpiexec -n {n_proc} DAMASK_mesh --load {load}.load --geom {mesh}.msh --material {material}.yaml

and will decompose the simulation domain into partitions with approximately equal numbers of elements.

Note

MPI and OpenMP parallelization can be used concurrently.

MSC Marc#

DAMASK is integrated into the commercial FEM software MSC Marc as a user subroutine hypela2 via DAMASK_Marc.f90.

Simulations can be started from the JOBDAMASK menu, which is added to Marc Mentat during installation.

Alternatively, the DAMASK Python library contains a small wrapper that assembles the execution call and launches the MSC Marc executable.

import damask

s = damask.solver.Marc()
s.submit_job(model=modelname, job=jobname)   # modelname/jobname: placeholders for the MSC Marc model and job names

The mandatory material configuration must be available in the current working directory as material.yaml; the optional numerics configuration, if used, must be available there as numerics.yaml.

3. Post-Processing#

DAMASK results are stored in an HDF5-based file format. Usage examples are given in the video tutorials “Get a Custom View and Add Derived Quantities”, “Spatially-Resolved Visualization in Paraview”, and “Data Analysis: Using Matplotlib and Pandas” as well as in the Jupyter notebooks “Add Derived Field Data”, “Density Plot with Pandas”, “Plot Data per Grain with Scatter”, and “Calculate r-Value”.
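As a starting point, a typical post-processing session with the DAMASK Python library might look like the following sketch; {job}.hdf5 is a placeholder for the result file written by the solver, and the exact method names (e.g. export_VTK vs. save_VTK) can differ between DAMASK releases.

import damask

r = damask.Result('{job}.hdf5')     # open the simulation results
r.add_stress_Cauchy()               # derive Cauchy stress 'sigma' from P and F
r.add_strain()                      # derive a strain measure from F
r.add_equivalent_Mises('sigma')     # von Mises equivalent of the Cauchy stress
r.export_VTK()                      # write VTK files for visualization in ParaView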