PartMC

PartMC: Particle-resolved Monte Carlo code for atmospheric aerosol simulation

Version 2.8.0
Released 2024-02-23

Source: https://github.com/compdyn/partmc

Homepage: http://lagrange.mechse.illinois.edu/partmc/

Cite as: M. West, N. Riemer, J. Curtis, M. Michelotti, and J. Tian (2024) PartMC, version 2.8.0, DOI

Copyright (C) 2005-2024 Nicole Riemer and Matthew West
Portions copyright (C) Andreas Bott, Richard Easter, Jeffrey Curtis, Matthew Michelotti, and Jian Tian
Licensed under the GNU General Public License version 2 or (at your option) any later version.
For details see the file COPYING or http://www.gnu.org/licenses/old-licenses/gpl-2.0.html.

Running PartMC with Docker

This is the fastest way to get PartMC running: the Docker image ships with a pre-compiled partmc executable (see the directory layout below), so no local compilation is needed.
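
A minimal sketch of a docker run invocation, assuming the image is published as compdyn/partmc (the image name and the spec file name run.spec are assumptions, not taken from this document):

     docker run -it --rm -v ${PWD}:/run compdyn/partmc /build/partmc /run/run.spec

In this command, -it attaches an interactive terminal, --rm removes the container when it exits, -v ${PWD}:/run mounts the current host directory at /run inside the container (the default run directory, see below), /build/partmc is the compiled executable inside the image, and /run/run.spec is the spec file to run from the mounted directory.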

The directory structure inside the docker container is:

/partmc           # a copy of the partmc git source code repository
/build            # the directory in which partmc was compiled
/build/partmc     # the compiled partmc executable
/run              # the default directory to run in

Dependencies

Required dependencies: cmake, a Fortran compiler, and NetCDF (including its Fortran interface).

Optional dependencies: MOSAIC, SUNDIALS, GSL, and MPI (see the Installation section below for the corresponding environment variables and ccmake options).
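
As one way to obtain the required pieces, on a Debian/Ubuntu system the following packages could be installed (the package names are assumptions about that distribution, not taken from this document):

     sudo apt-get install gfortran cmake libnetcdf-dev libnetcdff-dev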

Installation

  1. Install cmake and NetCDF (see above). The NetCDF libraries are required to compile PartMC. The netcdf.mod Fortran 90 module file is required, and it must be produced by the same compiler being used to compile PartMC.

  2. Unpack PartMC:

     tar xzvf partmc-2.8.0.tar.gz
    
  3. Change into the main PartMC directory (where this README file is located):

     cd partmc-2.8.0
    
  4. Make a directory called build and change into it:

     mkdir build
     cd build
    
  5. If desired, set environment variables to indicate the install locations of supporting libraries. If running echo $SHELL indicates that you are running bash, then you can do something like:

     export NETCDF_HOME=/
     export MOSAIC_HOME=${HOME}/mosaic-2012-01-25
     export SUNDIALS_HOME=${HOME}/opt
     export GSL_HOME=${HOME}/opt
    

    Of course the exact directories will depend on where the libraries are installed. You only need to set variables for libraries installed in non-default locations, and only for those libraries you want to use. Everything except NetCDF is optional.

    If echo $SHELL instead reports tcsh or similar, then set the environment variables with setenv, for example setenv NETCDF_HOME /, and similarly for the others.

  6. Run ccmake with the main PartMC directory as an argument (note the double c):

     ccmake ..
    
  7. Inside ccmake press c to configure, edit the values as needed, press c again, then g to generate. Optional libraries can be activated by setting the corresponding ENABLE_* variable to ON. For a parallel build, toggle advanced mode with t and set CMAKE_Fortran_COMPILER to mpif90, then reconfigure. A non-interactive cmake equivalent is sketched after this list.

  8. Optionally, enable compiler warnings by pressing t inside ccmake to enable advanced options and then setting CMAKE_Fortran_FLAGS to:

     -O2 -g -fimplicit-none -W -Wall -Wconversion -Wunderflow -Wimplicit-interface -Wno-compare-reals -Wno-unused -Wno-unused-parameter -Wno-unused-dummy-argument -fbounds-check
    
  9. Compile PartMC and test it as follows.

     make
     make test
    
  10. To run just a single test, do something like:

     ctest -R bidisperse   # argument is a regexp for test names
    
  11. To see what make is doing, run it like:

    VERBOSE=1 make
    
  12. To run tests with visible output, or to make some plots from the tests, run them as follows. Note that tests often rely on earlier tests in the same directory, so always run test_1, then test_2, etc. Tests occasionally fail due to random sampling, so re-run the entire sequence after a failure. For example:

    cd test_run/emission
    ./test_emission_1.sh
    ./test_emission_2.sh
    ./test_emission_3.sh            # similarly for other tests
    gnuplot -persist plot_species.gnuplot # etc...
    
  13. To run full scenarios, do, for example:

    cd ../scenarios/1_urban_plume
    ./1_run.sh
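
As noted in step 7, the interactive ccmake session can be replaced by a single non-interactive cmake call run from inside the build directory created in step 4. A minimal sketch, assuming the MOSAIC option is named ENABLE_MOSAIC (the option name is an assumption; CMAKE_Fortran_COMPILER=mpif90 is the parallel-build setting from step 7):

     cmake -D ENABLE_MOSAIC=ON -D CMAKE_Fortran_COMPILER=mpif90 ..
     make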
    

Usage

The main partmc command reads .spec files and does the run specified therein. It can perform particle-resolved runs, sectional-code runs, or exact-solution runs. A run produces one NetCDF file per output timestep, containing per-particle data (from particle-resolved runs) or binned data (from sectional or exact runs). The extract_* programs read these per-timestep NetCDF files and output ASCII data (the extract_sectional_* programs are used for sectional and exact model output).
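
As a sketch of this workflow, run from a scratch directory next to build (the spec file name is a placeholder, not taken from this document):

     # run the scenario described by a spec file; one NetCDF file is
     # written per output timestep
     ../build/partmc run_part.spec
     # the resulting NetCDF files can then be converted to ASCII with
     # the extract_* programs built in the same build directory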

Python bindings

The PyPartMC project offers pip-installable Python bindings to PartMC. Both source and binary packages are available and ship with all PartMC dependencies included. PyPartMC exposes internal components of PartMC (utility routines and derived types) which can then serve as building blocks for developing PartMC simulations in Python. Time stepping can be performed either with the internal PartMC time stepper or externally within a Python loop; the latter allows the simulation to be coupled with external Python components at each timestep. PyPartMC features examples developed as Jupyter notebooks, and snippets in its README show how to use PyPartMC from Julia (using PyCall.jl) and Matlab (using Matlab's built-in Python bridge).
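
For example, installing the bindings is a single pip command (a sketch; see the PyPartMC project for current instructions):

     pip install PyPartMC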