# GridapPETSc

GridapPETSc is a plugin of GridapDistributed.jl that provides the full set of scalable linear and nonlinear solvers in the PETSc library. It also provides serial solvers for Gridap.jl.
## Documentation
Take a look at this tutorial to learn how to use GridapPETSc in distributed-memory simulations of PDEs. GridapPETSc can also be used in serial computations, as shown in this test.
## Installation
The GridapPETSc Julia package requires the PETSc library (Portable, Extensible Toolkit for Scientific Computation) and MPI to work correctly. You have two main options to install these dependencies.

- Do nothing [recommended in most cases]. Use the default precompiled MPI installation provided by MPI.jl and the precompiled PETSc library provided by PETSc_jll. This happens under the hood when you install GridapPETSc. You can also force the installation of these default dependencies by setting the environment variable JULIA_PETSC_LIBRARY to an empty value.
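For instance, forcing the default precompiled dependencies amounts to clearing the environment variable before installing the package. The following is a minimal sketch for a POSIX shell; the `julia` invocation shown in the comment is the usual Pkg-based installation:

```shell
# Force GridapPETSc to use the default precompiled MPI and PETSc
# binaries (provided by MPI.jl and PETSc_jll) by setting the
# variable to an empty value.
export JULIA_PETSC_LIBRARY=""

# Then install the package as usual, e.g.:
#   julia -e 'using Pkg; Pkg.add("GridapPETSc")'
```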
- Choose a specific installation of MPI and PETSc available on the system [recommended in HPC clusters].
  - First, choose an MPI installation. See the documentation of MPI.jl for further details.
  - Second, choose a PETSc installation. To this end, create an environment variable JULIA_PETSC_LIBRARY containing the path to the dynamic library object of the PETSc installation (i.e., the .so file on Linux systems). Very important: the chosen PETSc library needs to be configured with the MPI installation considered in the previous step.
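Concretely, the second step amounts to pointing JULIA_PETSC_LIBRARY at the PETSc shared library. The path below is only a hypothetical example; replace it with the actual location of the library on your system:

```shell
# Point GridapPETSc to a system PETSc installation.
# NOTE: the path below is a hypothetical example; use the actual
# location of libpetsc.so on your cluster, and make sure this PETSc
# build was configured against the MPI installation chosen above.
export JULIA_PETSC_LIBRARY="/opt/petsc/lib/libpetsc.so"
```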
## Notes
- The default sparse matrix format in GridapPETSc is 0-based compressed sparse row. This sparse matrix storage format is described by the SparseMatrixCSR{0,PetscReal,PetscInt} and SymSparseMatrixCSR{0,PetscReal,PetscInt} Julia types, as implemented in the SparseMatricesCSR Julia package.
- When running in MPI parallel mode (i.e., with an MPI communicator different from MPI.COMM_SELF), GridapPETSc implements a sort of limited garbage collector in order to automatically deallocate PETSc objects. This garbage collector can be triggered manually by a call to the function GridapPETSc.gridap_petsc_gc(). GridapPETSc automatically calls this function at different strategic points, which is sufficient for most applications. However, for applications that allocate PETSc objects very frequently, it might be necessary to call this function from application code. This need is signaled by PETSc via the internal error message `PETSC ERROR: No more room in array, limit 256 recompile src/sys/objects/destroy.c with larger value for MAXREGDESOBJS`.
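The manual trigger described above can be sketched as follows. This is a hypothetical usage pattern, not code from the package: the loop body and the collection frequency are placeholders, and only GridapPETSc.gridap_petsc_gc() is taken from the documentation above.

```julia
using GridapPETSc

# Hypothetical sketch: a long-running application that allocates
# PETSc objects inside a loop. The loop body is a placeholder; the
# point is the periodic call to GridapPETSc.gridap_petsc_gc().
GridapPETSc.with() do
  for step in 1:1000
    # ... set up and solve PETSc-backed systems here ...

    # Periodically trigger GridapPETSc's garbage collector so that
    # unused PETSc objects are deallocated before PETSc's internal
    # object registry fills up (the MAXREGDESOBJS limit).
    if step % 100 == 0
      GridapPETSc.gridap_petsc_gc()
    end
  end
end
```

This sketch requires a working PETSc and MPI installation to run; the chosen frequency (every 100 iterations) is arbitrary and should be tuned to how often the application allocates PETSc objects.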