PETSc
Notes
1. PETSc has been compiled with two different MPI implementations (openMPI and mpich2). The initial request was to use the mpich2 implementation; however, you may use whichever version works best for you.
External Links:
http://www.mcs.anl.gov/petsc/petsc-2/documentation/installation.html
http://www.mcs.anl.gov/petsc/petsc-2/download/index.html
Documentation: Tutorials
http://www.mcs.anl.gov/petsc/petsc-2/documentation/tutorials/index.html
Comments
Portable, Extensible Toolkit for Scientific Computation
PETSc, pronounced PET-see (the S is silent), is a suite of data structures and routines for the scalable (parallel) solution of scientific applications modeled by partial differential equations. It employs the MPI standard for parallelism.
The installed version of PETSc is 3.2, released September 8, 2011.
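A quick way to see the MPI-based parallelism in action is to build and run one of the bundled SNES tutorial examples (ex19, the same example exercised by "make test" below). The lines below are an illustrative sketch only: they assume the petsc and openMPI modules documented in the sections below are loaded and that PETSC_DIR and PETSC_ARCH are set (e.g. by the petsc module) to match the installed build.
module load petsc/3.2p3-gnu
module load mpi/openMPI/1.4.3-gnu
cd $PETSC_DIR/src/snes/examples/tutorials
make ex19                 # built through PETSc's own makefile rules
mpirun -np 2 ./ex19       # run the example on 2 MPI processes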
PETSc with gnu compilers and openMPI
PETSc - gnu version (compiled using gnu compilers, the ATLAS BLAS/LAPACK libraries and the OpenMPI implementation)
Usage
module load petsc/3.2p3-gnu
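To build your own code against this installation, the OpenMPI compiler wrappers can be used directly. This is a minimal sketch, not the definitive recipe: my_solver.c is a placeholder for your own source file, PETSC_DIR and PETSC_ARCH are assumed to be set (by the module or by hand) to the directory and arch shown on this page, and a statically linked build may need extra libraries (BLAS/LAPACK, X11) on the link line; PETSc's makefile rules under $PETSC_DIR/conf add these automatically.
module load petsc/3.2p3-gnu
module load mpi/openMPI/1.4.3-gnu
# my_solver.c is a placeholder for your own PETSc source file
mpicc my_solver.c -o my_solver \
    -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include \
    -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc
mpirun -np 4 ./my_solver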
Installation Directory
/sw/petsc/petsc-3.2-p3
Compilation
mkdir /sw/petsc/
tar -zxvf /data1/petsc-3.2-p3.tar.gz
mv petsc-3.2-p3 3.2-p3-gnu
cd /sw/petsc/3.2-p3-gnu
export PETSC_DIR=$PWD PETSC_ARCH=linux-gnu
module load ATLAS/3.9.39
module load mpi/openMPI/1.4.3-gnu
module load python/2.7.1
module load cuda/4.0
./config/configure.py --with-vendor-compilers=0 --with-mpi-dir=/sw/openMPI/1.4.3-gnu/ --with-blas-lapack-dir=/sw/ATLAS/3.9.39/ 2>&1 | tee configure.txt
make 2>&1 | tee make.txt
make test 2>&1 | tee make_test.txt

Output of "make test 2>&1 | tee make_test.txt":
Running test examples to verify correct installation
Using PETSC_DIR=/sw/petsc/3.2-p3-gnu and PETSC_ARCH=arch-linux2-c-debug
C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI process
C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI processes
Fortran example src/snes/examples/tutorials/ex5f run successfully with 1 MPI process
Completed test examples

make install 2>&1 | tee make_Install.txt

Output of "make install 2>&1 | tee make_Install.txt":
*** using PETSC_DIR=/sw/petsc/3.2-p3-gnu PETSC_ARCH=arch-linux2-c-debug ***
Install directory is current directory; nothing needs to be done
PETSc with gnu compilers and mpich2
Installation directory
/sw/petsc/3.2-p3-gnu-mpich2
Usage
module load petsc/3.2p3-gnu-mpich2
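Usage follows the same pattern as the OpenMPI example above; only the MPI module and the launcher change. A hedged sketch, again assuming PETSC_DIR and PETSC_ARCH are set to match this build:
module load petsc/3.2p3-gnu-mpich2
module load mpi/mpich2/2.1.4.1p1-gnu
cd $PETSC_DIR/src/snes/examples/tutorials
make ex19
mpiexec -np 2 ./ex19      # MPICH2 jobs are normally launched with mpiexec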
Compilation
export PETSC_DIR=$PWD PETSC_ARCH=linux-gnu
module load ATLAS/3.9.39
module load mpi/mpich2/2.1.4.1p1-gnu
module load python/2.7.1
module load cuda/4.0
./config/configure.py --with-vendor-compilers=0 --with-mpi-dir=/sw/mpi/mpich2/2.1.4.1p1-gnu/ --with-blas-lapack-dir=/sw/ATLAS/3.9.39/ 2>&1 | tee configure.txt
make 2>&1 | tee make.txt

Now, to check if the libraries are working, do:
make PETSC_DIR=/sw/petsc/3.2-p3-gnu-mpich2 PETSC_ARCH=arch-linux2-c-debug test
make test 2>&1 | tee make_test.txt
PETSc with gnu compilers and mvapich2
Installation directory
/sw/petsc/3.2-p3-gnu-mvapich2
Usage
module load petsc/3.2p3-gnu-mvapich2
Compilation
cd /sw/petsc/3.2-p3-gnu-mvapich2
tar -zxvf /data1/petsc-3.2-p3.tar.gz
export PETSC_DIR=$PWD PETSC_ARCH=linux-gnu
module load ATLAS/3.9.39
module load mpi/mvapich2/2.1rc2
module load python/2.7.1
module load cuda/4.0
./config/configure.py --with-vendor-compilers=0 --with-mpi-dir=/sw/mpi/mvapich2/2.1rc2/ --with-blas-lapack-dir=/sw/ATLAS/3.9.39 2>&1 | tee configure.txt
make 2>&1 | tee make.txt
make test 2>&1 | tee make_test.txt
Version petsc-3.3-p1
module load mpi/mpich2/2.1.4.1p1-gnu
#mkdir -p /sw/petsc/petsc-3.3-p1-gnu-mpich2
cd /sw/petsc/petsc-3.3-p1-gnu-mpich2
export PETSC_DIR=$PWD PETSC_ARCH=linux-gnu
module load ATLAS/3.9.39
module load mpi/mpich2/2.1.4.1p1-gnu
module load python/2.7.1
module load cuda/4.0
./config/configure.py --with-vendor-compilers=0 --with-mpi-dir=/sw/mpi/mpich2/2.1.4.1p1-gnu/ --with-blas-lapack-dir=/sw/ATLAS/3.9.39/ --with-clanguage=cxx 2>&1 | tee configure.txt
make 2>&1 | tee make.txt

Now, to check if the libraries are working, do:
make PETSC_DIR=/sw/petsc/petsc-3.3-p1-gnu-mpich2 PETSC_ARCH=arch-linux2-cxx-debug test
make test 2>&1 | tee make_test.txt
Configure output (summary):
===============================================================================
Cannot use --with-clanguage=C++ with CMake version 2.6-patch 4 < 2.8, falling back to legacy build
===============================================================================
TESTING: alternateConfigureLibrary from PETSc.packages.mpi4py(/sw/petsc/petsc-3.3-p1-gnu-mpich2/config/PETSc/packages/mpi4py.py:49)
TESTING: alternateConfigureLibrary from PETSc.packages.petsc4py(/sw/petsc/petsc-3.3-p1-gnu-mpich2/config/PETSc/packages/petsc4py.py:65)
Compilers:
  C Compiler:       /sw/mpi/mpich2/2.1.4.1p1-gnu/bin/mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -g3 -fno-inline -O0
  C++ Compiler:     /sw/mpi/mpich2/2.1.4.1p1-gnu/bin/mpicxx -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -g
  Fortran Compiler: /sw/mpi/mpich2/2.1.4.1p1-gnu/bin/mpif90 -Wall -Wno-unused-variable -g
Linkers:
  Static linker:  /usr/bin/ar cr
  Dynamic linker: /usr/bin/ar
MPI:
  Includes: -I/sw/mpi/mpich2/2.1.4.1p1-gnu/include
X:
  Includes:
  Library: -lX11
cmake:
pthread:
  Library: -lpthread
Arch:
BLAS/LAPACK: -Wl,-rpath,/sw/ATLAS/3.9.39 -L/sw/ATLAS/3.9.39 -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread
PETSc:
  PETSC_ARCH: arch-linux2-cxx-debug
  PETSC_DIR: /sw/petsc/petsc-3.3-p1-gnu-mpich2
  Clanguage: Cxx
  shared libraries: disabled
  dynamic loading: disabled
  Scalar type: real
  Precision: double
  Memory alignment: 16
xxx=========================================================================xxx
Configure stage complete. Now build PETSc libraries with (legacy build):
  make PETSC_DIR=/sw/petsc/petsc-3.3-p1-gnu-mpich2 PETSC_ARCH=arch-linux2-cxx-debug all
or (experimental with python):
  PETSC_DIR=/sw/petsc/petsc-3.3-p1-gnu-mpich2 PETSC_ARCH=arch-linux2-cxx-debug ./config/builder.py
xxx=========================================================================xxx
Usage: 3.3-p1-gnu-mpich2
module load petsc/3.3-p1-gnu-mpich2
module load intel-cc-11/11.1.072
module load intel-cmkl-11/11.1.072
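Because this build was configured with --with-clanguage=cxx (see the configure summary above), user codes are compiled with the C++ wrapper. The lines below are a minimal sketch under the same caveats as the 3.2 examples: my_app.cpp is a placeholder for your source, PETSC_DIR and PETSC_ARCH are assumed to be set to /sw/petsc/petsc-3.3-p1-gnu-mpich2 and arch-linux2-cxx-debug, and since shared libraries are disabled the full link line reported by configure (MKL BLAS/LAPACK, X11, pthread) may also be needed; PETSc's rules under $PETSC_DIR/conf handle this automatically.
module load petsc/3.3-p1-gnu-mpich2
module load intel-cc-11/11.1.072
module load intel-cmkl-11/11.1.072
# module load mpi/mpich2/2.1.4.1p1-gnu   # if the MPICH2 wrappers are not already on your PATH
mpicxx my_app.cpp -o my_app \
    -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include \
    -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc
mpiexec -np 4 ./my_app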