Building CVS Baselibs
This page details the process of building Baselibs from CVS. For the purposes of this page, GMAO-Baselibs-4_0_1 is used as an example tag, but the process should work with any "recent" tag (say, GMAO-Baselibs-3_3_0 or higher).
Check out Baselibs
The first step is to check out the tag you want.
Set up the 'bcvs' alias
Something useful to set up first is an alias you can use to refer to the CVS repository for Baselibs. This is because Baselibs is hosted on progress, while the GEOS-5 model and most other tools are on the CVSACL repo. So, I recommend setting up a 'bcvs' alias:
$ alias bcvs 'cvs -d:ext:USERNAME@progressdirect:/cvsroot/baselibs'
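Note that this is csh/tcsh alias syntax; if your shell is bash, the equivalent is:
$ alias bcvs='cvs -d:ext:USERNAME@progressdirect:/cvsroot/baselibs'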
where USERNAME is your username on progress. Note that this works on discover and pleiades; elsewhere you'll need to set up a tunnel to progress and alter the CVSROOT appropriately.
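As a sketch of what such a tunnel might look like (the gateway host name below is a placeholder for whatever login host your site provides; this is not a documented recipe for progress), you can add a ProxyCommand stanza to your ~/.ssh/config so that ssh, and therefore CVS's :ext: transport, reaches progress through the gateway:

Host progressdirect
    User USERNAME
    # Hypothetical gateway host; replace with your site's actual bastion
    ProxyCommand ssh USERNAME@gateway.example.gov -W %h:%p

With a stanza like that in place, the CVSROOT (and thus the bcvs alias) may not need to change at all.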
Checking out the tag
Next, check out the tag:
$ bcvs co -r GMAO-Baselibs-4_0_1 -d GMAO-Baselibs-4_0_1 Baselibs
This will create a GMAO-Baselibs-4_0_1 directory, inside which is a src directory.
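As a quick sanity check (assuming the checkout succeeded), you can list the new src tree; you should see subdirectories for the individual libraries (ESMF, netCDF, HDF5, SDPToolkit, and so on):
$ ls GMAO-Baselibs-4_0_1/src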
Build Baselibs
The next task is to build Baselibs. In order to build it correctly, two arguments are needed: ESMF_COMM and CONFIG_SETUP. ESMF_COMM is the MPI stack used by ESMF (usually mvapich2, mpi, openmpi, or intelmpi). CONFIG_SETUP is an "identifier" that allows you to build multiple versions of Baselibs for multiple compiler/MPI combinations. The recommended style for, say, Intel 13.0.1.117 and MVAPICH2 1.9a2 is CONFIG_SETUP=ifort_13.0.1.117-mvapich2_1.9a2, where you identify the compiler (by its name on the command line), its version, the MPI stack, and its version.
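For example, following the same convention, a hypothetical GCC 4.6.3 / Open MPI 1.6.0 build (versions here are just illustrative) would be identified as:
CONFIG_SETUP=gfortran_4.6.3-openmpi_1.6.0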
So for the above example you'd issue:
$ make install ESMF_COMM=mvapich2 CONFIG_SETUP=ifort_13.0.1.117-mvapich2_1.9a2 |& tee makeinstall.ifort_13.0.1.117-mvapich2_1.9a2.log
and it would build all the libraries. The tee is there so that you can capture the install log and also watch it in real time. Note that the modules loaded for the above (say, on discover) are:
1) comp/intel-13.0.1.117
2) other/mpi/mvapich2-1.9a2/intel-13.0.1.117
3) other/comp/gcc-4.6.3-sp1
4) other/SIVO-PyD/spd_1.6.0_gcc-4.6.3-sp1
or run:
$ module purge
$ module load comp/intel-13.0.1.117 other/mpi/mvapich2-1.9a2/intel-13.0.1.117 other/comp/gcc-4.6.3-sp1 other/SIVO-PyD/spd_1.6.0_gcc-4.6.3-sp1
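Before running make, it can be worth a quick check that the environment is what you intend (here mpicc and ifort are the MPI wrapper and compiler for this particular combination):
$ module list
$ which mpicc ifort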
Once built, check for "Error" in your log file:
$ grep Error makeinstall.ifort_13.0.1.117-mvapich2_1.9a2.log
	 (test $HDF5_Make_Ignore && echo "*** Error ignored") || \
	 (test $HDF5_Make_Ignore && echo "*** Error ignored") || \
        H5Eset_auto2(H5E_DEFAULT, PrintErrorStackFunc, PrintErrorStackData);
make[6]: [install-exec-hook] Error 1 (ignored)
make[6]: [install-data-hook] Error 1 (ignored)
            Whimper ( "Error creating scratch file" ) ;
/usr/local/other/SLES11.1/mvapich2/1.9a2/intel-13.0.1.117/bin/mpicc -c -O2 -DH5_USE_16_API -DLINUX64 -Df2cFortran -DHDF4_NETCDF_HAVE_SD -DLINUX64 -DPGS_MET_COMPILE -I/discover/swdev/mathomp4/Baselibs/GMAO-Baselibs-4_0_1/src/SDPToolkit/include -I/discover/swdev/mathomp4/Baselibs/GMAO-Baselibs-4_0_1/src/SDPToolkit/include/CUC -I/discover/swdev/mathomp4/Baselibs/GMAO-Baselibs-4_0_1/x86_64-unknown-linux-gnu/ifort_13.0.1.117-mvapich2_1.9a2/Linux/include/hdf -I/discover/swdev/mathomp4/Baselibs/GMAO-Baselibs-4_0_1/x86_64-unknown-linux-gnu/ifort_13.0.1.117-mvapich2_1.9a2/Linux/include/hdf5 PGS_MET_ErrorMsg.c -o /discover/swdev/mathomp4/Baselibs/GMAO-Baselibs-4_0_1/src/SDPToolkit/obj/linux64/MET/PGS_MET_ErrorMsg.o
cp PGS_MET_ErrorMsg.o tmp/METErrorMsg.o
If you don't see any "Error 2" messages, you are probably safe.
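For a narrower scan that drops the matches make itself marks as ignorable, something like the following works (a sketch; adjust the patterns to your log):
$ grep -E "Error [0-9]+" makeinstall.ifort_13.0.1.117-mvapich2_1.9a2.log | grep -v "(ignored)"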
Checking Baselibs
This step is optional, but recommended. Many of the Baselibs have the ability to run a self-check. You should do this only in an environment where you can run MPI (like the compute nodes at NCCS or NAS), because parallel NetCDF and ESMF tests are run:
$ make check ESMF_COMM=mvapich2 CONFIG_SETUP=ifort_13.0.1.117-mvapich2_1.9a2 |& tee makecheck.ifort_13.0.1.117-mvapich2_1.9a2.log
Note that at the moment, many of the checks will exit with errors. For example, NetCDF has tests that require internet access.
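To get a rough summary of which checks failed, you can filter the check log (the various test suites differ in how they report failures, so treat this as a first pass):
$ grep -iE "fail|error" makecheck.ifort_13.0.1.117-mvapich2_1.9a2.log | less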
Modules
Discover
Intel 11
comp/intel-11.0.083 mpi/impi-3.2.2.006 lib/mkl-10.0.3.020 other/SIVO-PyD/spd_1.6.0_gcc-4.3.4-sp1
Intel 13
comp/intel-13.0.1.117 other/mpi/mvapich2-1.9a2/intel-13.0.1.117 other/comp/gcc-4.6.3-sp1 other/SIVO-PyD/spd_1.6.0_gcc-4.6.3-sp1
comp/intel-13.0.1.117 mpi/impi-4.0.3.008 other/comp/gcc-4.6.3-sp1 other/SIVO-PyD/spd_1.6.0_gcc-4.6.3-sp1
comp/intel-13.0.1.117 mpi/impi-4.0.1.007-beta other/comp/gcc-4.6.3-sp1 other/SIVO-PyD/spd_1.6.0_gcc-4.6.3-sp1
PGI 12
comp/pgi-12.8.0 other/mpi/openmpi/1.6.0-pgi-12.8.0 other/comp/gcc-4.6.3-sp1 other/SIVO-PyD/spd_1.6.0_gcc-4.6.3-sp1
comp/pgi-12.8.0 other/mpi/mvapich2-1.8/pgi-12.8.0 other/comp/gcc-4.6.3-sp1 other/SIVO-PyD/spd_1.6.0_gcc-4.6.3-sp1
Pleiades
Intel 11
comp/intel/11.0.083_64 mpi-intel/3.2.011 math/intel_mkl_64_10.0.011 python/2.6.1
comp/intel/11.0.083_64 mpi-sgi/mpt.2.06a67 math/intel_mkl_64_10.0.011 python/2.6.1
Intel 12
comp-intel/2012.0.032 mpi-sgi/mpt.2.06a67 python/2.6.1
PGI 12
pgi_12.8 mvapich2_1.8.1_pgi_12.8 python/2.6.1
pgi_12.8 mpi-sgi/mpt.2.06a67 python/2.6.1
Table of Modules
Discover

| Compiler | MPI Stack | Other Libraries |
|---|---|---|
| comp/intel-11.0.083 | mpi/impi-3.2.2.006 | lib/mkl-10.0.3.020 other/SIVO-PyD/spd_1.6.0_gcc-4.3.4-sp1 |
| comp/intel-13.0.1.117 | other/mpi/mvapich2-1.9a2/intel-13.0.1.117 | other/comp/gcc-4.6.3-sp1 other/SIVO-PyD/spd_1.6.0_gcc-4.6.3-sp1 |
| comp/intel-13.0.1.117 | mpi/impi-4.0.3.008 | other/comp/gcc-4.6.3-sp1 other/SIVO-PyD/spd_1.6.0_gcc-4.6.3-sp1 |
| comp/intel-13.0.1.117 | mpi/impi-4.0.1.007-beta | other/comp/gcc-4.6.3-sp1 other/SIVO-PyD/spd_1.6.0_gcc-4.6.3-sp1 |
| comp/pgi-12.8.0 | other/mpi/openmpi/1.6.0-pgi-12.8.0 | other/comp/gcc-4.6.3-sp1 other/SIVO-PyD/spd_1.6.0_gcc-4.6.3-sp1 |
| comp/pgi-12.8.0 | other/mpi/mvapich2-1.8/pgi-12.8.0 | other/comp/gcc-4.6.3-sp1 other/SIVO-PyD/spd_1.6.0_gcc-4.6.3-sp1 |

Pleiades

| Compiler | MPI Stack | Other Libraries |
|---|---|---|
| comp/intel/11.0.083_64 | mpi-intel/3.2.011 | math/intel_mkl_64_10.0.011 python/2.6.1 |
| comp/intel/11.0.083_64 | mpi-sgi/mpt.2.06a67 | math/intel_mkl_64_10.0.011 python/2.6.1 |
| comp-intel/2012.0.032 | mpi-sgi/mpt.2.06a67 | python/2.6.1 |
| pgi_12.8 | mvapich2_1.8.1_pgi_12.8 | python/2.6.1 |
| pgi_12.8 | mpi-sgi/mpt.2.06a67 | python/2.6.1 |