Using ESMPy on Discover
other/comp/gcc-5.3-sp3
other/SSSO_Ana-PyD/SApd_4.1.1_py2.7_gcc-5.3-sp3
<!--
And you point to the Baselibs in:
/discover/swdev/mathomp4/Baselibs/ESMA-Baselibs-5.0.4-beta25/x86_64-unknown-linux-gnu/ifort_17.0.0.098-intelmpi_17.0.0.098
-->
=== Extra Modules for mpi4py and ESMPy ===
$ module use -a /home/mathomp4/modulefiles
$ module load python/mpi4py/2.0.0/ifort_17.0.0.098-intelmpi_17.0.0.098 python/ESMPy/7.1.0b25/ifort_17.0.0.098-intelmpi_17.0.0.098
This should set up pretty much everything. A simple test to make sure it works is to run:
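As an additional hedged check (the import names here are assumptions based on the modules loaded above — ESMPy is conventionally imported as <code>ESMF</code> in the 7.x series), you can ask Python whether the packages are visible at all before running any of the examples:

```python
# Sanity check: did the environment modules put the packages on PYTHONPATH?
# "ESMF" is the import name ESMPy 7.x uses; "mpi4py" comes from the mpi4py module.
import importlib.util

for name in ("ESMF", "mpi4py"):
    found = importlib.util.find_spec(name) is not None
    print(name, "is importable" if found else "is NOT importable")
```

If either package prints as not importable, re-check the <code>module load</code> lines above before going further.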
To run the examples that ESMF provides, copy them from:
$ cp -r /discover/swdev/mathomp4/Baselibs/ESMA-Baselibs-5.0.4-beta25/src/esmf/src/addon/ESMPy/examples .
=== Download Test NC4 Files ===
srun.slurm: cluster configuration lacks support for cpu binding
ESMPy Ungridded Field Dimensions Example
interpolation mean relative error = 0.000768815903364
mass conservation relative error = 1.49257625157e-16
$ mpirun -np 6 python examples/grid_mesh_regrid.py
ESMPy Grid Mesh Regridding Example
interpolation mean relative error = 0.00235869859211
$ mpirun -np 6 python ./examples/cubed_sphere_to_mesh_regrid.py
srun.slurm: cluster configuration lacks support for cpu binding
ESMPy cubed sphere Grid Mesh Regridding Example
interpolation mean relative error = 4.39699714748e-06
interpolation max relative (pointwise) error = 9.70527691296e-05
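For readers wondering what the error lines above measure, here is an illustrative sketch (not ESMPy's actual code, and with made-up field values) of the usual mean/max relative-error metric: elementwise <code>|interpolated - analytic| / |analytic|</code>, then reduced over all points:

```python
import numpy as np

# Illustrative only: compare a "regridded" field against an "analytic" one.
# Both arrays are invented for this sketch.
analytic = np.array([1.0, 2.0, 4.0])      # exact field values
interp   = np.array([1.0, 2.001, 3.999])  # values after regridding

rel_err = np.abs(interp - analytic) / np.abs(analytic)
print("mean relative error =", rel_err.mean())
print("max relative (pointwise) error =", rel_err.max())
```

A small mean error with a larger max error (as in the cubed-sphere output above) just means most points regrid well while a few individual points are worse.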
=== Launching MPI ===
ESMPy supports a couple of ways to launch MPI, as noted in [http://www.earthsystemmodeling.org/esmf_releases/last_built/esmpy_doc/html/api.html#parallel-execution the API documentation]. So far, only the mpirun method works. I cannot figure out how to run the mpi_spawn_regrid.py example. If anyone does know, please inform me and edit this section, but it might just be "You can't on a cluster" or "You can't with Intel MPI".