Using ESMPy on Discover

  srun.slurm: cluster configuration lacks support for cpu binding
  ESMPy Ungridded Field Dimensions Example
   interpolation mean relative error = 0.000768815903364
   mass conservation relative error  = 1.49257625157e-16


  $ mpirun -np 6 python examples/grid_mesh_regrid.py
  ESMPy Grid Mesh Regridding Example
   interpolation mean relative error = 0.00235869859211
  $ mpirun -np 6 python ./examples/cubed_sphere_to_mesh_regrid.py
  srun.slurm: cluster configuration lacks support for cpu binding
  ESMPy cubed sphere Grid Mesh Regridding Example
   interpolation mean relative error = 0.00302911738799
   interpolation max relative (pointwise) error = 0.0101182527126
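For longer runs it may be more convenient to submit the examples as a batch job instead of launching mpirun interactively. Below is a minimal sketch of a SLURM batch script; the job name, walltime, and account are placeholders, and the environment setup is deliberately omitted — adjust all of these for your own allocation and for however you load ESMPy on Discover.

```shell
#!/bin/bash
# Placeholder SLURM directives: 6 tasks to match the -np 6 used above;
# set the walltime and account for your own allocation.
#SBATCH --job-name=esmpy-examples
#SBATCH --ntasks=6
#SBATCH --time=00:10:00
#SBATCH --account=<your_account>

# Environment setup omitted: load whatever modules or virtualenv
# provide ESMPy for you on Discover.

mpirun -np 6 python examples/grid_mesh_regrid.py
mpirun -np 6 python ./examples/cubed_sphere_to_mesh_regrid.py
```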


=== Launching MPI ===
ESMPy documents a couple of ways to launch MPI, as noted in [http://www.earthsystemmodeling.org/esmf_releases/last_built/esmpy_doc/html/api.html#parallel-execution the API documentation]. So far, only the mpirun method works. I cannot figure out how to run the mpi_spawn_regrid.py example. If anyone does know, please inform me and edit this section, but the answer might just be "You can't on a cluster" or "You can't with Intel MPI".
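For reference, the two launch styles from that API documentation look roughly like this on the command line (the script path for the spawn example is an assumption based on the examples directory layout); only the first works for us on Discover:

```shell
# Style 1: mpirun-style launch of an SPMD script -- this is the
# style that works on Discover:
mpirun -np 6 python examples/grid_mesh_regrid.py

# Style 2: a single python process that spawns its workers internally
# via MPI_Comm_spawn -- this is the style we could not get working
# (possibly unsupported on a cluster, or with Intel MPI):
python examples/mpi_spawn_regrid.py
```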