[MITgcm-support] zonal or meridional means on multiple processors

Constantinos Evangelinos ce107 at ocean.mit.edu
Fri Jan 18 17:04:25 EST 2008


On Thu 17 Jan 2008, Samar Khatiwala wrote:
> Hi Martin
>
> I don't have an implementation in the MITgcm context.
>
> The link JM just sent does what you want but (I think) requires
> multiple MPI calls. The implementation I had in
> mind requires a single call. But the former will save you (or at
> least postpone) the pain of learning MPI.
>
> Samar
>
> On Jan 17, 2008, at 10:53 AM, Martin Losch wrote:
> > Hi Samar,
> > thanks, but it does mean that I'll have to use MPI - routines
> > directly, something I usually don't do (I know it's embarrassing,
> > that I don't even know the basic). Do you have by any chance the
> > code snippet with MPI_Gather in the MITgcm context?

Actually the cleanest way is to define subcommunicators of MPI_COMM_MODEL, 
which is a Cartesian communicator, using


call MPI_CART_SUB(MPI_COMM_MODEL, REMAIN_DIMS, MPI_COMM_MODEL_ZON, IERROR)

or 

call MPI_CART_SUB(MPI_COMM_MODEL, REMAIN_DIMS, MPI_COMM_MODEL_MER, IERROR)

to get zonal and meridional communicators for groups of processors.

REMAIN_DIMS would be a LOGICAL vector with .TRUE. for the dimension that one 
wants to keep and .FALSE. for the dimension that one wants to partition.

Then one uses the same code as in global_vec_sum, with the allreduce call made 
over the corresponding subcommunicator rather than the full model communicator.
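
Putting the two steps together, a minimal Fortran sketch might look like the 
following. This assumes the first Cartesian dimension of MPI_COMM_MODEL is x 
and the second is y; localSum and zonalSum are illustrative variable names, 
not actual MITgcm variables, and error handling is omitted:

      LOGICAL REMAIN_DIMS(2)
      INTEGER MPI_COMM_MODEL_ZON, IERROR
      DOUBLE PRECISION localSum, zonalSum

C     Keep the x-dimension, partition the y-dimension, so that each
C     row of processes shares one subcommunicator (assumes dimension
C     ordering (x,y) in the Cartesian communicator).
      REMAIN_DIMS(1) = .TRUE.
      REMAIN_DIMS(2) = .FALSE.
      call MPI_CART_SUB(MPI_COMM_MODEL, REMAIN_DIMS,
     &                  MPI_COMM_MODEL_ZON, IERROR)

C     Sum each process's local contribution over the row, as in
C     global_vec_sum, but restricted to the zonal subcommunicator.
      call MPI_ALLREDUCE(localSum, zonalSum, 1, MPI_DOUBLE_PRECISION,
     &                   MPI_SUM, MPI_COMM_MODEL_ZON, IERROR)

Dividing zonalSum by the number of points along the row then gives the zonal 
mean; swapping the REMAIN_DIMS entries gives the meridional case.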

Constantinos
-- 
Dr. Constantinos Evangelinos
Department of Earth, Atmospheric and Planetary Sciences
Massachusetts Institute of Technology
