[Mitgcm-support] pre38 mpi on SGI

mitgcm-support at dev.mitgcm.org mitgcm-support at dev.mitgcm.org
Wed Jul 9 15:27:54 EDT 2003


Daniel etc....,

There seems to be something odd with pre38
on SGI with MPI.

The problem happens for np > 1 and was
tested with aim_LatLon and hs94.128....
c38, c37, and c35 all seem OK.

A memory overwrite of a buoyancyRelation value occurs after several hundred
exchanges etc.... In my test it shows up after four time steps
and happens in the cg2d_r exchange at the bottom of the
conjugate-gradient iterative loop. The specific
block looks to be the MPI_Isend in send_put_y...

Not sure why it's happening, but it could be connected to the RX -> RL,RS
mapping stuff. As far as I can tell that code looks OK.

We're using native MPI on SGI, e.g. f77 -lmpi (not mpich
mpif77 etc...).

I'm busy until 4pm - more later. If anyone has any thoughts, please send e-mail.
Daniel - could you try a short run on bonanza?
cvs co -r pre38 -d p38 models/MITgcmUV etc....
hs94.128x ... is probably easier than aim....

Chris


