[MITgcm-support] Parameters to parallelise a coupled run?

Alexandre Pohl Alexandre.Pohl at lsce.ipsl.fr
Thu Dec 12 04:53:35 EST 2013


Hi all,

I asked you a few weeks ago about a way to merge netcdf output files. I am still learning to use the MITgcm, which I hope to apply soon to paleoclimate modelling (the Ordovician glaciation).

But I am now having problems parallelising the code. I use the cubed-sphere exch2 grid. I started from the "cpl_aim+ocn" verification case. It runs well with 6 faces, 6 tiles and 3 processes (1 for the ocean, 1 for the atmosphere and 1 for the coupler).
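For reference, the SIZE.h of that baseline case keeps all 6 faces on a single process per component; if I read it correctly, the key lines are roughly (please check against the verification directory, I may be misremembering the exact values):

     &           sNx =  32,
     &           sNy =  32,
     &           nSx =   6,
     &           nSy =   1,
     &           nPx =   1,
     &           nPy =   1,

i.e. six 32x32 tiles, one per cube face, all handled by one process per component.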

To parallelise the code, I then divided each face of the cube into 8 tiles using the SIZE.h file:

     &           sNx =  16,
     &           sNy =   8,
     &           OLx =   2,
     &           OLy =   2,
     &           nSx =   1,
     &           nSy =   1,
     &           nPx =  48,
     &           nPy =   1,
     &           Nx  = sNx*nSx*nPx,        ! = 16*1*48 = 768 points in X
     &           Ny  = sNy*nSy*nPy,        ! =  8*1*1  =   8 points in Y
     &           Nr  =  15)                ! Nr = 5 for the atmosphere
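Spelled out, the decomposition arithmetic behind these values is:

     6 faces * (32/16 * 32/8) tiles/face = 6 * 8 = 48 tiles of 16x8
     48 tiles * 1 tile per process (nSx = nSy = 1) = 48 processes per component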

I used the "adjustment.cs-32x32x1" verification files to write my own, except that I gave 1 process to each tile ("nSx=1" above), which makes 6*8 = 48 processes ("nPx=48") instead of a single one.

I added "usingMPI=.TRUE.," to my eedata files and set "nTx=1".
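For reference, the relevant part of my eedata then reads (a minimal sketch, assuming the standard EEPARMS namelist; nTy stays at its default of 1):

     &EEPARMS
      usingMPI=.TRUE.,
      nTx=1,
      nTy=1,
     &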

I am now wondering how many processes I should use for the ocean and for the atmosphere. I use the "run_cpl_test" script to prepare, compile and execute the model. I tried "Npr=97; NpOc=48;", i.e. 48 processes for the ocean, 48 for the atmosphere and a single process for the coupler, but unfortunately the model does not run faster with these 97 processes than with only 3.
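Concretely, the variables I changed in run_cpl_test are (my understanding of the script: Npr is the total process count, NpOc the ocean share, so the atmosphere gets Npr - NpOc - 1 and the coupler the remaining 1):

     Npr=97     # total MPI processes
     NpOc=48    # processes for the ocean component
                # => atmosphere: 97 - 48 - 1 = 48, coupler: 1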

Could you tell me if my parameters are wrong?
Have I forgotten something?

I would be very interested if somebody could describe a set of parameters that allows the model to be parallelised efficiently.

Thank you for your help,
Alexandre


