[MITgcm-support] Error while using both MPI and MNC packages

Sachiko Mohanty mohantysachiko18 at gmail.com
Thu May 8 03:24:34 EDT 2014


Hello MITgcm users,

I am trying to use the multi-processor (MPI) and NetCDF (MNC) packages at
the same time.
In my "code" directory the configuration is:
* "SIZE.h"*
     &           sNx =  65,
     &           sNy =  45,
     &           OLx =   2,
     &           OLy =   2,
     &           nSx =   1,
     &           nSy =   1,
     &           nPx =   6,
     &           nPy =   4,
     &           Nx  = sNx*nSx*nPx,
     &           Ny  = sNy*nSy*nPy,
     &           Nr  =  23)

*"packages.conf"*

oceanic
gfd
exf
timeave
#mdsio
mnc
obcs

In the "input" folder, the configuration is:
*"data.pkg"*
# Packages
 &PACKAGES
 useOBCS=.TRUE.,
 useEXF=.TRUE.,
 useMNC=.TRUE.,
 &

* "eedata"*
 # Example "eedata" file  for multi-threaded test
#   (copy "eedata.mth" to "eedata" to use it)
# Lines beginning "#" are comments
# nTx - No. threads per process in X
# nTy - No. threads per process in Y
 &EEPARMS
 nTx=1,
 nTy=1,
 useMPI=.TRUE.,
 &

*"data.mnc"*

# Example "data.mnc" file
# Lines beginning "#" are comments
 &MNC_01
# mnc_echo_gvtypes=.FALSE.,
# mnc_use_indir=.FALSE.,
 mnc_use_outdir=.TRUE.,
 mnc_outdir_str='mnc_test_',
#mnc_outdir_date=.FALSE.,
#monitor_mnc=.FALSE.,
#timeave_mnc=.FALSE.,
#snapshot_mnc=.FALSE.,
#pickup_read_mnc=.FALSE.,
#pickup_write_mnc=.FALSE.,
 &


I use 24 processors. This is the message I get while running the
model:

[dell at localhost input]$ mpirun -np 24 ./mitgcmuv > test
EEBOOT_MINIMAL: No. of procs=     1 not equal to nPx*nPy=    24
EEBOOT_MINIMAL: No. of procs=     1 not equal to nPx*nPy=    24
EEDIE: earlier error in multi-proc/thread setting
PROGRAM MAIN: ends with fatal Error
STOP ABNORMAL END: PROGRAM MAIN
(this error is repeated 24 times).
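For context on what I have tried: the message says each process sees only 1 processor, which (as I understand from the documentation) usually means the executable was built without MPI support, so mpirun launches 24 independent single-process copies. A possible rebuild sequence, with an illustrative optfile path that would need to match the local compiler/MPI setup, is:

```shell
# Rebuild mitgcmuv with MPI enabled. The optfile name below is an
# example; substitute the one matching your compiler and MPI library.
cd build
rm -f Makefile
../tools/genmake2 -mods=../code -mpi \
    -of=../tools/build_options/linux_amd64_gfortran
make depend
make
```

Without the -mpi flag, ALLOW_USE_MPI is not defined at compile time and EEBOOT_MINIMAL reports "No. of procs= 1". Is this the right way to enable MPI together with MNC, or is something else needed?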

Please provide some suggestions/solutions.

Thanking all in advance

Regards,

Sachiko Mohanty
Centre for Atmospheric Sc.
IIT Delhi, India
