[MITgcm-support] MITgcm-support Digest, Vol 131, Issue 15

Sachiko Mohanty mohantysachiko18 at gmail.com
Fri May 9 03:34:36 EDT 2014


@ Miroslaw:
I have included "eedata" in the "input" folder (I had also included it
previously).

# Example "eedata" file
# Lines beginning "#" are comments
# nTx - No. threads per process in X
# nTy - No. threads per process in Y
 &EEPARMS
 nTx=1,
 nTy=1,
 usingMPI =.TRUE.,
 /
# Note: Some systems use & as the
# namelist terminator. Other systems
# use a / character (as shown here).

The same error persists, as I stated in the previous mail:

[dell at localhost input]$ mpirun -np 24 ./mitgcmuv > test2

STOP ABNORMAL END: PROGRAM MAIN    (24 times)


My "STDERR.0000" lists the following errors:

(PID.TID 0000.0001) *** ERROR *** EESET_PARMS: Error reading parameter file
"eedata"

(PID.TID 0000.0001) *** ERROR *** EEDIE: earlier error in multi-proc/thread
setting
(PID.TID 0000.0001) *** ERROR *** PROGRAM MAIN: ends with fatal Error
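(In case it helps anyone reproducing this: a self-contained sketch that writes the suggested "eedata" into a scratch directory and checks it for DOS line endings, which can also make a Fortran namelist read fail. The scratch path is hypothetical; adjust to your own run directory.)

```shell
# Write the suggested eedata into a scratch directory and verify it has
# Unix line endings (CRLF endings from a Windows editor are a common
# cause of namelist read failures).
dir=$(mktemp -d)
cat > "$dir/eedata" <<'EOF'
 &EEPARMS
 nTx=1,
 nTy=1,
 usingMPI=.TRUE.,
 /
EOF
# Fail loudly if any carriage returns are present.
if grep -q $'\r' "$dir/eedata"; then
  echo "eedata has DOS line endings"
else
  echo "eedata looks clean"
fi
```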


And, as expected, if I remove "usingMPI=.TRUE.," from here, the model
runs on a single processor and the "mnc*" files are created.
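For what it's worth, the -np argument to mpirun must match the decomposition in SIZE.h (nPx*nPy); a minimal sketch of that arithmetic, using the values from the SIZE.h quoted in the mail below:

```python
# Check that mpirun -np matches the SIZE.h decomposition.
# Values taken from the SIZE.h quoted below: sNx=65, sNy=45,
# nSx=nSy=1, nPx=6, nPy=4.
sNx, sNy = 65, 45     # tile size per process in X and Y
nSx, nSy = 1, 1       # tiles per process
nPx, nPy = 6, 4       # processes in X and Y

n_procs = nPx * nPy   # must equal the -np argument to mpirun
Nx = sNx * nSx * nPx  # global grid points in X
Ny = sNy * nSy * nPy  # global grid points in Y
print(n_procs, Nx, Ny)
```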



On Thu, May 8, 2014 at 9:30 PM, <mitgcm-support-request at mitgcm.org> wrote:

> Send MITgcm-support mailing list submissions to
>         mitgcm-support at mitgcm.org
>
> To subscribe or unsubscribe via the World Wide Web, visit
>         http://mitgcm.org/mailman/listinfo/mitgcm-support
> or, via email, send a message with subject or body 'help' to
>         mitgcm-support-request at mitgcm.org
>
> You can reach the person managing the list at
>         mitgcm-support-owner at mitgcm.org
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of MITgcm-support digest..."
>
>
> Today's Topics:
>
>    1. Re: Error while using both MPI and MNC packages
>       (Miroslaw Andrejczuk)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Thu, 8 May 2014 13:29:22 +0000
> From: Miroslaw Andrejczuk <Andrejczuk at atm.ox.ac.uk>
> To: "mitgcm-support at mitgcm.org" <mitgcm-support at mitgcm.org>
> Subject: Re: [MITgcm-support] Error while using both MPI and MNC
>         packages
> Message-ID:
>         <
> 7C25BD49CACF3B48AA364C2BEB2D2DB850081DEE at EXCHNG10.physics.ox.ac.uk>
> Content-Type: text/plain; charset="iso-8859-1"
>
> You need eedata file (in the directory where you run the model) containing:
>
> # Example "eedata" file
> # Lines beginning "#" are comments
> # nTx - No. threads per process in X
> # nTy - No. threads per process in Y
>  &EEPARMS
>  nTx=1,
>  nTy=1,
>  usingMPI =.TRUE.,
>  /
> # Note: Some systems use & as the
> # namelist terminator. Other systems
> # use a / character (as shown here).
>
> Mirek
>
> ________________________________
> From: Sachiko Mohanty [mohantysachiko18 at gmail.com]
> Sent: Thursday, May 08, 2014 2:09 PM
> To: MITgcm-support at mitgcm.org
> Subject: Re: [MITgcm-support] Error while using both MPI and MNC packages
>
> @ Martin: I have now recompiled the code with MPI as you suggested, using
>
> "genmake2 -mpi [more options]".
>
>
>
>
> The next hurdle I face is:
>
> (I have kept the same configuration as given in the previous mail)
>
> [dell at localhost input]$ mpirun -np 24 ./mitgcmuv > test2
>
> STOP ABNORMAL END: PROGRAM MAIN    (24 times)
>
>
> My "STDERR.0000" lists the following errors:
>
> (PID.TID 0000.0001) *** ERROR *** EESET_PARMS: Error reading parameter
> file "eedata"
>
> (PID.TID 0000.0001) *** ERROR *** EEDIE: earlier error in
> multi-proc/thread setting
> (PID.TID 0000.0001) *** ERROR *** PROGRAM MAIN: ends with fatal Error
>
>
> and my "STDOUT.0000" ends at
>
> (PID.TID 0000.0001) //
> =======================================================
> (PID.TID 0000.0001) // Execution Environment parameter file "eedata"
> (PID.TID 0000.0001) //
> =======================================================
>
> (PID.TID 0000.0001) ># Example "eedata" file  for multi-threaded test
> (PID.TID 0000.0001) >#   (copy "eedata.mth" to "eedata" to use it)
> (PID.TID 0000.0001) ># Lines beginning "#" are comments
>
> (PID.TID 0000.0001) ># nTx - No. threads per process in X
> (PID.TID 0000.0001) ># nTy - No. threads per process in Y
> (PID.TID 0000.0001) > &EEPARMS
> (PID.TID 0000.0001) > nTx=1,
> (PID.TID 0000.0001) > nTy=1,
>
> (PID.TID 0000.0001) > useMPI=.TRUE.,
> (PID.TID 0000.0001) > /
> (PID.TID 0000.0001) ># Note: Some systems use & as the namelist terminator
> (as shown here).
> (PID.TID 0000.0001) >#       Other systems use a / character.
>
> (PID.TID 0000.0001)
> (PID.TID 0000.0001) // Shown below is an example "eedata" file.
> (PID.TID 0000.0001) // To use this example copy and paste the ">" lines.
> (PID.TID 0000.0001) // Then remove the text up to and including the ">".
>
> (PID.TID 0000.0001) ># Example "eedata" file
> (PID.TID 0000.0001) ># Lines beginning "#" are comments
> (PID.TID 0000.0001) ># nTx - No. threads per process in X
> (PID.TID 0000.0001) ># nTy - No. threads per process in Y
>
> (PID.TID 0000.0001) >&EEPARMS
> (PID.TID 0000.0001) > nTx=1,
> (PID.TID 0000.0001) > nTy=1,
> (PID.TID 0000.0001) >&
> (PID.TID 0000.0001) ># Note: Some systems use & as the namelist terminator
> (as shown here).
>
> (PID.TID 0000.0001) >#       Other systems use a / character.
> (PID.TID 0000.0001)
> PROGRAM MAIN: ends with fatal Error
>
>
>
> Please help me out.
>
>
> On Thu, May 8, 2014 at 12:54 PM, Sachiko Mohanty <
> mohantysachiko18 at gmail.com<mailto:mohantysachiko18 at gmail.com>> wrote:
> Hello MITgcm users,
>
> I am trying to use both the multi-processor (MPI) and NetCDF (MNC) packages
> at the same time.
> In the code, my configuration is:
>  "SIZE.h"
>      &           sNx =  65,
>      &           sNy =  45,
>      &           OLx =   2,
>      &           OLy =   2,
>      &           nSx =   1,
>      &           nSy =   1,
>      &           nPx =   6,
>      &           nPy =   4,
>      &           Nx  = sNx*nSx*nPx,
>      &           Ny  = sNy*nSy*nPy,
>      &           Nr  =  23)
>
> "packages.conf"
>
> oceanic
> gfd
> exf
> timeave
> #mdsio
> mnc
> obcs
>
> In the "input" folder, the configuration is:
> "data.pkg"
> # Packages
>  &PACKAGES
>  useOBCS=.TRUE.,
>  useEXF=.TRUE.,
>  useMNC=.TRUE.,
>  &
>
>  "eedata"
>  # Example "eedata" file  for multi-threaded test
> #   (copy "eedata.mth" to "eedata" to use it)
> # Lines beginning "#" are comments
> # nTx - No. threads per process in X
> # nTy - No. threads per process in Y
>  &EEPARMS
>  nTx=1,
>  nTy=1,
>  useMPI=.TRUE.,
>  &
>
> "data.mnc"
>
> # Example "data.mnc" file
> # Lines beginning "#" are comments
>  &MNC_01
> # mnc_echo_gvtypes=.FALSE.,
> # mnc_use_indir=.FALSE.,
>  mnc_use_outdir=.TRUE.,
>  mnc_outdir_str='mnc_test_',
> #mnc_outdir_date=.FALSE.,
> #monitor_mnc=.FALSE.,
> #timeave_mnc=.FALSE.,
> #snapshot_mnc=.FALSE.,
> #pickup_read_mnc=.FALSE.,
> #pickup_write_mnc=.FALSE.,
>  &
>
>
> I use 24 processors. This is the message I get while running
> the model.
>
> [dell at localhost input]$ mpirun -np 24 ./mitgcmuv > test
> EEBOOT_MINIMAL: No. of procs=     1 not equal to nPx*nPy=    24
> EEBOOT_MINIMAL: No. of procs=     1 not equal to nPx*nPy=    24
> EEDIE: earlier error in multi-proc/thread setting
> PROGRAM MAIN: ends with fatal Error
> STOP ABNORMAL END: PROGRAM MAIN
> (this error is repeated 24 times).
>
> Please provide some suggestions/solutions.
>
> Thanking all in advance
>
> Regards,
>
> Sachiko Mohanty
> Centre for Atmospheric Sc.
> IIT Delhi, India
>
>
>
> ------------------------------
>
> _______________________________________________
> MITgcm-support mailing list
> MITgcm-support at mitgcm.org
> http://mitgcm.org/mailman/listinfo/mitgcm-support
>
>
> End of MITgcm-support Digest, Vol 131, Issue 15
> ***********************************************
>