[MITgcm-support] File Read Errors, DWORDLENGTH

helen hill helen0hill at gmail.com
Wed Jul 18 09:53:37 EDT 2007


This is the bit you need from genmake:

   case nas+pgi:
     set LN         = ( '/bin/ln -s' )
     set CPP        = ( '/lib/cpp -traditional -P' )
     set DEFINES    = ( ${DEFINES} '-DALLOW_USE_MPI -DALWAYS_USE_MPI -DWORDLENGTH=4' )
     set FC         = ( 'ifort' )
     set FFLAGS     = ( '-fp-model strict -132 -r8 -i4 -w95 -W0 -WB -convert big_endian -assume byterecl' )
     set FOPTIM     = ( '-O0 -noalign' )
     set LINK       = ( 'ifort' )
     set INCLUDES   = ( '-I/opt/sgi/mpt/1.12.0.nas/include -I/opt/pd/netcdf/3.6.0-p1/include' )
     set LIBS       = ( '-L/opt/sgi/mpt/1.12.0.nas/lib -lmpi -L/opt/pd/netcdf/3.6.0-p1/lib -lnetcdf' )
     breaksw
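
With those flags the model reads its input binaries as big-endian,
fixed-record-length files, so a quick way to rule out a mismatched
bathy.bin is to compare its size on disk with one 2-D record. A rough
stand-alone check (not part of genmake or the model; nx, ny and the
4-byte word size are placeholders for your grid dimensions and file
precision):

   import os
   import numpy as np

   nx, ny = 90, 40       # placeholders: use your global grid dimensions
   wordlength = 4        # bytes per word, matching -DWORDLENGTH=4

   size = os.path.getsize("bathy.bin")
   expected = nx * ny * wordlength
   print("bathy.bin:", size, "bytes; one 2-D record:", expected, "bytes")

   # If the sizes agree, the field should also read back sensibly as
   # big-endian 32-bit floats (">f4", matching -convert big_endian).
   bathy = np.fromfile("bathy.bin", dtype=">f4", count=nx * ny).reshape(ny, nx)
   print("value range:", bathy.min(), bathy.max())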



On 7/18/07, Ryan Abernathey <rpa at mit.edu> wrote:
>
> Hi Helen,
> Thanks a lot. I tried to modify genmake to get it working, and it did
> compile, but apparently it doesn't run correctly.
> Unfortunately, I can't access those files. You could either open up the
> permissions or just email me the genmake. Are there any other
> differences in the code? I am using the offline model code that was in
> your home directory on ACES.
> -Ryan
> P.S. We're apparently in the same group on columbia, so you could make
> the directories group-readable.
>
> On Jul 17, 2007, at 6:29 PM, helen hill wrote:
>
> Hi Ryan,
> Running
> ../../tools/genmake -enable=ptracers -mods=../code -platform=nas+pgi
> in /u/helenhil/ACCtrace/offline_model/helen-meansolve-np16/bin makes a
> 16-processor executable of the kind you are looking for - let me know
> if the permissions are no good...
> H
>
>
> On 7/17/07, Ryan Abernathey <rpa at mit.edu> wrote:
> >
> > Hello,
> >
> > I'm trying to run an offline version of the MITgcm on columbia, and I've
> > encountered an error that I don't fully understand.
> >
> > The model crashes at this step:
> > (PID.TID 0002.0001)  MDSREADFIELD: opening global file: bathy.bin
> > with the following message on stderr:
> > forrtl: severe (36): attempt to access non-existent record, unit 9, file /nobackup2c/rpa/ACC/GCM_runs/fixup/level_01/1/bathy.bin
> > Image              PC                Routine            Line        Source
> > mitgcmuv           4000000000288AD0  Unknown             Unknown     Unknown
> > mitgcmuv           4000000000283E50  Unknown             Unknown     Unknown
> > mitgcmuv           400000000020D0D0  Unknown             Unknown     Unknown
> > mitgcmuv           40000000001710A0  Unknown             Unknown     Unknown
> > mitgcmuv           40000000001705D0  Unknown             Unknown     Unknown
> > mitgcmuv           400000000019D890  Unknown             Unknown     Unknown
> > mitgcmuv           400000000005DDB0  Unknown             Unknown     Unknown
> > mitgcmuv           4000000000162540  Unknown             Unknown     Unknown
> > mitgcmuv           400000000013A3E0  Unknown             Unknown     Unknown
> > mitgcmuv           4000000000153780  Unknown             Unknown     Unknown
> > mitgcmuv           40000000001649D0  Unknown             Unknown     Unknown
> > mitgcmuv           4000000000100500  Unknown             Unknown     Unknown
> > mitgcmuv           4000000000003B00  Unknown             Unknown     Unknown
> > libc.so.6.1        2000000003BB5C50  Unknown             Unknown     Unknown
> > mitgcmuv           4000000000003840  Unknown             Unknown     Unknown
> >
> > I followed the advice here:
> >
> > http://forge.csail.mit.edu/pipermail/mitgcm-support/2004-November/002731.html
> >
> > and changed -DWORDLENGTH=4 to -DWORDLENGTH=1.
> >
> > This caused the program to crash even earlier at:
> > (PID.TID 0000.0001) S/R INI_PARMS ; read PARM05 : OK
> > with the error:
> > forrtl: severe (66): output statement overflows record, unit 9, file /nobackup2c/rpa/ACC/GCM_runs/fixup/level_01/1/XC.001.001.data
> > Image              PC                Routine            Line        Source
> > mitgcmuv           4000000000288A90  Unknown             Unknown     Unknown
> > mitgcmuv           4000000000283E10  Unknown             Unknown     Unknown
> > mitgcmuv           400000000020D090  Unknown             Unknown     Unknown
> > mitgcmuv           4000000000171060  Unknown             Unknown     Unknown
> > mitgcmuv           4000000000170590  Unknown             Unknown     Unknown
> > mitgcmuv           40000000001C76C0  Unknown             Unknown     Unknown
> > mitgcmuv           40000000001C50C0  Unknown             Unknown     Unknown
> > mitgcmuv           400000000006BC40  Unknown             Unknown     Unknown
> > mitgcmuv           40000000000A9860  Unknown             Unknown     Unknown
> > mitgcmuv           400000000013DA60  Unknown             Unknown     Unknown
> > mitgcmuv           4000000000153720  Unknown             Unknown     Unknown
> > mitgcmuv           4000000000164990  Unknown             Unknown     Unknown
> > mitgcmuv           40000000001004C0  Unknown             Unknown     Unknown
> > mitgcmuv           4000000000003B00  Unknown             Unknown     Unknown
> > libc.so.6.1        2000000003BB5C50  Unknown             Unknown     Unknown
> > mitgcmuv           4000000000003840  Unknown             Unknown     Unknown
> >
> > --------------
> >
> > I am using an offline version of the model, given to me by Helen, which
> > uses genmake rather than genmake2. I added the following compile options
> > to the genmake script:
> >
> >    case nas+mpi:
> >         set CPP     = ( '/lib/cpp -traditional -P' )
> >         set DEFINES = ( ${DEFINES} '-DALLOW_USE_MPI -DALWAYS_USE_MPI -DWORDLENGTH=4' )  # or -DWORDLENGTH=1
> >         set LIBS    = ( '-L/opt/intel_9.0.03x/lib -lmpi' )
> >         set FC      = ( '/opt/intel/comp/9.1.039/bin/ifort' )
> >         set LINK    = ( '/opt/intel/comp/9.1.039/bin/ifort' )
> >         set FFLAGS  = ( '-mp -132 -r8 -i4 -w95 -W0 -WB -i-static -convert big_endian -assume byterecl' )
> >         breaksw
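> >
> > As far as I can tell, the record-length bookkeeping behind the two
> > errors above goes roughly like this (a stand-alone sketch with
> > placeholder sizes, not MITgcm source; nx and ny stand in for the grid
> > dimensions):
> >
> >    nx, ny = 90, 40   # placeholders for the grid dimensions
> >
> >    # With -assume byterecl, ifort counts RECL in bytes, so the model
> >    # needs -DWORDLENGTH=4: one 2-D record of 4-byte reals occupies
> >    # nx*ny*4 bytes.
> >    recl_wordlength_4 = 4 * nx * ny
> >
> >    # With -DWORDLENGTH=1 the declared record is only a quarter of
> >    # that, so the model's own writes overflow it (forrtl severe 66,
> >    # as with XC.001.001.data above).
> >    recl_wordlength_1 = 1 * nx * ny
> >
> >    # With WORDLENGTH=4, "attempt to access non-existent record"
> >    # (severe 36) on a read suggests the file on disk is shorter than
> >    # one full record, i.e. it does not match the grid size or
> >    # precision the model expects.
> >    print(recl_wordlength_4, recl_wordlength_1)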
> >
> > This is a 16-processor MPI run.
> >
> > I am new to the GCM, so please let me know if there is any more
> > information I can provide.
> > If anyone knows how to overcome this error, I would greatly appreciate
> > your advice.
> >
> > Cheers,
> >
> > __________________
> > Ryan Abernathey
> > rpa at mit.edu
> > MIT Ph.D. Student
> > Program in Oceans Atmospheres & Climate
> > http://web.mit.edu/rpa/
> >
> >
> >
> >

