[MITgcm-support] Compilation with pgi fortran: problem solved

Gus Correa gus at ldeo.columbia.edu
Fri Jun 20 14:52:52 EDT 2008


Hi Zhi-Ping, Martin, list

I'm glad that it is working!
"Given enough eyeballs, all bugs are shallow."
Once more, Linus' Law proves to be right.

Cheers,
Gus

Zhi-Ping Mei wrote:

> Gus and Martin,
>
> I am glad to tell that you are right on the compiler. When I changed 
> to mpif90, the model runs. Now the model runs past reading namelist, 
> and is still running.
>
> I will post again if I see any other problems.
>
> Thanks a lot for your help and have a great weekend.
>
> Zhi-Ping
>
> --- On Fri, 6/20/08, Gus Correa <gus at ldeo.columbia.edu> wrote:
>
>     From: Gus Correa <gus at ldeo.columbia.edu>
>     Subject: Re: [MITgcm-support] Compilation with pgi fortran
>     To: "MITgcm-support" <mitgcm-support at mitgcm.org>
>     Date: Friday, June 20, 2008, 5:41 PM
>
>Hi Zhi-Ping and list
>
>I re-read your original message.
>Here are a few thinking points.
>
>1) Have you tried pgf90 instead of pgf77?
>Namelists are Fortran 90 features, although pgf77 may support them as an extension.
>In any case, it may be worth trying.
>Here I've been compiling MITgcm with pgf90, which I presume accepts a superset
>of pgf77.
>It doesn't hurt to use the Fortran 90 compiler.
>
>2) As you point out, in eesupp/src/nml_set_terminator.F,
>one sees that the "official" MITgcm namelist terminator is ' &',
>and the alternative is ' /', both preceded by a blank, probably to avoid
>spurious concatenations. Better to stick to ' /' with the leading blank.
>Why did the absence of the leading blank break the linking to the netCDF library?
>The reason may perhaps be found in the actual Makefile that was produced
>without a leading blank for NML_TERMINATOR; a Makefile produced this way
>may be mangled somehow.
>
>3) Maybe if you send the full part of STDOUT.0000 where the error
>messages appear, we can find something.
>The error seems to occur right after
>
>(PID.TID 0000.0001) S/R INI_PARMS ; starts to read PARM01
>
>when ini_parms is trying to read the namelist (the preceding namelist output is
>just a dump, after replacing the & by /).
>At this point it mentions some files in /tmp.
>I wonder if you have access permission to /tmp on your cluster.
>Check with your system administrator.
>
>But there may be an alternative.
>I Googled pgfio-F-228 (your error message),
>and look what I found, from a colleague who works here: :)
>
>http://forge.csail.mit.edu/pipermail/mitgcm-support/2007-June/004898.html
>
>That one seems to have been a case of lack of access permission to /tmp.
>
>Well, I hope this helps ...
>
>Gus Correa
>
>
>Zhi-Ping Mei wrote:
>
>> Hi Gus and Martin,
>>
>> A bit strange, I did not receive the message from Martin, but saw it 
>> quoted in Gus' message. But I appreciate the input from both of you.
>>
>> Below is the DEFINES line of my build options file:
>>
>> DEFINES='-DALLOW_USE_MPI -DALWAYS_USE_MPI -DWORDLENGTH=4 
>> -DNML_TERMINATOR="/" '
>>
>> When I do not leave a space before the /, I can link the netCDF 
>> library, so I stick to this way. Now the problem is with reading the 
>> namelist only. In fact, I did compile from scratch, that is, I removed all the 
>> files in the bin directory, then ran genmake2 --> make depend --> make.
>>
>> I haven't run other examples yet, but I may give it a try. I am currently 
>> running my own biological model, which is similar to darwin.
>>
>> Multi-line issue: I used to have only several lines (4-5) in sRef, 
>> tRef, and deltaZ (one line). What you saw in the output in the last 
>> message is one of my attempts to use one item per line, to make sure 
>> every value is read, because I found the model reads one line as a 
>> record. But that did not make any difference.
>>
>> Some file entries, like zonalWindFile=' ', are empty quoted strings, as I 
>> am running in offline mode.
>>
>> Zhi-Ping
>>
>>
>> --- On Fri, 6/20/08, Gus Correa <gus at ldeo.columbia.edu> wrote:
>>
>>     From: Gus Correa <gus at ldeo.columbia.edu>
>>     Subject: Re: [MITgcm-support] Compilation with pgi fortran
>>     To: "MITgcm-support" <mitgcm-support at mitgcm.org>
>>     Date: Friday, June 20, 2008, 3:04 PM
>>
>>Hi Zhi-Ping, Martin, list
>>
>>Zhi-Ping: Did the problem with the netCDF library go away?
>>Now I am not sure what is happening, whether it is the namelist being 
>>rejected at runtime,
>>or the netCDF library not being linked.
>>
>>Have you run other MITgcm examples on the same machine?
>>
>>Also, you may want to recompile from scratch.
>>Do a make clean, or make CLEAN, in the directory where you're building 
>>the model, remove the Makefile,
>>and run the genmake2 script again, to start clean, in case you didn't do 
>>this yet.
>>
>>Martin and Zhi-Ping: I seem to have the same multi-line style of 
>>namelist arrays that
>>Zhi-Ping is using. It has been working here (PGI 5.4, old stuff), e.g.:
>>
>> delR=  10., 10., 10., 10., 10., 10., 10., 10., 10., 10.,
>>        20., 30., 50., 80., 120., 150., 200., 250., 500., 500.,
>>        750., 750., 750., 750.,
>>
>>However, one of the versions of the data file sent by Zhi-Ping had one 
>>item per line for delZ, making the number of lines for a single item quite big.
>>I don't know if there is a limit on the number of continuation lines 
>>a single namelist item can use in PGI, but there may be a maximum.
>>
>>Yet another thing I just saw.
>>The names of the zonalWindFile, meridWindFile, etc., seem to be one sole 
>>double quote.
>>However, it may be two double quotes poorly formatted by the email and 
>>its fonts; it's hard to tell.
>>In any case, if you don't need any files, just comment out the lines 
>>with a leading hash #, or delete them.
>>Nil strings may be another source of problems.
>>
>>HIH
>>Gus Correa
>>
>>
>>****
>>
>>Martin Losch wrote:
>>
>>Hi Zhi-Ping,
>>
>>this could be the problem, because the last thing the model does is 
>>reading namelist PARM01. (I had a similar problem once; I can't remember 
>>with which compiler.)
>>
>>>
>>> tRef= 24.0, 23.0, 22.0, 21.0, 20.0, 19.0,
>>>       18.0, 17.0, 16.0, 15.0, 14.0, 13.0,
>>>       12.0, 11.0, 10.0, 9.0,  8.0,  7.0,
>>>       6.0,  5.0,  4.0,  3.0,  2.0,
>>> sRef= 34.65, 34.75, 34.82, 34.87, 34.90,
>>>       34.90, 34.86, 34.78, 34.69, 34.60,
>>>       34.58, 34.62, 34.68, 34.72, 34.73,
>>>       34.74, 34.73, 34.73, 34.72, 34.72,
>>>       34.71, 34.70, 34.69,
>>
>>Arrange these on a single line like this:
>>tRef= 24.0, 23.0, 22.0, 21.0, 20.0, 19.0, 18.0, 17.0, 16.0, 15.0,  14.0, 
>>13.0, 12.0, 11.0, 10.0, 9.0,  8.0,  7.0, 6.0,  5.0,  4.0,   3.0,  2.0,
>>(or, for a test, tRef = 23*1.,)
>>Some compilers can't handle continuation lines in namelists.
>>
>>Martin
>>
>>Zhi-Ping Mei wrote:
>>
>>> Hi Gus,
>>>
>>> Thanks for the detailed explanations.
>>>
>>> I use Emacs to edit all the files on a Linux system. I checked that the 
>>> quotes were paired. So we can exclude this possibility.
>>>
>>> I am using pgf77 version 6.1.
>>>
>>> In the data file, the namelist entries are in the format:
>>> name=value,
>>> But I also had a comma on the last item of each group. I removed the 
>>> comma from the last item of each group and ran the model. I observed 
>>> the same error.
>>>
>>> Below is the text copied from the data file (for clarity, I 
>>> removed some commented lines, but they are shown in the output):
>>>
>>> #
>>> # ******************************
>>> # Model parameters
>>> # Continuous equation parameters
>>> # ******************************
>>> &PARM01
>>> tRef= 24.0, 23.0, 22.0, 21.0, 20.0, 19.0,
>>>       18.0, 17.0, 16.0, 15.0, 14.0, 13.0,
>>>       12.0, 11.0, 10.0, 9.0,  8.0,  7.0,
>>>       6.0,  5.0,  4.0,  3.0,  2.0,
>>> sRef= 34.65, 34.75, 34.82, 34.87, 34.90,
>>>       34.90, 34.86, 34.78, 34.69, 34.60,
>>>       34.58, 34.62, 34.68, 34.72, 34.73,
>>>       34.74, 34.73, 34.73, 34.72, 34.72,
>>>       34.71, 34.70, 34.69,
>>> no_slip_sides=.false.,
>>> no_slip_bottom=.TRUE.,
>>> viscAz=1.E-3,
>>> viscAh=1.E4,
>>> diffKhT=1.E2,
>>> diffKzT=1.E-5,
>>> diffKhS=1.E2,
>>> diffKzS=1.E-5,
>>> beta=1.E-11,
>>> tAlpha=2.E-4,
>>> sBeta =7.4E-4,
>>> gravity=9.81,
>>> gBaro=9.81,
>>> rigidLid=.FALSE.,
>>> implicitFreeSurface=.true.,
>>> eosType='POLY3',
>>> saltStepping=.TRUE.,
>>> tempStepping=.TRUE.,
>>> momStepping=.TRUE.,
>>> implicitDiffusion=.true.,
>>> implicitViscosity=.true.,
>>> allowFreezing=.false.,
>>> useSingleCpuIO=.TRUE.,
>>> useCDscheme=.FALSE.,
>>> tempAdvScheme = 3,
>>> saltAdvScheme = 3,
>>> /
>>> # **************************
>>> # Elliptic solver parameters
>>> # **************************
>>> &PARM02
>>> cg2dMaxIters=300,
>>> cg2dTargetResidual=1.E-7,
>>> /
>>>
>>> # ************************
>>> # Time stepping parameters
>>> # ************************
>>> &PARM03
>>> nIter0=0,
>>> nTimeSteps=35040,
>>> deltaTmom=900.,
>>> deltaTtracer=900.,
>>> deltaTClock=900.,
>>> abEps=0.1,
>>> pChkptFreq  = 31104000.,
>>> chkptFreq   = 31104000.,
>>> dumpFreq = 604800.,
>>> taveFreq = 604800.,
>>> monitorFreq = 604800.,
>>> tauThetaClimRelax=0.,
>>> tauSaltClimRelax=0.,
>>> periodicExternalForcing=.TRUE.,
>>> externForcingPeriod=2592000.,
>>> externForcingCycle=31104000.,
>>> /
>>>
>>> # *******************
>>> # Gridding parameters
>>> # *******************
>>> &PARM04
>>> usingCartesianGrid=.FALSE.,
>>> usingSphericalPolarGrid=.TRUE.,
>>> delX=360*1.E0,
>>> delY=160*1.E0,
>>> delZ=10., 10., 15., 20., 20., 25., 35., 50., 75.,
>>>      100., 150., 200., 275., 350., 415., 450., 500.,
>>>      500.,  500., 500., 500., 500., 500.,
>>> phimin=-80.,
>>> thetamin=0.,
>>> /
>>> # **********
>>> # Data Files
>>> # **********
>>> &PARM05
>>> bathyFile=       'input/bathy_fl.bin',
>>> #hydrogThetaFile= 'input/LEVITUS_1x1_ptmp_NEW_corK',
>>> #hydrogSaltFile=  'input/LEVITUS_1x1_salt_NEW_corK',
>>> zonalWindFile=   '',
>>> meridWindFile=   '',
>>> thetaClimFile=   '',
>>> saltClimFile=    '',
>>> surfQFile=       '',
>>> EmPmRFile=       '',
>>> /
>>>
>>> %%%% end of data file
>>>
>>> Following are the error message from PBS system of the cluster:
>>>
>>> PGFIO-F-228/namelist read/unit=11/end of file reached without finding group.
>>>  File name = /tmp/FTNcaaaaybtaz    formatted, sequential access
>>>  record = 141
>>>  In source file ini_parms.f, at line number 3496
>>> PGFIO-F-228/namelist read/unit=11/end of file reached without finding group.
>>> PGFIO-F-228/namelist read/unit=11/end of file reached without finding group.
>>>  File name = /tmp/FTNcaaaaizBaz    formatted, sequential access
>>>  record = 141
>>>  In source file ini_parms.f, at line number 3496
>>> PGFIO-F-228/namelist read/unit=11/end of file reached without finding group.
>>>  File name = /tmp/FTNcaaaaaifaz    formatted, sequential access
>>>  record = 141
>>>  In source file ini_parms.f, PGFIO-F-228/namelist read/unit=11/end of
>>> file reached without finding group.
>>>
>>> %%%%%%%
>>>
>>> Finally the output of STDOUT.0000:
>>>
>>> (PID.TID 0000.0001)
>>> (PID.TID 0000.0001) // 
>>> ======================================================
>>> (PID.TID 0000.0001) //                      MITgcm UV
>>> (PID.TID 0000.0001) //                      =========
>>> (PID.TID 0000.0001) // 
>>> ======================================================
>>> (PID.TID 0000.0001) // execution environment starting up...
>>> (PID.TID 0000.0001)
>>> (PID.TID 0000.0001) // MITgcmUV version:  checkpoint58b_post
>>> (PID.TID 0000.0001) // Build user:        meiz0001
>>> (PID.TID 0000.0001) // Build host:        masternode001
>>> (PID.TID 0000.0001) // Build date:        Thu Jun 19 21:40:00 ADT 2008
>>> (PID.TID 0000.0001)
>>> (PID.TID 0000.0001) // 
>>> =======================================================
>>> (PID.TID 0000.0001) // Execution Environment parameter file "eedata"
>>> (PID.TID 0000.0001) // 
>>> =======================================================
>>> (PID.TID 0000.0001) ># Example "eedata" file
>>> (PID.TID 0000.0001) ># Lines beginning "#" are comments
>>> (PID.TID 0000.0001) ># nTx - No. threads per process in X
>>> (PID.TID 0000.0001) ># nTy - No. threads per process in Y
>>> (PID.TID 0000.0001) > &EEPARMS
>>> (PID.TID 0000.0001) > nTx=1,nTy=1
>>> (PID.TID 0000.0001) > /
>>> (PID.TID 0000.0001) ># Note: Some systems use & as the
>>> (PID.TID 0000.0001) ># namelist terminator. Other systems
>>> (PID.TID 0000.0001) ># use a / character (as shown here).
>>> (PID.TID 0000.0001)
>>> (PID.TID 0000.0001) // 
>>> =======================================================
>>> (PID.TID 0000.0001) // Computational Grid Specification ( see files 
>>> "SIZE.h" )
>>> (PID.TID 0000.0001) //                                  ( and 
>>> "eedata"       )
>>> (PID.TID 0000.0001) // 
>>> =======================================================
>>> (PID.TID 0000.0001)      nPx =    8 ; /* No. processes in X */
>>> (PID.TID 0000.0001)      nPy =    4 ; /* No. processes in Y */
>>> (PID.TID 0000.0001)      nSx =    1 ; /* No. tiles in X per process */
>>> (PID.TID 0000.0001)      nSy =    1 ; /* No. tiles in Y per process */
>>> (PID.TID 0000.0001)      sNx =   45 ; /* Tile size in X */
>>> (PID.TID 0000.0001)      sNy =   40 ; /* Tile size in Y */
>>> (PID.TID 0000.0001)      OLx =    4 ; /* Tile overlap distance in X */
>>> (PID.TID 0000.0001)      OLy =    4 ; /* Tile overlap distance in Y */
>>> (PID.TID 0000.0001)      nTx =    1 ; /* No. threads in X per process */
>>> (PID.TID 0000.0001)      nTy =    1 ; /* No. threads in Y per process */
>>> (PID.TID 0000.0001)       Nr =   23 ; /* No. levels in the vertical */
>>> (PID.TID 0000.0001)       nX =  360 ; /* Total domain size in X ( = nPx*nSx*sNx ) */
>>> (PID.TID 0000.0001)       nY =  160 ; /* Total domain size in Y ( = nPy*nSy*sNy ) */
>>> (PID.TID 0000.0001)   nTiles =    1 ; /* Total no. tiles per process ( = nSx*nSy ) */
>>> (PID.TID 0000.0001)   nProcs =   32 ; /* Total no. processes ( = nPx*nPy ) */
>>> (PID.TID 0000.0001) nThreads =    1 ; /* Total no. threads per process ( = nTx*nTy ) */
>>> (PID.TID 0000.0001) usingMPI =    F ; /* Flag used to control whether 
>>> MPI is in use */
>>> (PID.TID 0000.0001)                   /*  note: To execute a program 
>>> with MPI calls */
>>> (PID.TID 0000.0001)                   /*  it must be launched 
>>> appropriately e.g     */
>>> (PID.TID 0000.0001)                   /*  "mpirun -np 64 
>>> ......"                    */
>>> (PID.TID 0000.0001) useCoupler=   F ; /* Flag used to control 
>>> communications with */
>>> (PID.TID 0000.0001)                   /*  other model components, 
>>> through a coupler */
>>> (PID.TID 0000.0001)
>>> (PID.TID 0000.0001) ======= Starting MPI parallel Run =========
>>> (PID.TID 0000.0001)  My Processor Name = node075
>>> (PID.TID 0000.0001)  Located at (  0,  0) on processor grid (0:  7,0:  3)
>>> (PID.TID 0000.0001)  Origin at  (   1,   1) on global grid (1: 360,1: 160)
>>> (PID.TID 0000.0001)  North neighbor = processor 0001
>>> (PID.TID 0000.0001)  South neighbor = processor 0003
>>> (PID.TID 0000.0001)   East neighbor = processor 0004
>>> (PID.TID 0000.0001)   West neighbor = processor 0028
>>> (PID.TID 0000.0001) // 
>>> ======================================================
>>> (PID.TID 0000.0001) // Mapping of tiles to threads
>>> (PID.TID 0000.0001) // 
>>> ======================================================
>>> (PID.TID 0000.0001) // -o- Thread   1, tiles (   1:   1,   1:   1)
>>> (PID.TID 0000.0001)
>>> (PID.TID 0000.0001) // 
>>> ======================================================
>>> (PID.TID 0000.0001) // Tile <-> Tile connectvity table
>>> (PID.TID 0000.0001) // 
>>> ======================================================
>>> (PID.TID 0000.0001) // Tile number: 000001 (process no. = 000000)
>>> (PID.TID 0000.0001) //        WEST: Tile = 000008, Process = 000028, 
>>> Comm = messages
>>> (PID.TID 0000.0001) //                bi = 000001, bj = 000001
>>> (PID.TID 0000.0001) //        EAST: Tile = 000002, Process = 000004, 
>>> Comm = messages
>>> (PID.TID 0000.0001) //                bi = 000001, bj = 000001
>>> (PID.TID 0000.0001) //       SOUTH: Tile = 000025, Process = 000003, 
>>> Comm = messages
>>> (PID.TID 0000.0001) //                bi = 000001, bj = 000001
>>> (PID.TID 0000.0001) //       NORTH: Tile = 000009, Process = 000001, 
>>> Comm = messages
>>> (PID.TID 0000.0001) //                bi = 000001, bj = 000001
>>> (PID.TID 0000.0001)
>>> (PID.TID 0000.0001) // 
>>> =======================================================
>>> (PID.TID 0000.0001) // Model parameter file "data"
>>> (PID.TID 0000.0001) // 
>>> =======================================================
>>> (PID.TID 0000.0001) >#
>>> (PID.TID 0000.0001) ># ******************************
>>> (PID.TID 0000.0001) ># Model parameters
>>> (PID.TID 0000.0001) ># Continuous equation parameters
>>> (PID.TID 0000.0001) ># ******************************
>>> (PID.TID 0000.0001) >&PARM01
>>> (PID.TID 0000.0001) >tRef= 24.0,
>>> (PID.TID 0000.0001) >      23.0,
>>> (PID.TID 0000.0001) >      22.0,
>>> (PID.TID 0000.0001) >      21.0,
>>> (PID.TID 0000.0001) >      20.0,
>>> (PID.TID 0000.0001) >      19.0,
>>> (PID.TID 0000.0001) >      18.0,
>>> (PID.TID 0000.0001) >      17.0,
>>> (PID.TID 0000.0001) >      16.0,
>>> (PID.TID 0000.0001) >      15.0,
>>> (PID.TID 0000.0001) >      14.0,
>>> (PID.TID 0000.0001) >      13.0,
>>> (PID.TID 0000.0001) >      12.0,
>>> (PID.TID 0000.0001) >      11.0,
>>> (PID.TID 0000.0001) >      10.0,
>>> (PID.TID 0000.0001) >      9.0,
>>> (PID.TID 0000.0001) >      8.0,
>>> (PID.TID 0000.0001) >      7.0,
>>> (PID.TID 0000.0001) >      6.0,
>>> (PID.TID 0000.0001) >      5.0,
>>> (PID.TID 0000.0001) >      4.0,
>>> (PID.TID 0000.0001) >      3.0,
>>> (PID.TID 0000.0001) >      2.0,
>>> (PID.TID 0000.0001) >sRef= 34.65,
>>> (PID.TID 0000.0001) >      34.75,
>>> (PID.TID 0000.0001) >      34.82,
>>> (PID.TID 0000.0001) >      34.87,
>>> (PID.TID 0000.0001) >      34.90,
>>> (PID.TID 0000.0001) >      34.90,
>>> (PID.TID 0000.0001) >      34.86,
>>> (PID.TID 0000.0001) >      34.78,
>>> (PID.TID 0000.0001) >      34.69,
>>> (PID.TID 0000.0001) >      34.60,
>>> (PID.TID 0000.0001) >      34.58,
>>> (PID.TID 0000.0001) >      34.62,
>>> (PID.TID 0000.0001) >      34.68,
>>> (PID.TID 0000.0001) >      34.72,
>>> (PID.TID 0000.0001) >      34.73,
>>> (PID.TID 0000.0001) >      34.74,
>>> (PID.TID 0000.0001) >      34.73,
>>> (PID.TID 0000.0001) >      34.73,
>>> (PID.TID 0000.0001) >      34.72,
>>> (PID.TID 0000.0001) >      34.72,
>>> (PID.TID 0000.0001) >      34.71,
>>> (PID.TID 0000.0001) >      34.70,
>>> (PID.TID 0000.0001) >      34.69,
>>> (PID.TID 0000.0001) >no_slip_sides=.false.,
>>> (PID.TID 0000.0001) >no_slip_bottom=.TRUE.,
>>> (PID.TID 0000.0001) >viscAz=1.E-3,
>>> (PID.TID 0000.0001) >viscAh=1.E4,
>>> (PID.TID 0000.0001) >diffKhT=1.E2,
>>> (PID.TID 0000.0001) >diffKzT=1.E-5,
>>> (PID.TID 0000.0001) >diffKhS=1.E2,
>>> (PID.TID 0000.0001) >diffKzS=1.E-5,
>>> (PID.TID 0000.0001) >beta=1.E-11,
>>> (PID.TID 0000.0001) >tAlpha=2.E-4,
>>> (PID.TID 0000.0001) >sBeta =7.4E-4,
>>> (PID.TID 0000.0001) >gravity=9.81,
>>> (PID.TID 0000.0001) >gBaro=9.81,
>>> (PID.TID 0000.0001) >rigidLid=.FALSE.,
>>> (PID.TID 0000.0001) >implicitFreeSurface=.true.,
>>> (PID.TID 0000.0001) >eosType='POLY3',
>>> (PID.TID 0000.0001) >saltStepping=.TRUE.,
>>> (PID.TID 0000.0001) >tempStepping=.TRUE.,
>>> (PID.TID 0000.0001) >momStepping=.TRUE.,
>>> (PID.TID 0000.0001) >implicitDiffusion=.true.,
>>> (PID.TID 0000.0001) >implicitViscosity=.true.,
>>> (PID.TID 0000.0001) >allowFreezing=.false.,
>>> (PID.TID 0000.0001) >useSingleCpuIO=.TRUE.,
>>> (PID.TID 0000.0001) >useCDscheme=.FALSE.,
>>> (PID.TID 0000.0001) >tempAdvScheme = 3,
>>> (PID.TID 0000.0001) >saltAdvScheme = 3,
>>> (PID.TID 0000.0001) >/
>>> (PID.TID 0000.0001) >
>>> (PID.TID 0000.0001) >#
>> **************************
>>> (PID.TID 0000.0001) ># Elliptic solver parameters
>>> (PID.TID 0000.0001) ># **************************
>>> (PID.TID 0000.0001) >&PARM02
>>> (PID.TID 0000.0001) >cg2dMaxIters=300,
>>> (PID.TID 0000.0001) >cg2dTargetResidual=1.E-7,
>>> (PID.TID 0000.0001) >/
>>> (PID.TID 0000.0001) >
>>> (PID.TID 0000.0001) ># ************************
>>> (PID.TID 0000.0001) ># Time stepping parameters
>>> (PID.TID 0000.0001) ># ************************
>>> (PID.TID 0000.0001) >&PARM03
>>> (PID.TID 0000.0001) >nIter0=0,
>>> (PID.TID 0000.0001) >nTimeSteps=35040,
>>> (PID.TID 0000.0001) >#Mei deltaTmom=10800.,
>>> (PID.TID 0000.0001) >#Mei
>>> (PID.TID 0000.0001) >deltaTmom=900.,
>>> (PID.TID 0000.0001) >#Mei deltaTtracer=10800.,
>>> (PID.TID 0000.0001) >#Mei
>>> (PID.TID 0000.0001) >deltaTtracer=900.,
>>> (PID.TID 0000.0001) >#Mei deltaTClock =10800.,
>>> (PID.TID 0000.0001) >#Mei
>>> (PID.TID 0000.0001) >deltaTClock=900.,
>>> (PID.TID 0000.0001) >abEps=0.1,
>>> (PID.TID 0000.0001) >pChkptFreq  = 31104000.,
>>> (PID.TID 0000.0001) >chkptFreq   = 31104000.,
>>> (PID.TID 0000.0001) >#pChkptFreq  = 2592000.,
>>> (PID.TID 0000.0001) >#chkptFreq   = 2592000.,
>>> (PID.TID 0000.0001) >#Mei dumpFreq    = 31104000.,
>>> (PID.TID 0000.0001) >#Mei
>>> (PID.TID 0000.0001) >dumpFreq = 604800.,
>>> (PID.TID 0000.0001) >#Mei taveFreq    = 31104000.,
>>> (PID.TID 0000.0001) >#Mei
>>> (PID.TID 0000.0001) >taveFreq = 604800.,
>>> (PID.TID 0000.0001) >#taveFreq    = 86400.,
>>> (PID.TID 0000.0001) >#monitorFreq = 10800.,
>>> (PID.TID 0000.0001) >#Mei monitorFreq = 31104000.,
>>> (PID.TID 0000.0001) >#Mei
>>> (PID.TID 0000.0001) >monitorFreq = 604800.,
>>> (PID.TID 0000.0001) >tauThetaClimRelax=0.,
>>> (PID.TID 0000.0001) >tauSaltClimRelax=0.,
>>> (PID.TID 0000.0001) >periodicExternalForcing=.TRUE.,
>>> (PID.TID 0000.0001) >externForcingPeriod=2592000.,
>>> (PID.TID 0000.0001) >externForcingCycle=31104000.,
>>> (PID.TID 0000.0001) >/
>>> (PID.TID 0000.0001) >
>>> (PID.TID 0000.0001) ># *******************
>>> (PID.TID 0000.0001) ># Gridding parameters
>>> (PID.TID 0000.0001) ># *******************
>>> (PID.TID 0000.0001) >&PARM04
>>> (PID.TID 0000.0001) >usingCartesianGrid=.FALSE.,
>>> (PID.TID 0000.0001) >usingSphericalPolarGrid=.TRUE.,
>>> (PID.TID 0000.0001) >delX=360*1.E0,
>>> (PID.TID 0000.0001) >delY=160*1.E0,
>>> (PID.TID 0000.0001) >delZ=10.,
>>> (PID.TID 0000.0001) >     10.,
>>> (PID.TID 0000.0001) >     15.,
>>> (PID.TID 0000.0001) >     20.,
>>> (PID.TID 0000.0001) >     20.,
>>> (PID.TID 0000.0001) >     25.,
>>> (PID.TID 0000.0001) >     35.,
>>> (PID.TID 0000.0001) >     50.,
>>> (PID.TID 0000.0001) >     75.,
>>> (PID.TID 0000.0001) >     100.,
>>> (PID.TID 0000.0001) >     150.,
>>> (PID.TID 0000.0001) >     200.,
>>> (PID.TID 0000.0001) >     275.,
>>> (PID.TID 0000.0001) >     350.,
>>> (PID.TID 0000.0001) >     415.,
>>> (PID.TID 0000.0001) >     450.,
>>> (PID.TID 0000.0001) >     500.,
>>> (PID.TID 0000.0001) >     500.,
>>> (PID.TID 0000.0001) >     500.,
>>> (PID.TID 0000.0001) >     500.,
>>> (PID.TID 0000.0001) >     500.,
>>> (PID.TID 0000.0001) >     500.,
>>> (PID.TID 0000.0001) >     500.,
>>> (PID.TID 0000.0001) >phimin=-80.,
>>> (PID.TID 0000.0001) >thetamin=0.,
>>> (PID.TID 0000.0001) >/
>>> (PID.TID 0000.0001) >
>>> (PID.TID 0000.0001) ># **********
>>> (PID.TID 0000.0001) ># Data Files
>>> (PID.TID 0000.0001) ># **********
>>> (PID.TID 0000.0001) >&PARM05
>>> (PID.TID 0000.0001) >bathyFile=       'input/bathy_fl.bin',
>>> (PID.TID 0000.0001) >#hydrogThetaFile= 'input/LEVITUS_1x1_ptmp_NEW_corK',
>>> (PID.TID 0000.0001) >#hydrogSaltFile=  'input/LEVITUS_1x1_salt_NEW_corK',
>>> (PID.TID 0000.0001) >zonalWindFile=   '',
>>> (PID.TID 0000.0001) >meridWindFile=   '',
>>> (PID.TID 0000.0001) >thetaClimFile=   '',
>>> (PID.TID 0000.0001) >saltClimFile=    '',
>>> (PID.TID 0000.0001) >surfQFile=       '',
>>> (PID.TID 0000.0001) >EmPmRFile=       '',
>>> (PID.TID 0000.0001) >/
>>> (PID.TID 0000.0001)
>>> (PID.TID 0000.0001) S/R INI_PARMS ; starts to read PARM01
>>>
>>> %%%%%%%%%%% End of the output.
>>>
>>> One more thing, I use pgf77, not pgf90. Does it matter?
>>>
>>> In the eesupp/src/nml_set_terminator.F, I see the following lines:
>>>
>>> #include "CPP_OPTIONS.h"
>>>
>>> #define FTN_NML_F90
>>> #ifndef NML_TERMINATOR
>>> #define NML_TERMINATOR  ' &'
>>> #else
>>> #define NML_TERMINATOR  ' /'
>>>
>>> #endif
>>>
>>> What do they do? Are they in conflict with -DNML_TERMINATOR="/" in the
>>> build_options file?
>>>
>>> Please let me know if you need further information.
>>>
>>> Zhi-Ping
>>>
>>>
>>>
>>>
>>> --- On Thu, 6/19/08, Gus Correa <gus at ldeo.columbia.edu> wrote:
>>>
>>> > From: Gus Correa <gus at ldeo.columbia.edu>
>>> > Subject: Re: [MITgcm-support] Compilation with pgi fortran
>>> > To: "MITgcm Support" <mitgcm-support at mitgcm.org>
>>> > Date: Thursday, June 19, 2008, 10:40 PM
>>> > Hi Zhi-Ping and list
>>> >
>>> > The "official" Fortran90 namelist terminator is
>>> > the slash ("/"),
>>> > but somehow all namelists in the MITgcm example seem to use
>>> > the
>>> > ampersand ("&") as a terminator (see data,
>>> > eedata, and other
>>> > input parameter files).
>>> >
>>> > Some compilers are lax and take the ampersand also.
>>> > PGI was not tolerant, and required the slash as a namelist
>>> > terminator.
>>> > I haven't used the latest PGI versions, so things may
>>> > have changed.
>>> >
>>> > So depending on the compiler, you have to replace the
>>> > namelist terminator
>>> > by using the pre-processor directive
>>> > -DNML_TERMINATOR="/".
>>> > This will strip off the ampersand and replace it by slash
>>> > on all namelists
>>> > on the fly when the program runs.
>>> >
>>> > Indeed, as you noticed, I have a blank before the slash.
>>> > I don't really remember why, but since it is being
>>> > overwritten on the
>>> > namelist file, the blank is probably there for safety, to
>>> > avoid
>>> > concatenating
>>> > the slash with the last value on the namelist.
>>> > So, the blank shouldn't hurt.
>>> >
>>> > Anyway, did you make sure that all the single quotes and
>>> > double quotes
>>> > in the
>>> > pre-processor directives are matched and paired?
>>> > The fact that your netcdf library didn't link,
>>> > after you changed the build_options file,
>>> > suggests to me that maybe the compilation
>>> > got lost on quotes that were not paired.
>>> >
>>> > Another thing to check is if all the namelist items are of
>>> > the form
>>> > name=value,
>>> > and that there is a comma at the end (except the last item on
>>> > each namelist).
>>> >
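[Editorial note: the name=value, convention described above can be checked mechanically. A hedged shell sketch, purely illustrative: the sample file and the filtering pattern are invented, and the final item before the terminator may legitimately lack a comma.]

```shell
# Hedged sketch: list namelist body lines that do not end in a comma.
# Group headers (&PARM01), terminators (/), comments (#), and blank lines
# are skipped; the sample data file is fabricated for this example.
printf '&PARM01\nviscAh=1.E4,\ngravity=9.81\n/\n' > data.example
grep -vE '^[[:space:]]*(&|/|#|$)' data.example | grep -v ',$'
```

Here the pipeline prints only the `gravity=9.81` line, the one missing its trailing comma.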
>>> > A more remote possibility, is if you edited your namelist
>>> > files in a Windows
>>> > computer (or an old Mac, or a DOS machine),
>>> > and you if are running the experiment on a Linux machine.
>>> > The line terminators on those computers are different from Linux's,
>>> > and parsing input files this way is a total mess.
>>> > A couple of computer users here had this problem,
>>> > and it was hard to understand what was going on,
>>> > until they told me where they had been preparing their
>>> > input files.
>>> >
>>> > If this is the case, you should edit your files (data, etc)
>>> > in the Linux
>>> > machine,
>>> > or use a line terminator converter.
>>> > See this article:
>>> >
>>> > http://en.wikipedia.org/wiki/Newline
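[Editorial note: both the detection and the conversion mentioned above fit in a few lines of shell. A hedged sketch: the data.dos/data.unix names and the sample content are invented, and where installed, dos2unix does the same job.]

```shell
# Hedged sketch: detect DOS (CRLF) line endings in a namelist file and
# strip them.  The sample file is fabricated to demonstrate the check.
printf 'tRef= 24.0,\r\nsRef= 34.65,\r\n' > data.dos
cr=$(printf '\r')
if grep -q "$cr" data.dos; then
    tr -d '\r' < data.dos > data.unix
    echo "converted CRLF to LF"
fi
```

tr simply deletes every carriage return, which is safe for these input files since a bare CR never carries meaning in a namelist.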
>>> >
>>> > Otherwise, please send more details on the error messages.
>>> >
>>> > HIH
>>> > Gus Correa
>>> >
>>> > Zhi-Ping Mei wrote:
>>> >
>>> > >Thanks Gus,
>>> > >
>>>
> > >
>>> > >
>>> > >
>>> > >
>>> > >>Do you have this line on your build_options file,
>>> > >>to define "/" as the namelist terminator?
>>> > >>
>>> > >>DEFINES='-DWORDLENGTH=4 -DNML_TERMINATOR=" /" '
>>> > >>
>>> > >>
>>> > >>
>>> > >I have -DWORDLENGTH=4, but did not have
>>> > -DNML_TERMINATOR=" /" '. So I added
>>> > -DNML_TERMINATOR=" /" ' to the DEFINES. I see
>>> > the same problem.
>>> > >
>>> > >Do you mean there is a space before /? I tried
>>> > >that too. When a space is put before /, the netcdf library
>>> > >can't be linked.
>>> > >
>>> > >
>>> > >Zhi-Ping
>>> >
> >
>>> > >
>>> > >>Zhi-Ping Mei wrote:
>>> > >>
>>> > >>
>>> > >>
>>> > >>>Hello all,
>>> > >>>
>>> > >>>I am trying to compile MITgcm with PGI fortran. The
>>> > >>>compilation went fine and the binary mitgcmuv is
>>> > >>>generated.
>>> > >>
>>> > >>
>>> > >>>However, when I run the binary, I got the following error:
>>> > >>>
>>> > >>>PGFIO-F-228/namelist read/unit=11/end of file reached without finding group.
>>> > >>>File name = /tmp/FTNcaaaaaDxah formatted, sequential access record = 85
>>> > >>>In source file ini_parms.f, PGFIO-F-228/namelist read/unit=11/end of file reached without finding group.
>>> > >>>PGFIO-F-228/namelist read/unit=11/end of file reached without finding group.
>>> > >>>PGFIO-F-228/namelist read/unit=11/end of file reached without finding group.
>>> > >>>at line number 3496
>>> > >>>...
>>> > >>>
>>> > >>>which appears to be related to reading 'data'.
>>> > >>>
>>> > >>>When I compiled with g77, I never encountered such a
>>> > >>>problem. So I suspect it's related to compilation.
>>> > >>>
>>> > >>>The following are the lines of the build options:
>>> > >>>===================================
>>> > >>>#!/bin/bash
>>> > >>>#
>>> > >>># $Header: /u/gcmpack/MITgcm/tools/build_options/linux_ia32_pgf77+mpi,v 1.5 2004/09/25 00:42:14 heimbach Exp $
>>> > >>>#
>>> > >>>FC='mpif77'
>>> > >>>CC='mpicc'
>>> > >>>LINK='mpif77'
>>> > >>>DEFINES='-DALLOW_USE_MPI -DALWAYS_USE_MPI -DWORDLENGTH=4'
>>> > >>>CPP='cpp -traditional -P'
>>> > >>>INCLUDES='-I/usr/local/mpich/1.2.5/ip/up/pgi/ssh/include -I/usr/local/netcdf/pgi/include'
>>> > >>>LIBS='-L/usr/local/mpich/1.2.5/ip/up/pgi/ssh/lib -lfmpich -lmpich -L/usr/local/netcdf/pgi/lib'
>>> > >>>if test "x$IEEE" = x ; then
>>> > >>> # No need for IEEE-754
>>> > >>> FFLAGS='-byteswapio -r8 -Mnodclchk -Mextend'
>>> > >>> FOPTIM='-fastsse -Mvect=cachesize:524288,transform'
>>> > >>>else
>>> > >>> # Try to follow IEEE-754
>>> > >>> FFLAGS='-byteswapio -r8 -Mnodclchk -Mextend'
>>> > >>> FOPTIM='-O0 -Mvect=sse -Mscalarsse -Mcache_align -Mnoflushz -Kieee'
>>> > >>>fi
>>> > >>>==================================================
>
>and the following are the lines of the job submission script:
>
>``````````````````````````````````````````````````````````````````````````````````````````
>#PBS -S /bin/bash
>#PBS -N mitgcmuv
>#PBS -l nodes=16:ppn=2
>#PBS -l walltime=120:00:00
>#PBS -o qsub.output
>#PBS -j oe
>
># torch.mpi Last Modified Jan 16, 2008, Kristian Strickland
>
># Determine number of processors
>NPROCS=`wc -l < $PBS_NODEFILE`
>
># Pad the log with extra info
>echo "PBS: qsub was run on $PBS_O_HOST"
>echo "PBS: job \"$PBS_JOBNAME\" submitted while in: $PBS_O_WORKDIR"
>echo "PBS: submitted to queue: $PBS_O_QUEUE, execution in queue: $PBS_QUEUE"
>echo "PBS: execution mode is $PBS_ENVIRONMENT"
>echo "PBS: this job has allocated $NPROCS CPUs"
>
>## Next two lines for debugging
>#echo "PBS: node file is $PBS_NODEFILE and contains one CPU per node listed:"
>#echo `cat $PBS_NODEFILE`
>echo "PBS: PATH = $PBS_O_PATH"
>echo ----------------------------------------
>
>export MPICH="/usr/local/mpich/1.2.5/ip/up/pgi/ssh"
>export MPICH_PATH="${MPICH}/bin"
>export MPICH_LIB="${MPICH}/lib"
>export PATH="${MPICH_PATH}:${PATH}"
>export LD_LIBRARY_PATH="${MPICH_LIB}:${LD_LIBRARY_PATH}"
>
>cd $PBS_O_WORKDIR
>
># Run the parallel MPI executable "a.out"
>mpirun -machinefile ${PBS_NODEFILE} -np $NPROCS ./${PBS_JOBNAME}
>#>& out.dat
>```````````````````````````````````````````````````````````````````
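Since much of this thread concerns MITgcm's namelist terminator (' /' or ' &', each preceded by a leading blank), here is a small, hypothetical Python helper (not part of MITgcm) for flagging terminator lines in a `data` file that are missing that blank:

```python
def check_namelist_terminators(text):
    """Return the 1-based line numbers of namelist terminator lines
    ('/' or '&') that lack the leading blank MITgcm expects."""
    bad = []
    for lineno, line in enumerate(text.splitlines(), 1):
        stripped = line.rstrip()
        # A bare '/' or '&' in column one has no leading blank;
        # ' /' and ' &' (with the blank) are left alone.
        if stripped in ("/", "&"):
            bad.append(lineno)
    return bad

sample = " &PARM01\n tRef=20.,\n/\n"
print(check_namelist_terminators(sample))  # -> [3]
```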
>
>I would appreciate it if someone familiar with compilation could take a look and give me some advice. Thanks in advance,
>
>Zhi-Ping
>
>Mount Allison University
>New Brunswick
>Canada
>
>
>_______________________________________________
>MITgcm-support mailing list
>MITgcm-support at mitgcm.org
>http://mitgcm.org/mailman/listinfo/mitgcm-support
>
>-- 
>---------------------------------------------------------------------
>Gustavo J. Ponce Correa, PhD - Email: gus at ldeo.columbia.edu
>Lamont-Doherty Earth Observatory - Columbia University
>P.O. Box 1000 [61 Route 9W] - Palisades, NY, 10964-8000 - USA
>Oceanography Bldg., Rm. 103-D, ph. (845) 365-8911, fax (845) 365-8736
>---------------------------------------------------------------------




More information about the MITgcm-support mailing list