[MITgcm-support] curvilinear grid
Madhu
madhu.iitd at gmail.com
Thu Oct 13 02:11:15 EDT 2011
Hi Martin,
Thanks for the reply.
Attached are the required files:
curvi.txt: the screen dump after executing MITgcm
list.txt: a listing of the files in the INPUT folder
Regards
Madhu
On Wed, Oct 12, 2011 at 2:24 PM, Martin Losch <Martin.Losch at awi.de> wrote:
> Madhu,
>
> We need more explanation/error messages in order to help you, e.g. the
> contents of your code directory and namelist (data) files, and the
> content (tail) of your curvi.txt.
>
> M.
>
>
> On Oct 12, 2011, at 8:34 AM, Madhu wrote:
>
> >
> > Hi All
> > I am a new user of MITgcm and want to use the model for the eastern coast
> > of India (in the Bay of Bengal) to study internal waves.
> > I have tested the model on a rectangular domain; it runs and the results
> > are OK.
> > Ultimately, however, I have to run the model on an orthogonal curvilinear
> > grid (attached as *.pdf) to reduce the computational time. I have generated
> > the curvilinear grid/bathymetry and the associated parameter files LONC.bin,
> > LATC.bin, DXF.bin, DYF.bin, RA.bin, topo.bin, etc. When the model is run
> > with the new grid setup, it fails to run and gives the following error
> > message:
> > ++++++++++++
> > $./mitgcmuv >curvi.txt
> > ** WARNING ** INI_MODEL_IO: use tiled-files to write sections (for OBCS)
> > EXCH1_RS_CUBE: Wrong Tiling
> > EXCH1_RS_CUBE: works only with sNx=sNy & nSx=6 & nSy=nPx=nPy=1
> > STOP ABNORMAL END: EXCH1_RS_CUBE: Wrong Tiling
> > $
> > ++++++++++++++
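> >
> > For reference, here is a minimal sketch (not part of MITgcm, just my own
> > check) of how one of these grid files could be read back to verify its
> > size and byte order. It assumes big-endian 64-bit reals, matching
> > readBinaryPrec=64 in my 'data' file, and the array shape from the SIZE.h
> > fragment below; the file name, shape, and the CONVERT= specifier (a
> > gfortran/ifort extension) are only illustrative:
> >
> > program check_grid
> >   ! Read one curvilinear grid file as big-endian 64-bit reals and
> >   ! print its min/max as a quick sanity check.
> >   implicit none
> >   integer, parameter :: nx = 50, ny = 100   ! adjust to match SIZE.h
> >   real(kind=8) :: lonc(nx, ny)
> >   integer :: iu
> >   open(newunit=iu, file='LONC.bin', status='old', form='unformatted', &
> >        access='stream', convert='big_endian')
> >   read(iu) lonc
> >   close(iu)
> >   print *, 'LONC.bin min/max:', minval(lonc), maxval(lonc)
> > end program check_grid
> >
> > (Compiled and run in the input directory, e.g. with
> > 'gfortran check_grid.f90 -o check_grid && ./check_grid'; if the files were
> > written little-endian, drop the CONVERT= specifier.)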
> >
> > I want to test it first with a single-tile case, hence the setup in the
> > 'SIZE.h' file was as follows:
> > & sNx = 50,
> > & sNy = 100,
> > & OLx = 1,
> > & OLy = 1,
> > & nSx = 1,
> > & nSy = 1,
> > & nPx = 1,
> > & nPy = 1,
> > & Nx = sNx*nSx*nPx,
> > & Ny = sNy*nSy*nPy,
> > & Nr = 5)
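> >
> > For comparison, the error message above says EXCH1_RS_CUBE only accepts
> > sNx=sNy together with nSx=6 and nSy=nPx=nPy=1. Purely as an illustration
> > (the tile size 50 is arbitrary, and whether such a six-face layout makes
> > sense for my regional grid is part of my question), a SIZE.h parameter
> > list of that shape would read:
> >
> > & sNx = 50,
> > & sNy = 50,
> > & OLx = 1,
> > & OLy = 1,
> > & nSx = 6,
> > & nSy = 1,
> > & nPx = 1,
> > & nPy = 1,
> > & Nx = sNx*nSx*nPx,
> > & Ny = sNy*nSy*nPy,
> > & Nr = 5)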
> >
> >
> >
> > I would be happy to hear any suggestions that could help me resolve this
> > issue.
> >
> > Also, if there is any other similar test case, please send me the information.
> >
> > Waiting for your reply.
> > --
> > Regards
> >
> > Madhu
> > <Grid.pdf>
> > _______________________________________________
> > MITgcm-support mailing list
> > MITgcm-support at mitgcm.org
> > http://mitgcm.org/mailman/listinfo/mitgcm-support
>
>
> _______________________________________________
> MITgcm-support mailing list
> MITgcm-support at mitgcm.org
> http://mitgcm.org/mailman/listinfo/mitgcm-support
>
--
Madhu
-------------- next part (curvi.txt) --------------
(PID.TID 0000.0001)
(PID.TID 0000.0001) // ======================================================
(PID.TID 0000.0001) // MITgcm UV
(PID.TID 0000.0001) // =========
(PID.TID 0000.0001) // ======================================================
(PID.TID 0000.0001) // execution environment starting up...
(PID.TID 0000.0001)
(PID.TID 0000.0001) // MITgcmUV version: checkpoint62s
(PID.TID 0000.0001) // Build user: ADRAO
(PID.TID 0000.0001) // Build host: ADRAO
(PID.TID 0000.0001) // Build date: Thu Oct 13 11:19:30 IST 2011
(PID.TID 0000.0001)
(PID.TID 0000.0001) // =======================================================
(PID.TID 0000.0001) // Execution Environment parameter file "eedata"
(PID.TID 0000.0001) // =======================================================
(PID.TID 0000.0001) ># Example "eedata" file
(PID.TID 0000.0001) ># Lines beginning "#" are comments
(PID.TID 0000.0001) ># nTx - No. threads per process in X
(PID.TID 0000.0001) ># nTy - No. threads per process in Y
(PID.TID 0000.0001) > &EEPARMS
(PID.TID 0000.0001) > useCubedSphereExchange=.TRUE.,
(PID.TID 0000.0001) > /
(PID.TID 0000.0001) ># Note: Some systems use & as the
(PID.TID 0000.0001) ># namelist terminator. Other systems
(PID.TID 0000.0001) ># use a / character (as shown here).
(PID.TID 0000.0001)
(PID.TID 0000.0001) // =======================================================
(PID.TID 0000.0001) // Computational Grid Specification ( see files "SIZE.h" )
(PID.TID 0000.0001) // ( and "eedata" )
(PID.TID 0000.0001) // =======================================================
(PID.TID 0000.0001) nPx = 1 ; /* No. processes in X */
(PID.TID 0000.0001) nPy = 1 ; /* No. processes in Y */
(PID.TID 0000.0001) nSx = 1 ; /* No. tiles in X per process */
(PID.TID 0000.0001) nSy = 1 ; /* No. tiles in Y per process */
(PID.TID 0000.0001) sNx = 100 ; /* Tile size in X */
(PID.TID 0000.0001) sNy = 50 ; /* Tile size in Y */
(PID.TID 0000.0001) OLx = 1 ; /* Tile overlap distance in X */
(PID.TID 0000.0001) OLy = 1 ; /* Tile overlap distance in Y */
(PID.TID 0000.0001) nTx = 1 ; /* No. threads in X per process */
(PID.TID 0000.0001) nTy = 1 ; /* No. threads in Y per process */
(PID.TID 0000.0001) Nr = 5 ; /* No. levels in the vertical */
(PID.TID 0000.0001) Nx = 100 ; /* Total domain size in X ( = nPx*nSx*sNx ) */
(PID.TID 0000.0001) Ny = 50 ; /* Total domain size in Y ( = nPy*nSy*sNy ) */
(PID.TID 0000.0001) nTiles = 1 ; /* Total no. tiles per process ( = nSx*nSy ) */
(PID.TID 0000.0001) nProcs = 1 ; /* Total no. processes ( = nPx*nPy ) */
(PID.TID 0000.0001) nThreads = 1 ; /* Total no. threads per process ( = nTx*nTy ) */
(PID.TID 0000.0001) usingMPI = F ; /* Flag used to control whether MPI is in use */
(PID.TID 0000.0001) /* note: To execute a program with MPI calls */
(PID.TID 0000.0001) /* it must be launched appropriately e.g */
(PID.TID 0000.0001) /* "mpirun -np 64 ......" */
(PID.TID 0000.0001) useCoupler= F ;/* Flag used to control communications with */
(PID.TID 0000.0001) /* other model components, through a coupler */
(PID.TID 0000.0001) printMapIncludesZeros= F ; /* print zeros in Std.Output maps */
(PID.TID 0000.0001) maxLengthPrt1D= 65 /* maxLength of 1D array printed to StdOut */
(PID.TID 0000.0001)
(PID.TID 0000.0001) // ======================================================
(PID.TID 0000.0001) // Mapping of tiles to threads
(PID.TID 0000.0001) // ======================================================
(PID.TID 0000.0001) // -o- Thread 1, tiles ( 1: 1, 1: 1)
(PID.TID 0000.0001)
(PID.TID 0000.0001) // ======================================================
(PID.TID 0000.0001) // Tile <-> Tile connectvity table
(PID.TID 0000.0001) // ======================================================
(PID.TID 0000.0001) // Tile number: 000001 (process no. = 000001)
(PID.TID 0000.0001) // WEST: Tile = 000001, Process = 000001, Comm = put
(PID.TID 0000.0001) // bi = 000001, bj = 000001
(PID.TID 0000.0001) // EAST: Tile = 000001, Process = 000001, Comm = put
(PID.TID 0000.0001) // bi = 000001, bj = 000001
(PID.TID 0000.0001) // SOUTH: Tile = 000001, Process = 000001, Comm = put
(PID.TID 0000.0001) // bi = 000001, bj = 000001
(PID.TID 0000.0001) // NORTH: Tile = 000001, Process = 000001, Comm = put
(PID.TID 0000.0001) // bi = 000001, bj = 000001
(PID.TID 0000.0001)
(PID.TID 0000.0001) DEBUG_MSG: ENTERED S/R THE_MODEL_MAIN
(PID.TID 0000.0001) DEBUG_MSG: CALLING S/R INITIALISE_FIXED
(PID.TID 0000.0001) INI_PARMS: opening model parameter file "data"
(PID.TID 0000.0001) OPEN_COPY_DATA_FILE: opening file data
(PID.TID 0000.0001) // =======================================================
(PID.TID 0000.0001) // Parameter file "data"
(PID.TID 0000.0001) // =======================================================
(PID.TID 0000.0001) > ====================
(PID.TID 0000.0001) ># | Model parameters |
(PID.TID 0000.0001) ># ====================
(PID.TID 0000.0001) >#
(PID.TID 0000.0001) ># Continuous equation parameters
(PID.TID 0000.0001) > &PARM01
(PID.TID 0000.0001) > tRef= 28.57,27.57,24.50,23.00,21.50,
(PID.TID 0000.0001) > sRef= 5*35.,
(PID.TID 0000.0001) >
(PID.TID 0000.0001) > viscAz=1.E-1,
(PID.TID 0000.0001) > viscAh=1.E0,
(PID.TID 0000.0001) >#no_slip_sides=.FALSE.,
(PID.TID 0000.0001) > no_slip_sides=.FALSE.,
(PID.TID 0000.0001) > no_slip_bottom=.FALSE.,
(PID.TID 0000.0001) > viscA4=0.E12,
(PID.TID 0000.0001) > diffKhT=1.E0,
(PID.TID 0000.0001) > diffKzT=1.E-5,
(PID.TID 0000.0001) > diffKhS=1.E3,
(PID.TID 0000.0001) > diffKzS=1.E-5,
(PID.TID 0000.0001) > f0=1.e-4,
(PID.TID 0000.0001) >
(PID.TID 0000.0001) > beta=2.E-11,
(PID.TID 0000.0001) > gravity=9.81,
(PID.TID 0000.0001) > rigidLid=.FALSE.,
(PID.TID 0000.0001) > implicitFreeSurface=.TRUE.,
(PID.TID 0000.0001) > exactConserv=.TRUE.,
(PID.TID 0000.0001) > eosType='LINEAR',
(PID.TID 0000.0001) > hFacMin=0.2,
(PID.TID 0000.0001) > nonHydrostatic=.TRUE.,
(PID.TID 0000.0001) > readBinaryPrec=64,
(PID.TID 0000.0001) > writeBinaryPrec=64,
(PID.TID 0000.0001) > globalFiles=.TRUE.,
(PID.TID 0000.0001) > /
(PID.TID 0000.0001) >
(PID.TID 0000.0001) ># Elliptic solver parameters
(PID.TID 0000.0001) > &PARM02
(PID.TID 0000.0001) > cg2dMaxIters=1000,
(PID.TID 0000.0001) > cg2dTargetResidual=1.E-13,
(PID.TID 0000.0001) > cg3dMaxIters=50,
(PID.TID 0000.0001) > cg3dTargetResidual=1.E-13,
(PID.TID 0000.0001) > /
(PID.TID 0000.0001) >
(PID.TID 0000.0001) ># Time stepping parameters
(PID.TID 0000.0001) > &PARM03
(PID.TID 0000.0001) > niter0=0,
(PID.TID 0000.0001) > nTimeSteps=28800,
(PID.TID 0000.0001) ># startTime=0,
(PID.TID 0000.0001) ># endTime=432000,
(PID.TID 0000.0001) > deltaTmom=30,
(PID.TID 0000.0001) ># deltaTtracer=30,
(PID.TID 0000.0001) > abEps=0.02,
(PID.TID 0000.0001) ># pChkptFreq=4000000.0,
(PID.TID 0000.0001) ># chkptFreq=0.0,
(PID.TID 0000.0001) > dumpFreq=10000,
(PID.TID 0000.0001) > monitorFreq=100,
(PID.TID 0000.0001) > /
(PID.TID 0000.0001) >
(PID.TID 0000.0001) ># Gridding parameters
(PID.TID 0000.0001) > &PARM04
(PID.TID 0000.0001) > usingCartesianGrid=.FALSE.,
(PID.TID 0000.0001) > usingSphericalPolarGrid=.FALSE.,
(PID.TID 0000.0001) > usingCurvilinearGrid=.TRUE.,
(PID.TID 0000.0001) > delz=100.E2, 250.E2, 300.E2, 200.E2, 150.E2,
(PID.TID 0000.0001) > Ro_SeaLevel=1.E5,
(PID.TID 0000.0001) > rSphere=6370.E3,
(PID.TID 0000.0001) > /
(PID.TID 0000.0001) >
(PID.TID 0000.0001) ># Input datasets
(PID.TID 0000.0001) > &PARM05
(PID.TID 0000.0001) > bathyFile='topo.bin',
(PID.TID 0000.0001) ># checkIniTemp=.false.,
(PID.TID 0000.0001) > /
(PID.TID 0000.0001) >
(PID.TID 0000.0001)
(PID.TID 0000.0001) INI_PARMS ; starts to read PARM01
(PID.TID 0000.0001) INI_PARMS ; read PARM01 : OK
(PID.TID 0000.0001) INI_PARMS ; starts to read PARM02
(PID.TID 0000.0001) INI_PARMS ; read PARM02 : OK
(PID.TID 0000.0001) INI_PARMS ; starts to read PARM03
(PID.TID 0000.0001) INI_PARMS ; read PARM03 : OK
(PID.TID 0000.0001) INI_PARMS ; starts to read PARM04
(PID.TID 0000.0001) INI_PARMS ; read PARM04 : OK
(PID.TID 0000.0001) INI_PARMS ; starts to read PARM05
(PID.TID 0000.0001) INI_PARMS ; read PARM05 : OK
(PID.TID 0000.0001) INI_PARMS: finished reading file "data"
(PID.TID 0000.0001) PACKAGES_BOOT: opening data.pkg
(PID.TID 0000.0001) OPEN_COPY_DATA_FILE: opening file data.pkg
(PID.TID 0000.0001) // =======================================================
(PID.TID 0000.0001) // Parameter file "data.pkg"
(PID.TID 0000.0001) // =======================================================
(PID.TID 0000.0001) ># Packages
(PID.TID 0000.0001) > &PACKAGES
(PID.TID 0000.0001) ># useOBCS=.TRUE.,
(PID.TID 0000.0001) ># useMNC=.TRUE.,
(PID.TID 0000.0001) > /
(PID.TID 0000.0001)
(PID.TID 0000.0001) PACKAGES_BOOT: finished reading data.pkg
(PID.TID 0000.0001) SET_PARMS: done
(PID.TID 0000.0001) Enter INI_VERTICAL_GRID: setInterFDr= T ; setCenterDr= F
(PID.TID 0000.0001) MDS_READ_FIELD: opening global file: LONC.bin
(PID.TID 0000.0001) MDS_READ_FIELD: opening global file: LATC.bin
-------------- next part (list.txt) --------------
total 2884
drwxrwxr-x 3 ADRAO ADRAO 65536 Oct 13 11:27 .
drwxrwxr-x 4 ADRAO ADRAO 4096 Oct 13 11:12 ..
-rw-rw-r-- 1 ADRAO ADRAO 9363 Oct 13 11:23 curvi.txt
drwxrwsr-x 2 ADRAO ADRAO 4096 Aug 30 10:02 CVS
-rwxr-xr-x 1 ADRAO ADRAO 1228 Oct 11 12:02 data
-rwxr-xr-x 1 ADRAO ADRAO 60 Oct 11 12:01 data.pkg
-rwxr-xr-x 1 ADRAO ADRAO 138 Oct 10 11:26 depth.m
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 DXC.bin
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 DXF.bin
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 DXG.bin
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 DXV.bin
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 DYC.bin
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 DYF.bin
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 DYG.bin
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 DYU.bin
-rw-rw-r-- 1 ADRAO ADRAO 286 Sep 28 2001 eedata
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 LATC.bin
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 LATG.bin
-rw-rw-r-- 1 ADRAO ADRAO 0 Oct 13 11:27 list.txt
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 LONC.bin
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 LONG.bin
-rwxrwxr-x 1 ADRAO ADRAO 2148685 Oct 13 11:22 mitgcmuv
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 RA.bin
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 RAS.bin
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 RAW.bin
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 RAZ.bin
-rw-rw-r-- 1 ADRAO ADRAO 159 Oct 13 11:23 STDERR.0000
-rwxr-xr-x 1 ADRAO ADRAO 40000 Sep 27 11:19 topo.bin