[MITgcm-support] Global_ocean query
Amitabh Mitra
amitabhmitra1 at rediffmail.com
Sat Mar 12 03:31:38 EST 2005
Dear Sir,
i) I am trying to run the global_ocean.90x40x15 experiment
at a resolution of 1 x 1 degree.
ii) The region under consideration is 0E to 360E and
80S to 80N.
iii) The files SIZE.h and data are attached to this
mail.
The following error is displayed:
SGI_2100 21% mpirun -np 1 ./mitgcmuv
S/R EXTERNAL_FIELDS_LOAD: Reading new data 0.E+0, 0
time,SST,SSS,fu,fv,Q,E-P,i0,i1,a,b = 0.0000E+00 -6.7200E-01 3.3977E+01 -1.3062E-03 2.2001E-02 6.3990E+01 0.0000E+00 12 1 5.0000E-01 5.0000E-01
time,fu0,fu1,fu = 0.0000E+00 -2.6089E-03 -3.6054E-06 -1.3062E-03 5.0000E-01 5.0000E-01
cg2d: Sum(rhs),rhsMax = 8.32667268468867E-16 3.76753611073055E+00
time,SST,SSS,fu,fv,Q,E-P,i0,i1,a,b = 4.3200E+04 -6.4810E-01 3.3980E+01 -1.2628E-03 2.3000E-02 6.4303E+01 0.0000E+00 12 1 5.1667E-01 4.8333E-01
time,fu0,fu1,fu = 4.3200E+04 -2.6089E-03 -3.6054E-06 -1.2628E-03 5.1667E-01 4.8333E-01
cg2d: Sum(rhs),rhsMax = -1.96120897300034E-13 4.84441519521263E+00
time,SST,SSS,fu,fv,Q,E-P,i0,i1,a,b = 8.6400E+04 -6.2419E-01 3.3984E+01 -1.2194E-03 2.4000E-02 6.4616E+01 0.0000E+00 12 1 5.3333E-01 4.6667E-01
time,fu0,fu1,fu = 8.6400E+04 -2.6089E-03 -3.6054E-06 -1.2194E-03 5.3333E-01 4.6667E-01
cg2d: Sum(rhs),rhsMax = 1.52933221642115E-14 4.64235047198259E+00
time,SST,SSS,fu,fv,Q,E-P,i0,i1,a,b = 1.2960E+05 -6.0029E-01 3.3987E+01 -1.1760E-03 2.5000E-02 6.4929E+01 0.0000E+00 12 1 5.5000E-01 4.5000E-01
time,fu0,fu1,fu = 1.2960E+05 -2.6089E-03 -3.6054E-06 -1.1760E-03 5.5000E-01 4.5000E-01
cg2d: Sum(rhs),rhsMax = -1.72972747236599E-13 4.33415858523235E+00
time,SST,SSS,fu,fv,Q,E-P,i0,i1,a,b = 1.7280E+05 -5.7639E-01 3.3990E+01 -1.1326E-03 2.6000E-02 6.5241E+01 0.0000E+00 12 1 5.6667E-01 4.3333E-01
time,fu0,fu1,fu = 1.7280E+05 -2.6089E-03 -3.6054E-06 -1.1326E-03 5.6667E-01 4.3333E-01
cg2d: Sum(rhs),rhsMax = -1.63535851527286E-13 1.21686878525210E+01
time,SST,SSS,fu,fv,Q,E-P,i0,i1,a,b = 2.1600E+05 -5.5248E-01 3.3994E+01 -1.0891E-03 2.7000E-02 6.5554E+01 0.0000E+00 12 1 5.8333E-01 4.1667E-01
time,fu0,fu1,fu = 2.1600E+05 -2.6089E-03 -3.6054E-06 -1.0891E-03 5.8333E-01 4.1667E-01
cg2d: Sum(rhs),rhsMax = 2.66731081666194E-14 1.51809687099544E+02
S/R EEDIE: Only 0 threads have completed, 1 are expected for this configuration!
Possibly you have different setenv PARALLEL and nThreads?
STOP MON_SOLUTION: STOPPED DUE TO EXTREME VALUES OF SOLUTION
STOP
SGI_2100 22%
---------------------------------------------------------
The bathymetry data (from a modified Etopo2) shows depths of
3000 m to 5000 m, but the salinity and potential temperature
data (from Levitus) are defined only down to 1000 m.
Perhaps the missing values in the files lev_t1.bin and lev_s1.bin
are generating the errors. What could be the reason, and how
can I remove the error?
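One common workaround for hydrography that is shallower than the bathymetry is to extend the deepest valid value downward through the missing levels in each water column. The sketch below is only an illustration, not part of MITgcm: it assumes the files are Nr levels of Ny x Nx big-endian 4-byte reals (consistent with readBinaryPrec=32), ordered surface to bottom, and that missing points are flagged with 0.0 (adjust `missing` if your files use a different flag).

```python
import numpy as np

def fill_missing_downward(fname_in, fname_out, nx=360, ny=160, nr=17,
                          missing=0.0):
    """Extend the deepest valid value down through missing levels.

    Assumes nr levels of ny*nx big-endian float32 (readBinaryPrec=32),
    ordered surface to bottom.  Columns that are missing at every level
    (land) stay missing, since only existing values are copied down.
    """
    t = np.fromfile(fname_in, dtype=">f4").reshape(nr, ny, nx)
    for k in range(1, nr):
        gap = t[k] == missing        # undefined at this level...
        t[k][gap] = t[k - 1][gap]    # ...copy from the level above
    t.astype(">f4").tofile(fname_out)
```

Applied to lev_t1.bin and lev_s1.bin before the run, this keeps the deep stratification constant below 1000 m rather than leaving spurious zeros there.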
Thanking You for your kind support,
Amitabh Mitra
-------------- next part --------------
# ====================
# | Model parameters |
# ====================
#
# Continuous equation parameters
&PARM01
tRef = 17*20.,
sRef = 17*35.,
viscAr=1.E-3,
viscAh=5.E5,
diffKhT=0.0,
diffKrT=3.E-5,
diffKhS=0.0,
diffKrS=3.E-5,
rhonil=1035.,
gravity=9.81,
eosType = 'JMD95Z',
ivdc_kappa=100.,
implicitDiffusion=.TRUE.,
useOldFreezing=.TRUE.,
useRealFreshWaterFlux=.TRUE.,
useCDscheme=.TRUE.,
useNHMTerms=.TRUE.,
# turn on looped cells
hFacMin=.05,
hFacMindr=50.,
# set precision of data files
readBinaryPrec=32,
&
# Elliptic solver parameters
&PARM02
cg2dMaxIters=500,
cg2dTargetResidual=1.E-13,
&
# Time stepping parameters
&PARM03
nIter0 = 0,
nTimeSteps = 20,
# 100 years of integration will yield a reasonable flow field
# startTime = 0.,
# endTime = 3110400000.,
deltaTmom = 1200.0,
tauCD = 321428.,
deltaTtracer= 43200.0,
deltaTClock = 43200.0,
# if you are using a version later than checkpoint45d on the main branch
# you can uncomment the following line and increase the time step
# deltaTtracer and deltaTClock to 172800.0 as well to speed up the
# asynchronous time stepping
# deltaTfreesurf = 172800.0,
abEps = 0.1,
pChkptFreq= 311040000.,
#dumpFreq= 311040000.,
dumpFreq= 864000.,
#taveFreq= 311040000.,
taveFreq= 864000.,
#monitorFreq=31104000.,
monitorFreq=1.,
# 2 months restoring timescale for temperature
tauThetaClimRelax = 5184000.0,
# 6 months restoring timescale for salinity
tauSaltClimRelax = 15552000.0,
periodicExternalForcing=.TRUE.,
externForcingPeriod=2592000.,
externForcingCycle=31104000.,
&
# Gridding parameters
&PARM04
usingCartesianGrid=.FALSE.,
usingSphericalPolarGrid=.TRUE.,
delR= 10., 10., 20., 25., 25., 25.,
25., 50., 50., 50., 100., 100.,
100., 100., 100., 100., 100.,
phiMin=-80.,
dySpacing=1.,
dxSpacing=1.,
&
# Input datasets
&PARM05
bathyFile= 'bathymetry.bin',
hydrogThetaFile='lev_t1.bin',
hydrogSaltFile= 'lev_s1.bin',
zonalWindFile= 'trenberth_taux1.bin',
meridWindFile= 'trenberth_tauy1.bin',
thetaClimFile= 'lev_sst1.bin',
saltClimFile= 'lev_sss1.bin',
surfQFile= 'ncep_qnet.bin',
# fresh water flux is turned off, uncomment next line to turn on
# (not recommended together with surface salinity restoring)
# EmPmRFile= 'ncep_emp.bin',
&
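A quick consistency check on the inputs listed in &PARM05: with readBinaryPrec=32, each file should be an exact multiple of Nx*Ny*4 bytes (Nr slabs for hydrography, 12 for a monthly climatology with externForcingCycle of one year). A minimal sketch, assuming the Nx=360, Ny=160, Nr=17 grid from SIZE.h; the helper name is ours, not an MITgcm utility:

```python
import os

NX, NY = 360, 160   # total grid points from SIZE.h
BYTES = 4           # readBinaryPrec=32 -> 4-byte reals

def expected_records(fname, nx=NX, ny=NY, bytes_per_val=BYTES):
    """Return how many nx*ny slabs a flat binary file holds
    (e.g. 17 levels for lev_t1.bin, 12 months for lev_sst1.bin)."""
    size = os.path.getsize(fname)
    rec = nx * ny * bytes_per_val
    if size % rec != 0:
        raise ValueError(f"{fname}: {size} bytes is not a multiple of "
                         f"{rec}; grid size or precision mismatch")
    return size // rec
```

A wrong record count here (for instance, a hydrography file holding fewer than Nr slabs) is a frequent cause of garbage values entering the model.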
-------------- next part --------------
C $Header: /u/gcmpack/MITgcm/verification/global_ocean.90x40x15/code/SIZE.h,v 1.4 2003/12/10 16:25:57 adcroft Exp $
C $Name: checkpoint53 $
C
C /==========================================================\
C | SIZE.h Declare size of underlying computational grid. |
C |==========================================================|
C | The design here supports a three-dimensional model grid|
C | with indices I,J and K. The three-dimensional domain |
C | is comprised of nPx*nSx blocks of size sNx along one axis|
C | nPy*nSy blocks of size sNy along another axis and one |
C | block of size Nz along the final axis. |
C | Blocks have overlap regions of size OLx and OLy along the|
C | dimensions that are subdivided. |
C \==========================================================/
C Voodoo numbers controlling data layout.
C sNx - No. X points in sub-grid.
C sNy - No. Y points in sub-grid.
C OLx - Overlap extent in X.
C OLy - Overlap extent in Y.
C nSx - No. sub-grids in X.
C nSy - No. sub-grids in Y.
C nPx - No. of processes to use in X.
C nPy - No. of processes to use in Y.
C Nx - No. points in X for the total domain.
C Ny - No. points in Y for the total domain.
C Nr - No. points in Z for full process domain.
INTEGER sNx
INTEGER sNy
INTEGER OLx
INTEGER OLy
INTEGER nSx
INTEGER nSy
INTEGER nPx
INTEGER nPy
INTEGER Nx
INTEGER Ny
INTEGER Nr
PARAMETER (
& sNx = 180,
& sNy = 160,
& OLx = 2,
& OLy = 2,
& nSx = 2,
& nSy = 1,
& nPx = 1,
& nPy = 1,
& Nx = sNx*nSx*nPx,
& Ny = sNy*nSy*nPy,
& Nr = 17)
C MAX_OLX - Set to the maximum overlap region size of any array
C MAX_OLY that will be exchanged. Controls the sizing of exch
C routine buffers.
INTEGER MAX_OLX
INTEGER MAX_OLY
PARAMETER ( MAX_OLX = OLx,
& MAX_OLY = OLy )
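As a sanity check, the tile layout in this SIZE.h can be verified against the 1 x 1 degree domain (0E-360E, 80S-80N) described in the mail; a minimal sketch in Python, with the values copied from the header above:

```python
# Tile layout copied from the attached SIZE.h.
sNx, sNy = 180, 160
nSx, nSy = 2, 1
nPx, nPy = 1, 1

Nx = sNx * nSx * nPx   # total grid points in X
Ny = sNy * nSy * nPy   # total grid points in Y

# Domain from the mail and the data file (dxSpacing = dySpacing = 1):
assert Nx == 360   # 360 degrees of longitude
assert Ny == 160   # 80N - (-80S) = 160 degrees of latitude
```

Note that nPx*nPy = 1 matches the `mpirun -np 1` invocation in the log, so the domain decomposition itself is consistent with the run command.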