[MITgcm-support] NaN output data

Fancer Lancer fancer.lancer at gmail.com
Sun Dec 8 22:11:23 EST 2013


Dear Yulinghui,

Whatever you do, you should always satisfy at least the necessary
condition for convergence of the numerical scheme, the CFL condition
(Courant-Friedrichs-Lewy condition), which states

S_cfl > dt * max(abs(U)) / dx

where S_cfl is the CFL coefficient of the corresponding numerical scheme.
The condition must be satisfied for each spatial coordinate (dx, dy, dz)
and the corresponding velocity component (U, V, W). If you see that your
model is blowing up, the first thing you should check is the CFL
condition. It can be found in the appropriate lines of the monitor output:
(PID.TID 0000.0001) %MON advcfl_uvel_max              =  3.5977280824480E-04
(PID.TID 0000.0001) %MON advcfl_vvel_max              =  0.0000000000000E+00
(PID.TID 0000.0001) %MON advcfl_wvel_max              =  1.1230328934126E-02
(PID.TID 0000.0001) %MON advcfl_W_hf_max              =  5.5028568395335E-03
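If you have a long run, scanning the monitor output by hand is tedious. A minimal sketch of pulling the largest advective CFL values out of a standard-output file (this is not part of MITgcm; the function name and sample lines are illustrative, and the regex assumes the `%MON advcfl_*` line format shown above):

```python
import re

# Matches monitor lines like:
# (PID.TID 0000.0001) %MON advcfl_uvel_max              =  3.5977280824480E-04
pattern = re.compile(r"%MON\s+(advcfl_\w+)\s*=\s*([0-9.E+-]+)")

def max_advcfl(lines):
    """Return the largest value seen for each advcfl_* monitor field."""
    maxima = {}
    for line in lines:
        m = pattern.search(line)
        if m:
            name, value = m.group(1), float(m.group(2))
            maxima[name] = max(maxima.get(name, 0.0), value)
    return maxima

sample = [
    "(PID.TID 0000.0001) %MON advcfl_uvel_max              =  3.5977280824480E-04",
    "(PID.TID 0000.0001) %MON advcfl_wvel_max              =  1.1230328934126E-02",
]
print(max_advcfl(sample))
```

In practice you would pass in the lines of your STDOUT file (e.g. `open("STDOUT.0000")`) and check whether any maximum approaches your chosen CFL limit.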

For most of my cases it is enough to keep the CFL number below 0.05. This
lets the model converge and gives good accuracy.

Following the CFL formula, you can improve the model's convergence in one
of the following ways:
1) decrease the time-step dt,
2) increase the corresponding spatial step dx, dy, or dz,
3) decrease the corresponding velocity U, V, or W.
If you don't want to decrease the time-step or increase the spatial step,
you have to reduce the interior velocity somehow. Since you run the setup
with tides, you could decrease the tidal velocity.
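To see how each remedy scales the Courant number, here is a standalone sketch of the formula above (the numerical values are purely illustrative, not taken from your setup):

```python
def courant(dt, u_max, dx):
    """Advective Courant number dt * |U|_max / dx for one coordinate."""
    return dt * abs(u_max) / dx

# Illustrative values: dt = 10 s, tidal velocity ~0.1 m/s, dx = 200 m
dt, u_max, dx = 10.0, 0.1, 200.0
c = courant(dt, u_max, dx)
print(c)            # 0.005
print(c < 0.05)     # True: within the empirical 0.05 limit

# Each remedy scales the Courant number accordingly:
print(courant(dt / 2, u_max, dx))   # 1) smaller time-step: halved
print(courant(dt, u_max, dx * 5))   # 2) larger spatial step: divided by 5
print(courant(dt, u_max / 2, dx))   # 3) smaller velocity: halved
```

Remember to repeat the check for every coordinate (dx, dy, dz with U, V, W); in stratified internal-wave setups the vertical direction is often the binding one, as the advcfl_wvel_max line in the monitor suggests.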

Sincerely,
-Sergey


On Mon, Dec 9, 2013 at 6:40 AM, yulinghui <sysuylh at 163.com> wrote:

> Dear all:
>        I apply OBCS (OBWU=U0*sin(wt)) to generate internal waves. If my
> computational domain extends to 3000 km with dx(min)=1000 m, the model
> keeps running without NaN. But if my domain is 1500 km with
> dx(min)=200 m, the output contains NaN after about 3 tidal cycles. I
> want to use high resolution to study internal solitary waves; what can I
> do to improve this, other than reducing the time step (currently 10 s)?
> Best regards
>
>
>
> input file data:
> # ====================
> # | Model parameters |
> # ====================
> #
> # Continuous equation parameters
>  &PARM01
>  tRefFile  = 'MD.tRef',
>  sRef= 200*35.,
>  viscAz=1.E-3,
>  viscAh=1.E-2,
>  no_slip_sides=.FALSE.,
>  no_slip_bottom=.FALSE.,
>  diffKhT=1.E-2
>  diffKzT=1.E-3,
>  f0=1.07E-4,
>  beta=0.E-11,
>  eosType='LINEAR',
>  tAlpha=2.E-4,
>  sBeta =0.E-4,
>  gravity=9.81,
>  implicitFreeSurface=.TRUE.,
>  exactConserv=.TRUE.
>  nonHydrostatic=.TRUE.,
>  hFacMin=0.2,
>  implicSurfPress=0.5,
>  implicDiv2DFlow=0.5,
>  nonlinFreeSurf=0,
>  hFacInf=0.2,
>  hFacSup=1.8,
>  saltStepping=.FALSE.,
>  staggerTimeStep=.TRUE.
>  tempAdvScheme=77,
>  tempVertAdvScheme=77,
> #- not safe to use globalFiles in multi-processors runs
> #globalFiles=.TRUE.,
>  readBinaryPrec=64,
>  writeBinaryPrec=64,
>  writeStatePrec=64,
>  &
>
> # Elliptic solver parameters
>  &PARM02
>  cg2dMaxIters=1000,
>  cg2dTargetResidual=1.E-13,
>  cg3dMaxIters=400,
>  cg3dTargetResidual=1.E-13,
>  &
>
> # Time stepping parameters
>  &PARM03
>  nIter0=0,
>  nTimeSteps=44600,
>  deltaT=10,
>  abEps=0.1,
>  pChkptFreq=446000.,
>  chkptFreq=0.,
>  dumpFreq=11150
>  monitorFreq=2500.,
>  monitorSelect=200,
>  &
>
> # Gridding parameters
>  &PARM04
>  usingCartesianGrid=.TRUE.,
>  delXfile='delXvar',
>  delY=1.E3,
>  delRfile='delZvar',
>  &
>
> # Input datasets
>  &PARM05
>  hydrogThetaFile='T.init',
>  bathyFile='topog',
>  &
>
> input data data.obcs
> # Open-boundaries
>  &OBCS_PARM01
>  OB_Ieast=-1,
>  OB_Iwest=1,
>  useOrlanskiEast=.TRUE.,
> # useOBCSbalance=.TRUE.,
> # OBCS_balanceFacE= 1.,
> # OBCS_monitorFreq=4000.,
>  &
>
> # Orlanski parameters
>  &OBCS_PARM02
>  Cmax=0.45,
>  cVelTimeScale=50.,
>  &
>
>
>
>
> _______________________________________________
> MITgcm-support mailing list
> MITgcm-support at mitgcm.org
> http://mitgcm.org/mailman/listinfo/mitgcm-support
>
>

