[MITgcm-support] Fwd: Inter-tile discontinuities
Pochini, Enrico
epochini at inogs.it
Thu Nov 7 10:38:05 EST 2019
Yes Martin, I'm now using the LSR solver; I don't really know what caused
the issue. SEAICE_OLx/y=0 seems to be the probable culprit, since with the
default OLx/y it is SEAICE_OLx/y = OLx/y - 2 = 1. I also had to switch to
#undef SEAICE_ALLOW_FREEDRIFT, which was causing non-convergence, and undefine
other options like SEAICE_ALLOW_MOM_ADVECTION.
I didn't see the sea-ice fields because of a forgotten SEAICEwriteState, but
I'm no longer observing the tile borders in the eta, u, v fields, so I assume
it is solved.
Attached are the files that I'm using.
Enrico
On Thu, 7 Nov 2019 at 16:08 Martin Losch <Martin.Losch at awi.de>
wrote:
> Hi Enrico,
>
> may I ask what you changed to make it run? Did you use the LSR solver
> instead of the Krylov solver? Or did you just remove SEAICE_OLx/y from your
> data.seaice?
>
> Martin
>
> > On 7. Nov 2019, at 16:04, Pochini, Enrico <epochini at inogs.it> wrote:
> >
> > Dear Martin, dear Senja
> >
> > thank you very much for your thorough explanation and for the
> suggestions on the parameters and compilation options.
> > The tiles are now smoothly connected!
> > I still have issues though, i.e. the simulation starts generating NaNs
> and I don't understand whether it is related to sea ice or not.
> I'll try to figure it out.
> > Thank you,
> >
> > Enrico
> >
> > On Tue, 5 Nov 2019 at 11:59 Martin Losch <
> Martin.Losch at awi.de> wrote:
> > Hi Enrico,
> >
> > I think you are the first to use the "Krylov solver"
> (SEAICEuseKrylov=.TRUE.,) with this model in a realistic configuration
> (other than tests that I did during the implementation). Note that we
> have very little experience with it. In fact, my experience is that it
> actually does very well in terms of parallelisation (in contrast to what
> you are seeing), but that using it slows down the simulation. If you want
> to continue with this solver, then you need to follow Senja's advice and
> comment out
> > SEAICE_Olx = 0,
> > SEAICE_Oly = 0,
> > in your data.seaice (in fact, the entire "old defaults" section should
> not be in a new run). That should fix the problem. Here's an attempt at an
> explanation: the Krylov solver uses the LSR solver as a preconditioner (see
> Lemieux et al. 2009?, Lemieux et al. 2010, for an explanation), but we never
> check convergence and let it run for only a few steps (you'll find the
> default SEAICElinearIterMax = 10 in STDOUT.0000), and LSR_ERROR is never
> used (that's why changing it doesn't help). This makes the preconditioner
> solution very stripy, and this probably carries over to the actual iteration
> that is done within SEAICE_KRYLOV and SEAICE_FGMRES. Using the default
> values for SEAICE_OLx/y will make this problem smaller, so that the Krylov
> solver has a good chance of converging to the correct field. The nonlinear
> iteration will never converge with just SEAICEnonLinIterMax = 10; see
> STDOUT.0000, e.g. like this:
> >
> > >> grep %KRYLOV_MON STDOUT.0000
> > (PID.TID 0000.0001) %KRYLOV_MON: time step = 8
> > (PID.TID 0000.0001) %KRYLOV_MON: Nb. of time steps = 8
> > (PID.TID 0000.0001) %KRYLOV_MON: Nb. of Picard steps = 80
> > (PID.TID 0000.0001) %KRYLOV_MON: Nb. of Krylov steps = 80
> > (PID.TID 0000.0001) %KRYLOV_MON: Nb. of Picard failures = 8
> > (PID.TID 0000.0001) %KRYLOV_MON: Nb. of Krylov failures = 0
> >
> > There are as many Picard failures as steps (the Krylov solver for the
> linear subproblem converges). In the end this does not matter, because we
> normally don't want to spend the time to solve the non-linear problem.
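These %KRYLOV_MON counters can be tallied programmatically for a quick sanity check; below is a hypothetical helper sketch, assuming only the monitor line format shown above (the function name and regex are my own, not part of MITgcm):

```python
import re

def tally_krylov_mon(lines):
    """Collect the last value of each %KRYLOV_MON counter from STDOUT lines."""
    counters = {}
    pattern = re.compile(r"%KRYLOV_MON:\s*(.+?)\s*=\s*(\d+)")
    for line in lines:
        m = pattern.search(line)
        if m:
            # later occurrences overwrite earlier ones, keeping the final count
            counters[m.group(1)] = int(m.group(2))
    return counters

sample = [
    "(PID.TID 0000.0001) %KRYLOV_MON: Nb. of Picard steps   =  80",
    "(PID.TID 0000.0001) %KRYLOV_MON: Nb. of Picard failures =   8",
    "(PID.TID 0000.0001) %KRYLOV_MON: Nb. of Krylov failures =   0",
]
stats = tally_krylov_mon(sample)
print(stats["Nb. of Picard failures"])  # 8
```

With the full STDOUT.0000 read line by line, the same function reports whether the Picard-failure count keeps pace with the number of time steps, as in the log excerpt above.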
> > I would use these CPP flags:
> > # undef SEAICE_PRECOND_EXTRA_EXCHANGE
> > # define SEAICE_DELTA_SMOOTHREG
> > # define SEAICE_LSR_ZEBRA
> > with the Krylov solver (the last one is a good idea in many cases).
> >
> > I would be very interested in what the solution looks like when you
> don't set SEAICE_Olx/y to zero (use the default = OLx/OLy - 2). If the
> "inter-tile discontinuities" remain, there's a problem with the solver,
> which I would need to debug.
> >
> > Alternatively you can use the default LSR solver (as most people do),
> with these parameters
> >
> > SEAICEuseKrylov=.FALSE., ! default
> > #- seaice dynamics params:
> > LSR_ERROR = 1.E-5,
> > SEAICEnonLinIterMax=10,
> > # SEAICE_Olx = 0,
> > # SEAICE_Oly = 0,
> >
> > Both the Krylov and the LSR solver are really a Picard (or fixed-point)
> iteration, where the linearized subproblem is solved either with a Krylov
> method (FGMRES) or with LSR (line successive over-relaxation). The
> disadvantage of the LSR solver is that it is not very well parallelized, and
> I recommend at least LSR_ERROR=1e-5 or smaller (I am working on new defaults
> for this solver). The advantage is that it is simple and relatively fast.
> With this solver it also makes sense to use non-zero overlaps (although
> this is not essential). If sea ice is not the focus of your science
> question, then I recommend using this default solver (but not with the
> default values of LSR_ERROR).
> >
> > Martin
> >
> > PS: please consider removing the entire "old defaults" section in your
> data.seaice, unless you need to keep these parameters. Here are further
> comments on data.seaice (because you used an example from one of the tests,
> you carried over many parameters that were put there just to test some
> parts of the code); they are unrelated to your problem.
> >
> > #- seaice dynamics params
> > LSR_ERROR = 1.E-12, ! way too small, just for testing
> > # LSR_mixIniGuess=1 : compute free-drift residual; =2,4 mix into initial
> guess
> > LSR_mixIniGuess = 1, ! doesn’t do anything but compute some
> free-drift residual, can be removed
> > # add small diffKh to test diffusion with multi-dim advect.
> > SEAICEdiffKhArea = 20., ! diffusion of seaice does not make sense
> with a stable advection scheme, so use the default of 0.
> > #- seaice thermodyn params:
> > SEAICE_multDim = 1, ! here you can use 7 (see documentation)
> > #- constant seawater freezing point:
> > SEAICE_tempFrz0 = -1.96,
> > SEAICE_dTempFrz_dS = 0., ! why do you want a constant freezing point
> of sea water??
> > #- to reproduce old results with the formerly #defined
> SEAICE_SOLVE4TEMP_LEGACY code
> > useMaykutSatVapPoly = .TRUE., ! I would use the default; this is
> just for testing the old polynomial code
> > postSolvTempIter = 0,
> > ! these albedos are very high and the result of a tuning experiment with
> specific forcing (An Nguyen et al. 2011); I would reconsider them in the
> context of your forcing
> > SEAICE_dryIceAlb = 0.8756,
> > SEAICE_wetIceAlb = 0.7856,
> > SEAICE_drySnowAlb = 0.9656,
> > SEAICE_wetSnowAlb = 0.8256,
> > ! default is 27.5e3, so not really very different
> > SEAICE_strength = 2.6780e+04,
> >
> > Also, there's a lot of diffusion of tracer (1e3) and momentum (1e4) in
> your "data" for a grid spacing of 5 km. You can get away without any explicit
> diffusion for tracers with a positive (flux-limited) advection scheme for
> tracers, e.g.
> > tempAdvScheme = 33,
> > saltAdvScheme = 33,
> > (or 77, 7)
> > use these together with
> > staggerTimeStep = .TRUE.,
> >
> > You can probably reduce your viscosity dramatically. I normally use the
> nondimensional form viscAhGrid, which needs to be smaller than 1, but order
> 0.01 is often good (that's about viscAh = 416 m^2/s), or even the
> bi-harmonic version of this (viscA4Grid). There is also flow-dependent
> viscosity (e.g. Leith); see the documentation.
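The numbers above can be cross-checked with a short sketch, assuming the usual grid scaling viscAh = viscAhGrid * dx^2 / deltaT and a hypothetical time step of 600 s (the time step is my assumption; it is not stated in the thread, but it reproduces the quoted 416 m^2/s for a 5 km grid):

```python
def viscAh_from_grid(viscAhGrid, dx, deltaT):
    """Dimensional harmonic viscosity implied by the nondimensional grid
    viscosity, assuming viscAh = viscAhGrid * dx**2 / deltaT (m^2/s)."""
    return viscAhGrid * dx**2 / deltaT

# 5 km grid spacing, hypothetical 600 s time step, viscAhGrid = 0.01
print(viscAh_from_grid(0.01, 5.0e3, 600.0))  # ~416.7 m^2/s
```

This makes viscAhGrid convenient: the dimensional viscosity automatically shrinks with the grid spacing, so one nondimensional value works across resolutions.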
> >
> > > On 4. Nov 2019, at 15:00, Pochini, Enrico <epochini at inogs.it> wrote:
> > >
> > > Hi Martin,
> > >
> > > thank you for your answer. I used the data.seaice and SEAICE_OPTIONS.h
> from verification/lab_sea. I've tried to set the parameters as you said and
> I also tried SEAICEuseKrylov=.TRUE., which automatically sets
> LSR_ERROR=1.E-12 and SEAICEnonLinIterMax=10. However, I still have the issue.
> > > I attach a link to the code and namelists that I used. In STDOUT it
> seems that the solver for the sea-ice dynamics does not converge.
> > > Eventually the model crashes...
> > >
> > > Enrico
> > >
> > > results72.nc
> > >
> > > results_seaice72.nc
> > >
> > > sysdiag_48.nc
> > >
> > > SEAICE_OPTIONS.h
> > >
> > > SEAICE_SIZE.h
> > >
> > > data.seaice
> > >
> > > data
> > >
> > > STDOUT.0000
> > >
> > > job_err.err
> > >
> > > On Tue, 29 Oct 2019 at 17:17 Martin Losch <
> Martin.Losch at awi.de> wrote:
> > > Hi Enrico,
> > >
> > > this is most likely linked to the dynamics solver. For historical
> reasons, the default parameters are not very fortunate. Without knowing
> your “data.seaice” and “SEAICE_OPTIONS.h”, I am guessing that you can fix
> this problem by reducing the linear tolerance (by setting LSR_ERROR to
> values of 1.e-5 or smaller).
> > > I also suggest increasing the number of nonlinear iterations to 10
> (SEAICEnonLinIterMax, default is 2). These unfortunate defaults are part of
> an issue (<https://github.com/MITgcm/MITgcm/issues/171>) on GitHub, but I
> never got around to changing them to something more sensible (and nobody
> seems to care).
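In namelist form, the two suggested changes amount to something like the following data.seaice excerpt (a sketch only; SEAICE_PARM01 is the standard sea-ice namelist group, but check the placement against your own file):

```
 &SEAICE_PARM01
# tighter linear tolerance and more nonlinear (Picard) iterations
  LSR_ERROR           = 1.E-5,
  SEAICEnonLinIterMax = 10,
 &
```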
> > >
> > > Alternatively you could use the EVP code, but then make sure that you
> use at least mEVP (again, the defaults are not great), see the
> documentation (<
> https://mitgcm.readthedocs.io/en/latest/phys_pkgs/seaice.html#more-stable-variants-of-elastic-viscous-plastic-dynamics-evp-mevp-and-aevp
> >)
> > >
> > > Martin
> > >
> > > > On 29. Oct 2019, at 14:42, Pochini, Enrico <epochini at inogs.it>
> wrote:
> > > >
> > > > Dear all,
> > > >
> > > > (I've attached the file as a gdrive link.)
> > > > I'm having issues with the SEAICE package, I guess. After I activated
> it I started to observe discontinuities at tile edges where marine ice
> is present. The discontinuities grow in time. I see discontinuities in
> surface fields and in the shallower levels of 3D fields.
> > > > What could cause this issue?
> > > >
> > > > Thanks,
> > > >
> > > > Enrico
> > > > results24.nc
> > > > _______________________________________________
> > > > MITgcm-support mailing list
> > > > MITgcm-support at mitgcm.org
> > > > http://mailman.mitgcm.org/mailman/listinfo/mitgcm-support
> > >
> >
>