[MITgcm-devel] seaice dynamics in a 2-D set-up

Jean-Michel Campin jmc at ocean.mit.edu
Tue Jun 16 13:01:10 EDT 2015


Hi Martin,

I did a few more tests with this 2-D (y-z) set-up, and found something 
curious when I change delX: I use monthly-mean forcing, run for 2 years
with delX=delY, then do a short run (10 iterations, deltaT=20 min) with
delX=300*delY and compare it with the same run with delX=delY.

Normally, changing delX should not affect the solution. However, the cg2d solver
is affected and requires more iterations with small delX than with large delX,
although it converges to the same solution (I can also modify ini_cg2d.F to remove
this dependency on delX and get exactly the same cg2d convergence).

Regarding seaice dynamics, I was running with the default LSR_ERROR (=1.e-4),
the default NPSEUDOTIMESTEPS (=2) and SEAICE_EPS = 1.e-9.
When I increase delX (by a factor of 300), more iterations of the linear solver are
needed initially, but after ~10 model time-steps it goes back to a behaviour similar
to the delX=delY case; the solution (i.e., uIce, vIce), however, is different
from the delX=delY case.
I have attached a plot (uIce, vIce as a function of grid-point index j) after 5 iterations:
s04b is with delX=delY and s05b is with delX=300*delY.
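
For reference, the settings above correspond to a data.seaice fragment roughly like this
sketch (SEAICE_PARM01 layout assumed; LSR_ERROR and NPSEUDOTIMESTEPS are the defaults,
only SEAICE_EPS is changed):

# seaice dynamics, LSR run (sketch)
 &SEAICE_PARM01
  LSR_ERROR        = 1.E-4,
  NPSEUDOTIMESTEPS = 2,
  SEAICE_EPS       = 1.E-9,
 &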

If I increase NPSEUDOTIMESTEPS to 10, the differences are similar but
develop more rapidly: after 1 model time-step they reach the same magnitude
as the NPSEUDOTIMESTEPS=2 case did after 10 time-steps.

I also tried JFNK (SOLV_MAX_ITERS=10, SEAICEnewtonIterMax=200, SEAICEkrylovIterMax=50),
and the two solutions with different delX are now very similar (diff < 1.e-8 or so).
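
Again as a sketch, the JFNK run corresponds to something like the following in data.seaice
(SEAICEuseJFNK is assumed here to be the flag that selects the JFNK solver):

# seaice dynamics, JFNK run (sketch)
 &SEAICE_PARM01
  SEAICEuseJFNK       = .TRUE.,
  SOLV_MAX_ITERS      = 10,
  SEAICEnewtonIterMax = 200,
  SEAICEkrylovIterMax = 50,
 &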

So it seems that this 2-D (Nx=1) set-up has a spurious dependence on delX,
but JFNK does fix it!

Cheers,
Jean-Michel

On Thu, Mar 05, 2015 at 09:44:02AM +0100, Martin Losch wrote:
> Hi Jean-Michel,
> 
> the default for the LSR convergence criterion is this
> LSR_ERROR  = 0.0001    _d 0
> and that's what I normally use; I think you have now set it to 2e-9? That's not what I meant: I suggested setting the parameter SEAICE_EPS = 2e-9, which will make the problem less stiff.
> 
> In general, the linear solver converges very slowly, too, but that's not so important (although I have tried, unsuccessfully, to improve it). What is more important is the convergence of the nonlinear solver (the actual residuals). If you only have a limited number of N iterations to spend, it makes more sense to use a "large" LSR_ERROR (like the default 1e-4), so that only very few iterations are spent in the linear solver, and to make more nonlinear iterations instead (NPSEUDOTIMESTEPS=20 or more).
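> 
> As a data.seaice sketch of that trade-off (the exact numbers are of course up to you):
> 
>       LSR_ERROR        = 1.E-4,
>       NPSEUDOTIMESTEPS = 20,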
> 
> There is more that you can try:
>  SEAICEuseStrImpCpl = .TRUE. (Hutchings et al 2005)
>  
> if your OLx/y > 2, you can try SEAICE_OLx/y = OLx/y - 2 to make the restricted additive Schwarz method that we use smoother (see the namelist sketch below).
> 
> This usually does not help very much, but it's worth a try.
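> 
> In data.seaice these two suggestions would look roughly like this (the SEAICE_OLx/y values assume OLx = OLy = 4 in SIZE.h):
> 
>       SEAICEuseStrImpCpl = .TRUE.,
>       SEAICE_OLx         = 2,
>       SEAICE_OLy         = 2,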
> 
> 
> JFNK: The linear solver for JFNK is different (FGMRES), and LSR is only the (poor but best available) preconditioner. I suggest this for practical purposes:
> 
>       SEAICEnewtonIterMax = 200,
>       SEAICEkrylovIterMax = 50,
>       JFNKgamma_nonlin    = 1. _d -05    # default
>       JFNKgamma_lin_min   = 0.10 _d 0   # default
>       JFNKgamma_lin_max   = 0.99 _d 0   # default
>       JFNKres_tFac        = 0.5
>       SEAICE_JFNKepsilon  = 1. _d -06 # default
>       SEAICE_JFNKalpha    = 1.5, # a bit more aggressive than the default
> 
> Martin
> 
> > On 04 Mar 2015, at 21:23, Jean-Michel Campin <jmc at ocean.mit.edu> wrote:
> > 
> > Hi Martin,
> > 
> > Thanks for the suggestions: I increased the tolerance (from 1e-12 before to
> > 2e-9 as you suggested) and it reduces the number of "non converging" messages
> > that I am getting; it also runs a little bit faster.
> > 
> > Now, I am still not sure that the linear solver converges well enough.
> > In fact, in this latest run, the report that I am getting from SEAICE_LSR
> > seems to indicate no real convergence. For instance:
> > SEAICE_LSR: Residual Initial ipass,Uice,Vice=    1  7.74384419E-03  1.27959541E-02
> > SEAICE_LSR (ipass=   1) iters,dU,Resid=    98  1.84258913E-09  7.73824698E-03
> > SEAICE_LSR (ipass=   1) iters,dV,Resid=   448  1.99455537E-09  1.27569603E-02
> > SEAICE_LSR: Residual Initial ipass,Uice,Vice=    2  7.73904339E-03  1.27591200E-02
> > SEAICE_LSR (ipass=   2) iters,dU,Resid=    90  1.91089900E-09  7.73729344E-03
> > SEAICE_LSR (ipass=   2) iters,dV,Resid=   114  1.99557763E-09  1.27661753E-02
> > the final residual is in fact very close to the initial residual
> > (sometimes smaller, sometimes larger).
> > 
> > Regarding the JFNK option: I am using the default number of non-linear
> > solver iterations (i.e., 2), and I imagined that using JFNK would make sense with
> > more non-linear iterations (which would increase the cost) but would not fix the
> > linear solver convergence.
> > 
> > And finally, regarding Nx=1: since some parts of the solver (the implicit
> > tridiagonal part only?) are not tile independent, I am not sure that
> > with Nx=1 I really zero out all derivatives in the X direction.
> > I tried SEAICEuseMultiTileSolver=T (but I don't trust this code too much
> > because I know who wrote it) and it did not help.
> > 
> > Cheers,
> > Jean-Michel
> > 
> > On Wed, Mar 04, 2015 at 06:07:27PM +0100, Martin Losch wrote:
> >> Hi Jean-Michel,
> >> 
> >> do you think that Ny=1 is the problem? I don't think so, because the LSR solver always converges only very slowly. I see the same problem when I start from a horizontally homogeneous ice distribution. It takes a while before the ice starts moving, i.e. as long as uice=vice=0 the solver does not do anything and does not cost anything, but once the ice starts to move, LSR picks up, does not converge, and uses a lot of time.
> >> 
> >> If you want a converging (but still expensive) solver, use JFNK (for a start, the parameters in offline_exf_seaice.dyn_jfnk should be OK; also have a look at SEAICE_OPTIONS.h!)
> >> 
> >> Also, convergence is always improved by a larger seaice_eps (the default is 1e-11, but 2e-9 is used very often) and by a smoother/better regularization (not implemented: deltaCreg = deltaC + seaice_eps).
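> >> 
> >> (As a sketch, in data.seaice that would be e.g.
> >>      SEAICE_EPS = 2.E-9,
> >> and the not-yet-implemented regularization would be a one-liner of the form
> >>      deltaCreg(i,j) = deltaC(i,j) + SEAICE_EPS
> >> in the viscosity computation; variable names as above, placement and indexing hypothetical.)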
> >> 
> >> Martin
> >>> On 04 Mar 2015, at 17:55, Jean-Michel Campin <jmc at ocean.mit.edu> wrote:
> >>> 
> >>> Hi Martin and others,
> >>> 
> >>> I am trying to run a simple oceanic 2-D (Y-Z) set-up with seaice dynamics.
> >>> It starts without seaice, and before seaice forms the LSR solver
> >>> converges (not just doing 1 iteration, but within 10-12 iterations).
> >>> 
> >>> But as soon as there is seaice, it stops converging and
> >>> more than half of the CPU time is spent in SEAICE_DYNSOLVER
> >>> (despite the fact that I have 50 levels + SOM advection scheme).
> >>> 
> >>> I can reduce the maximum number of LSR iterations (SOLV_MAX_ITERS);
> >>> it is faster but still not great.
> >>> I tried separately SEAICEuseMultiTileSolver=T and SEAICE_OLx,y=1,
> >>> but the improvement is very marginal.
> >>> 
> >>> I am tempted to reduce SOLV_MAX_ITERS even more, but I start to worry
> >>> about what the ice velocity will be.
> >>> Is there anything I could try?
> >>> And should we think of some code modification for the specific case
> >>> where Nx=1 or Ny=1?
> >>> 
> >>> Cheers,
> >>> Jean-Michel
> >>> 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: ice_uv_45b.it5.jpg
Type: image/jpeg
Size: 239664 bytes
Desc: not available
URL: <http://mitgcm.org/pipermail/mitgcm-devel/attachments/20150616/5e78739c/attachment-0001.jpg>

