[MITgcm-devel] seaice code beyond checkpoint61j blows up for ECCO-GODAE

Martin Losch Martin.Losch at awi.de
Thu Apr 30 03:33:11 EDT 2009


Hi Dimitris,

that's bad news, and it is really hard to debug, because apparently  
the solver technique (LSOR) is very sensitive to the properties of the  
matrix it is trying to invert.
One way to make sure that the matrix is always diagonally dominant is  
to set SEAICE_zetaMin to a value larger than zero (such as 4e8), but  
Jinlun keeps reminding us that that should not be necessary and is  
even wrong.
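
For reference, that workaround is just a data.seaice entry along these  
lines (SEAICE_PARM01 is the usual namelist group; the value is only an  
example, not a recommendation):

  &SEAICE_PARM01
   SEAICE_zetaMin = 4.E8,
  &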

Can you figure out where in the domain it blows up? Is it, as usual,  
near or at a boundary where there are also large gradients in ice  
volume/concentration/strength?

The only thing I can think of is boundary conditions. Do you use no  
slip or free slip for cs510 and the Arctic configuration? Try the other one.
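
In data.seaice that is only the SEAICE_no_slip flag, e.g. (a sketch;  
set it to the opposite of whatever your current run uses, .FALSE.  
being lateral free slip):

  &SEAICE_PARM01
   SEAICE_no_slip = .TRUE.,
  &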

In the meantime I'll revisit the code and see if I can find anything.  
Maybe it would be a good idea if someone other than myself also had a  
look and compared it to what I have written in the appendix of  
ceaice_part1.tex.

Martin


On Apr 30, 2009, at 12:23 AM, Dimitris Menemenlis wrote:

> Martin,
>
> The Arctic CS510 configuration crashes after a few days of  
> integration for:
> new code with CS510 defaults
> and metric terms off
> and clip_velocities true or false
> and use_saltplume true or false
>
> It runs OK with OLD_AND_BAD_DISCRETIZATION
> the test that we are running is with
> use_saltplume=TRUE
> clipvelocities=.FALSE.
>
> Curious why Hong's global CS510 worked.
> The main difference between the setups is the open boundary conditions
> (only oceanic here since there is no sea ice at the boundaries)
> and the fact that the global integration uses real freshwater forcing
> and a nonlinear free surface.
>
> When we run the crashing new configuration with debugLevel=3,
> it stops at the second time step with the following error message:
> ABNORMAL END: S/R SEAICE_LSR did not converge (uIce)
>
> For good measure we are doing all of the above integrations with
> deltaT=120, even though the OLD_AND_SUPERBAD code runs stably with
> deltaT=1200.
>
> what else can we try?
>
> An and Dimitris
>
> On Apr 29, 2009, at 11:51 AM, Martin Losch wrote:
>
>> Hi all,
>> can I summarize, what has happened so far in terms of new sea ice
>> code? I am a little confused as to what works and what not. Correct  
>> me
>> if I am wrong:
>>
>> A. Martin's experiments on a lat/lon grid:
>> 1. 0.25deg rotated regional grid of the Arctic Ocean with CORE forcing
>> (1958-2004) works with the new code for
>> SEAICE_clipVelocities = .false.,
>> SEAICE_zetaMin = 0.,
>> SEAICE_advSnow=.true.
>> both SEAICE_no_slip=.true. and .false.
>> different types of open boundary conditions for sea ice and varying
>> oceanic conditions
>> 2. 2deg global ("isotropic") grid works
>> 3. 1deg and 0.5deg regional ("isotropic") grids of the Weddell Sea
>> work for at least 10 years (Olaf Klatt's experiment)
>>
>> B. Patrick's ECCO-GODAE 1deg global runs (are they all iteration 75?)
>> 1. blows up after transition from old to new seaice code in  
>> iteration 75
>> 2. runs (iteration 75?) with SEAICE_OLD_AND_BAD_DISCRETIZATION  
>> defined
>> and old defaults (e.g. SEAICEadvSnow = .FALSE., SEAICE_clipVelocities
>> = .TRUE.; the rest shouldn't matter)
>> 3. blows up with SEAICE_OLD_AND_BAD_DISCRETIZATION defined and
>>     SEAICEadvSnow = .TRUE.,
>>     SEAICEadvSalt = .TRUE.,
>>     SEAICEadvAge  = .TRUE.,
>>     SEAICE_clipVelocities = .TRUE.,
>>
>> C. Hong's CS510 experiments
>> 1. a pure forward run works with SEAICE_OLD_AND_BAD_DISCRETIZATION
>> undefined and with
>>    SEAICEadvSnow      = .TRUE.,
>>    SEAICEadvSalt      = .TRUE.,
>>    SEAICEuseFlooding  = .TRUE.,
>> 2. adjoint run breaks with new code, but works with old code (pre-
>> checkpoint61j)
>> 3. any news?
>>
>> So in summary, forward simulations seem to work, except for Patrick's
>> ECCO-GODAE (which is probably the most important one). What's
>> different for that configuration? The only thing I can think of is
>> that Patrick probably does not scale dy with cos(lat), so that there
>> are elongated grid boxes near the poles. Could that be a problem
>> numerically for the new code? I find it hard to believe, but it is
>> true that my lat/lon runs are all on nearly "quadratic" grid cells,
>> and so are the CS-grid cells.
>>
>> How can we continue with this? How can I help?
>>
>> Martin
>>
>>
>>
>>
>> _______________________________________________
>> MITgcm-devel mailing list
>> MITgcm-devel at mitgcm.org
>> http://mitgcm.org/mailman/listinfo/mitgcm-devel
>
> Dimitris Menemenlis <menemenlis at jpl.nasa.gov>
> Jet Propulsion Lab, California Institute of Technology
> MS 300-323, 4800 Oak Grove Dr, Pasadena CA 91109-8099, USA
> tel: 818-354-1656;  cell: 818-625-6498;  fax: 818-393-6720
>



