[MITgcm-devel] seaice code beyond checkpoint61j blows up for ECCO-GODAE
Dimitris Menemenlis
menemenlis at jpl.nasa.gov
Fri May 1 21:12:47 EDT 2009
Martin, some news. The problem, as you guessed, appears to be associated
with free-slip sea ice boundary conditions.
Integrations with free slip failed after different amounts of time,
both in the Weddell Sea and in the Arctic domains. Curiously, the
CS510 did not blow up, but that could be pure luck. In the one case where we
dumped everything to examine where it blows up, we find that it occurs
at a single grid point next to a jagged coastline, but in a region where
heff is zero.
Integrations with no-slip boundary conditions and with masking of
etaMeanZ are stable, at least for the short intervals
that we have tested. An will now run a 1992-to-present integration with
no-slip boundary conditions. Michael is running no-slip in the Weddell
domain, and Hong will submit the CS510 adjoint with no-slip, to see what
happens.
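(For the record, switching these runs to no-slip should be just a one-line
change in data.seaice; a minimal sketch only, assuming the usual
SEAICE_PARM01 namelist group:)

 # data.seaice -- excerpt (sketch only)
 &SEAICE_PARM01
  SEAICE_no_slip = .TRUE.,
 &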
More news on Monday, D.
On Apr 30, 2009, at 12:52 AM, Martin Losch wrote:
> Dimitris,
> I am not quite sure about my free slip implementation.
> If you are using free slip for sea ice, then try modifying the code
> like this:
> after the definition of etaMeanZ (shear viscosity on zeta/vorticity
> points):
> C     average ETA onto the Z point (normalized by # of wet cells)
>       DO J=1,sNy+1
>        DO I=1,sNx+1
>         etaMeanZ(I,J,bi,bj) =
>      &    ( ETA(I  ,J  ,bi,bj) + ETA(I-1,J  ,bi,bj)
>      &    + ETA(I  ,J-1,bi,bj) + ETA(I-1,J-1,bi,bj) )
>      &    / MAX( 1.D0,
>      &           maskC(I  ,J  ,k,bi,bj) + maskC(I-1,J  ,k,bi,bj)
>      &         + maskC(I  ,J-1,k,bi,bj) + maskC(I-1,J-1,k,bi,bj) )
>        ENDDO
>       ENDDO
> insert another loop that masks etaMeanZ in the case of free slip:
> C     for free slip, zero etaMeanZ at Z points adjacent to land
>       IF ( .NOT.SEAICE_no_slip ) THEN
>        DO J=1,sNy+1
>         DO I=1,sNx+1
>          etaMeanZ(I,J,bi,bj) = etaMeanZ(I,J,bi,bj)
>      &     *maskC(I  ,J  ,k,bi,bj)*maskC(I-1,J  ,k,bi,bj)
>      &     *maskC(I  ,J-1,k,bi,bj)*maskC(I-1,J-1,k,bi,bj)
>         ENDDO
>        ENDDO
>       ENDIF
> This should take care of any free slip conditions (basically, all
> off-diagonal sigma_12 terms are then zero on the boundaries). What I have
> implemented should do the same thing, but it's far more complicated,
> and maybe I made a mistake that I cannot see now.
>
> M.
> On Apr 30, 2009, at 9:33 AM, Martin Losch wrote:
>
>> Hi Dimitris,
>>
>> that's bad news, and this is really hard to debug, because,
>> apparently, the solver technique (LSOR) is very sensitive to the
>> properties of the matrix it is trying to invert.
>> One way to make sure that the matrix is always diagonally dominant is
>> to set SEAICE_zetaMin to a value larger than zero (such as 4e8), but
>> Jinlun keeps reminding us that that should not be necessary and is
>> even wrong.
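>>
>> (If you want to try that workaround anyway, it should be a single
>> entry in data.seaice; again only a sketch, assuming the SEAICE_PARM01
>> namelist group:)
>>
>>  # data.seaice -- excerpt (sketch only)
>>  &SEAICE_PARM01
>>   SEAICE_zetaMin = 4.E8,
>>  &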
>>
>> Can you figure out where in the domain it blows up? Is it, as
>> usual, near or at the boundary, where there are additionally large
>> gradients in ice volume/concentration/strength?
>>
>> The only thing I can think of is the boundary conditions. Do you use
>> no-slip or free-slip for CS510 and the Arctic configuration? Try the
>> other one.
>>
>> In the meantime I'll revisit the code and see if I can find
>> anything. Maybe it would be a good idea if someone other than myself
>> also had a look and compared it to what I have written in the
>> appendix of ceaice_part1.tex.
>>
>> Martin
>>
>>
>> On Apr 30, 2009, at 12:23 AM, Dimitris Menemenlis wrote:
>>
>>> Martin,
>>>
>>> The Arctic CS510 configuration crashes after a few days of
>>> integration for:
>>> the new code with CS510 defaults,
>>> with metric terms off,
>>> with clip_velocities true or false,
>>> and with use_saltplume true or false.
>>>
>>> It runs OK with OLD_AND_BAD_DISCRETIZATION defined.
>>> The test that we are running is with
>>> use_saltplume = .TRUE.,
>>> clipvelocities = .FALSE.
>>>
>>> Curious why Hong's global CS510 worked. The main difference
>>> between the setups is the open boundary conditions
>>> (only oceanic here, since there is no sea ice at the boundaries)
>>> and the fact that the global integration uses real freshwater
>>> forcing and a nonlinear free surface.
>>>
>>> When we run the crashing new configuration with debugLevel=3,
>>> it stops at the second time step with the following error message:
>>> ABNORMAL END: S/R SEAICE_LSR did not converge (uIce)
>>>
>>> For good measure we are doing all of the above integrations with
>>> deltaT=120,
>>> even though the OLD_AND_SUPERBAD code runs stably with deltaT=1200.
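>>>
>>> (For reference, a sketch of the corresponding runtime settings,
>>> assuming debugLevel sits in PARM01 of data and deltaT in PARM03:)
>>>
>>>  # data -- excerpt (sketch only)
>>>  &PARM01
>>>   debugLevel = 3,
>>>  &
>>>  &PARM03
>>>   deltaT = 120.,
>>>  &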
>>>
>>> what else can we try?
>>>
>>> An and Dimitris
>>>
>>> On Apr 29, 2009, at 11:51 AM, Martin Losch wrote:
>>>
>>>> Hi all,
>>>> can I summarize what has happened so far in terms of the new sea
>>>> ice code? I am a little confused as to what works and what does
>>>> not. Correct me if I am wrong:
>>>>
>>>> A. Martin's experiments on a lat/lon grid:
>>>> 1. a 0.25deg rotated regional grid of the Arctic Ocean with CORE
>>>> forcing (1958-2004) works with the new code for
>>>> SEAICE_clipVelocities = .false.,
>>>> SEAICE_zetaMin = 0.,
>>>> SEAICE_advSnow = .true.,
>>>> for both SEAICE_no_slip = .true. and .false., and for different
>>>> types of open boundary conditions for sea ice and varying oceanic
>>>> conditions
>>>> 2. a 2deg global ("isotropic") grid works
>>>> 3. 1deg and 0.5deg regional ("isotropic") grids of the Weddell Sea
>>>> work for at least 10 years (Olaf Klatt's experiment)
>>>>
>>>> B. Patrick's ECCO-GODAE 1deg global runs (are they all iteration
>>>> 75?)
>>>> 1. blows up after the transition from the old to the new sea ice
>>>> code in iteration 75
>>>> 2. runs (iteration 75?) with SEAICE_OLD_AND_BAD_DISCRETIZATION
>>>> defined and old defaults (e.g. SEAICEadvSnow = .FALSE.,
>>>> SEAICE_clipVelocities = .TRUE., the rest shouldn't matter)
>>>> 3. blows up with SEAICE_OLD_AND_BAD_DISCRETIZATION defined and
>>>> SEAICEadvSnow = .TRUE.,
>>>> SEAICEadvSalt = .TRUE.,
>>>> SEAICEadvAge = .TRUE.,
>>>> SEAICE_clipVelocities = .TRUE.,
>>>>
>>>> C. Hong's CS510 experiments:
>>>> 1. a pure forward run works with SEAICE_OLD_AND_BAD_DISCRETIZATION
>>>> undefined and with
>>>> SEAICEadvSnow = .TRUE.,
>>>> SEAICEadvSalt = .TRUE.,
>>>> SEAICEuseFlooding = .TRUE.,
>>>> 2. the adjoint run breaks with the new code, but works with the
>>>> old code (pre-checkpoint61j)
>>>> 3. any news?
>>>>
>>>> So, in summary, forward simulations seem to work, except for
>>>> Patrick's ECCO-GODAE (which is probably the most important one).
>>>> What's different about that configuration? The only thing I can
>>>> think of is that Patrick probably does not scale dy with cos(lat),
>>>> so that there are elongated grid boxes near the poles. Could that
>>>> be a problem numerically for the new code? I find it hard to
>>>> believe, but it is true that my lat/lon runs all have nearly
>>>> "quadratic" grid cells, and so do the CS-grid cells.
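>>>>
>>>> (As a rough illustration of the aspect-ratio concern, and only a
>>>> sketch, not MITgcm code: on a uniform lat/lon grid dx shrinks as
>>>> R*cos(lat)*dlon while dy = R*dlat stays fixed, so cells are
>>>> elongated by a factor of 1/cos(lat); at 80N that is already about
>>>> 5.8 for a 1deg x 1deg cell.)
>>>>
>>>>       PROGRAM ASPECT
>>>> C     print the dy/dx aspect ratio of a 1deg x 1deg lat/lon cell
>>>> C     as a function of latitude (illustration only)
>>>>       IMPLICIT NONE
>>>>       REAL*8 lat, deg2rad
>>>>       INTEGER j
>>>>       deg2rad = ACOS(-1.D0)/180.D0
>>>>       DO j = 0, 80, 20
>>>>        lat = DBLE(j)
>>>>        WRITE(*,'(A,I3,A,F6.2)') ' lat =', j,
>>>>      &      '  dy/dx =', 1.D0/COS(lat*deg2rad)
>>>>       ENDDO
>>>>       END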
>>>>
>>>> How can we continue with this? How can I help?
>>>>
>>>> Martin
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>
>
Dimitris Menemenlis <menemenlis at jpl.nasa.gov>
Jet Propulsion Lab, California Institute of Technology
MS 300-323, 4800 Oak Grove Dr, Pasadena CA 91109-8099, USA
tel: 818-354-1656; cell: 818-625-6498; fax: 818-393-6720