[MITgcm-devel] seaice code beyond checkpoint61j blows up for ECCO-GODAE
Martin Losch
Martin.Losch at awi.de
Tue May 5 03:23:33 EDT 2009
Hi Dimitris, and JPL crew,
thanks for running all of these tests. The trouble with the LSR solver
is that you never know when it is going to blow up, so I agree that it
could be pure luck that the CS510 did not blow up (but then again, I
have a 47-year integration with free slip for ice that does not blow up
either).
Could you clarify for me:
1. Free slip with the latest code fails for the Weddell Sea and Arctic
domains, but not for CS510.
2. The masking of etaMeanZ suggested in my previous email is meant only
for the free-slip case (SEAICE_no_slip=.false.), so your statements
are confusing me: what did you actually do?
a. SEAICE_no_slip=.true. with the suggested masking (== no slip without
the suggested masking, as the masking should have no effect)
b. SEAICE_no_slip=.false. with the suggested masking
Just to clarify the free-slip/no-slip boundary conditions (both are
restated compactly right after this list):
- no-slip means that u(i,j+/-1)+u(i,j)=0 (if u(i,j+/-1) is the point
on land), so that the tangential velocity is exactly zero on the
boundary. The derivative u(i,j+1)-u(i,j) then becomes -2*u(i,j), etc.,
and the average is trivially 0.
- free-slip means that sigma_12 = 0 and e_12 = 0 (no lateral stress) on
the boundaries (that is, on Z-points), so that u(i,j+/-1)-u(i,j)=0
alone is NOT correct (for the viscosity in the ocean we can do that,
because there the viscosity tensor is diagonal). If you have a look at
the appendix of the manuscript ceaice_part1.tex (attached), you will see
that all contributions to sigma_12 on Z-points are multiplied by
etaMeanZ, so that setting etaMeanZ to zero on the boundary trivially
makes sigma_12 satisfy the free-slip boundary conditions (hence my
suggestion for masking etaMeanZ). I think I have it correct for e_12
(in seaice_calc_strainrates.F) by masking e_12, but for some reason
that I do not remember, I did it differently for etaMeanZ.
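To spell this out (only a restatement of the two bullets above, with u
the tangential velocity, (i,j+1) taken to be the land/ghost point, and
assuming the standard viscous-plastic relation sigma_12 = 2*eta*e_12 for
the off-diagonal stress):
  no-slip:    u_{i,j+1} = -u_{i,j}
              \Rightarrow (u_{i,j+1}-u_{i,j})/\Delta y = -2\,u_{i,j}/\Delta y,
              \quad (u_{i,j+1}+u_{i,j})/2 = 0
  free-slip:  \sigma_{12} = 2\,\eta_Z\,\epsilon_{12} = 0 on boundary Z-points,
              which holds trivially for any \epsilon_{12} once \eta_Z
              (i.e. etaMeanZ) is masked to zero there.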
Maybe it is a good idea if someone other than myself has a look at
the code (seaice_lsr.F, seaice_calc_strainrates.F,
seaice_calc_viscosities.F), as four eyes see more than two, especially
if the first two are blinded by prejudice and conceit (o:
Martin
-------------- next part --------------
A non-text attachment was scrubbed: ceaice_part1_app.pdf (application/pdf, 142238 bytes)
URL: <http://mitgcm.org/pipermail/mitgcm-devel/attachments/20090505/ee7bad62/attachment.pdf>
-------------- next part --------------
On May 2, 2009, at 3:12 AM, Dimitris Menemenlis wrote:
> Martin, some news. The problem, as you guessed, appears to be associated
> with the free-slip sea ice boundary conditions.
> Integrations with free slip failed after different amounts of time,
> both in the Weddell Sea and in the Arctic domains. Curiously, the
> CS510 did not blow up, but that could be pure luck. In the one case where we
> dumped everything to examine where it blows up, we find that it
> occurs at a single vorticity point next to a jagged coastline, but in a region
> where heff is zero.
>
> Integrations with no-slip boundary conditions and with masking of
> etaMeanZ are stable, at least for the short intervals
> that we have tested. An will now run a 1992-present integration
> with no-slip boundary conditions. Michael is running no-slip in the
> Weddell domain, and Hong will submit the CS510 adjoint with no-slip
> to see what happens.
>
> More news on Monday, D.
>
> On Apr 30, 2009, at 12:52 AM, Martin Losch wrote:
>
>> Dimitris,
>> I am not quite sure about my free slip implementation.
>> If you are using free slip for sea ice, then try modifying the code
>> like this:
>> after the definition of etaMeanZ (shear viscosity on zeta/vorticity-
>> points):
>>       DO J=1,sNy+1
>>        DO I=1,sNx+1
>>         etaMeanZ(I,J,bi,bj) =
>>      &      ( ETA  (I,J  ,bi,bj) + ETA  (I-1,J  ,bi,bj)
>>      &      + ETA  (I,J-1,bi,bj) + ETA  (I-1,J-1,bi,bj) )
>>      &      / MAX( 1.D0, maskC(I,J  ,k,bi,bj) + maskC(I-1,J  ,k,bi,bj)
>>      &                 + maskC(I,J-1,k,bi,bj) + maskC(I-1,J-1,k,bi,bj) )
>>        ENDDO
>>       ENDDO
>> insert another loop that masks etaMeanZ in the case of free slip:
>>       IF ( .NOT.SEAICE_no_slip ) THEN
>>        DO J=1,sNy+1
>>         DO I=1,sNx+1
>>          etaMeanZ(I,J,bi,bj) = etaMeanZ(I,J,bi,bj)
>>      &       * maskC(I,J  ,k,bi,bj) * maskC(I-1,J  ,k,bi,bj)
>>      &       * maskC(I,J-1,k,bi,bj) * maskC(I-1,J-1,k,bi,bj)
>>         ENDDO
>>        ENDDO
>>       ENDIF
>> This should take care of any free-slip conditions (basically, all
>> off-diagonal sigma_12 are then zero on the boundaries). What I have
>> implemented should do the same thing, but it is far more complicated,
>> and maybe I made a mistake that I cannot see now.
>>
>> M.
>> On Apr 30, 2009, at 9:33 AM, Martin Losch wrote:
>>
>>> Hi Dimitris,
>>>
>>> that's bad news, and this is really hard to debug, because
>>> apparently the solver technique (LSOR) is very sensitive to the
>>> properties of the matrix it is trying to invert.
>>> One way to make sure that the matrix is always diagonally dominant is
>>> to set SEAICE_zetaMin to a value larger than zero (such as 4e8), but
>>> Jinlun keeps reminding us that that should not be necessary and is
>>> even wrong.
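>>> (Just as a sketch, in case it helps: such a setting would go into
>>> data.seaice; assuming the parameter sits in the SEAICE_PARM01
>>> namelist, the fragment would read
>>>  &SEAICE_PARM01
>>>   SEAICE_zetaMin = 4.E8,
>>>  &
>>> but again, that is only a workaround, not a fix.)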
>>>
>>> Can you figure out where in the domain it blows up? Is it, as usual,
>>> near or at the boundary, where additionally there are large gradients
>>> in ice volume/concentration/strength?
>>>
>>> The only thing I can think of is the boundary conditions. Do you use
>>> no slip or free slip for CS510 and the Arctic configuration? Try the
>>> other.
>>>
>>> In the meantime I'll revisit the code and see if I can find
>>> anything. Maybe it would be a good idea if someone other than myself
>>> would also have a look and compare it to what I have written in the
>>> appendix of ceaice_part1.tex.
>>>
>>> Martin
>>>
>>>
>>> On Apr 30, 2009, at 12:23 AM, Dimitris Menemenlis wrote:
>>>
>>>> Martin,
>>>>
>>>> The Arctic CS510 configuration crashes after a few days of
>>>> integration for:
>>>> new code with CS510 defaults
>>>> and metric terms off
>>>> and clip_velocities true or false
>>>> and use_saltplume true or false
>>>>
>>>> It runs OK with OLD_AND_BAD_DISCRETIZATION.
>>>> The test that we are running is with
>>>> use_saltplume=.TRUE.
>>>> clipvelocities=.FALSE.
>>>>
>>>> Curious why Hong's global CS510 worked.
>>>> The main difference between the setups is the open boundary
>>>> conditions
>>>> (only oceanic here, since there is no sea ice at the boundaries)
>>>> and the fact that the global integration uses real freshwater
>>>> forcing and a nonlinear free surface.
>>>>
>>>> When we run the crashing new configuration with debugLevel=3,
>>>> it stops at the second time step with the following error message:
>>>> ABNORMAL END: S/R SEAICE_LSR did not converge (uIce)
>>>>
>>>> For good measure, we are doing all of the above integrations with
>>>> deltaT=120,
>>>> even though the OLD_AND_SUPERBAD code runs stably with deltaT=1200.
>>>>
>>>> What else can we try?
>>>>
>>>> An and Dimitris
>>>>
>>>> On Apr 29, 2009, at 11:51 AM, Martin Losch wrote:
>>>>
>>>>> Hi all,
>>>>> can I summarize what has happened so far in terms of the new sea ice
>>>>> code? I am a little confused as to what works and what does not.
>>>>> Correct me if I am wrong:
>>>>>
>>>>> A. Martin's experiments on a lat/lon grid:
>>>>> 1. 0.25deg rotated regional grid of the Arctic Ocean with CORE forcing
>>>>> (1958-2004) works with the new code for
>>>>> SEAICE_clipVelocities = .false.,
>>>>> SEAICE_zetaMin = 0.,
>>>>> SEAICE_advSnow=.true.
>>>>> both SEAICE_no_slip=.true. and .false.
>>>>> different types of open boundary conditions for sea ice and
>>>>> varying
>>>>> oceanic conditions
>>>>> 2. 2deg global ("isotropic") grid works
>>>>> 3. 1deg and 0.5deg regional ("isotropic") grids of the Weddell Sea
>>>>> work for at least 10 years (Olaf Klatt's experiment)
>>>>>
>>>>> B. Patrick's ECCO-GODAE 1deg global runs (are they all iteration
>>>>> 75?)
>>>>> 1. blows up after transition from old to new seaice code in
>>>>> iteration 75
>>>>> 2. runs (iteration 75?) with SEAICE_OLD_AND_BAD_DISCRETIZATION
>>>>> defined
>>>>> and old defaults (e.g. SEAICEadvSnow = .FALSE.,
>>>>> SEAICE_clipVelocities = .TRUE.; the rest shouldn't matter)
>>>>> 3. blows up with SEAICE_OLD_AND_BAD_DISCRETIZATION defined and
>>>>> SEAICEadvSnow = .TRUE.,
>>>>> SEAICEadvSalt = .TRUE.,
>>>>> SEAICEadvAge = .TRUE.,
>>>>> SEAICE_clipVelocities = .TRUE.,
>>>>>
>>>>> C. Hong's CS510 experiments
>>>>> 1. a pure forward run works with SEAICE_OLD_AND_BAD_DISCRETIZATION
>>>>> undefined and with
>>>>> SEAICEadvSnow = .TRUE.,
>>>>> SEAICEadvSalt = .TRUE.,
>>>>> SEAICEuseFlooding = .TRUE.,
>>>>> 2. adjoint run breaks with new code, but works with old code (pre-
>>>>> checkpoint61j)
>>>>> 3. any news?
>>>>>
>>>>> So in summary, forward simulations seem to work, except for Patrick's
>>>>> ECCO-GODAE (which is probably the most important one). What's
>>>>> different for that configuration? The only thing I can think of is
>>>>> that Patrick probably does not scale dy with cos(lat), so that there
>>>>> are elongated grid boxes near the poles. Could that be a problem
>>>>> numerically for the new code? I find it hard to believe, but it is
>>>>> true that my lat/lon runs are all with nearly square ("quadratic")
>>>>> grid cells, and so are the CS-grid cells.
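>>>>> (To spell out the aspect-ratio argument, which is just spherical-grid
>>>>> geometry: \Delta x = R\cos(\phi)\,\Delta\lambda and \Delta y = R\,\Delta\phi,
>>>>> so \Delta x/\Delta y = \cos(\phi)\,\Delta\lambda/\Delta\phi shrinks towards
>>>>> the poles unless \Delta\phi is scaled with \cos(\phi) as well, which is the
>>>>> "isotropic" choice that keeps the cells nearly square.)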
>>>>>
>>>>> How can we continue with this? How can I help?
>>>>>
>>>>> Martin
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>
> Dimitris Menemenlis <menemenlis at jpl.nasa.gov>
> Jet Propulsion Lab, California Institute of Technology
> MS 300-323, 4800 Oak Grove Dr, Pasadena CA 91109-8099, USA
> tel: 818-354-1656; cell: 818-625-6498; fax: 818-393-6720
>