[MITgcm-support] Sea ice dynamics causing model to output NaNs after changing grid configuration

Martin Losch Martin.Losch at awi.de
Mon Nov 25 12:38:14 EST 2019


Hi again,

the source of this problem (NaNs in the solution) was NaNs that were already present in the wind forcing files, so that is something to check before really digging into the code (as I did). In this context, I recommend that atmospheric forcing fields (from reanalysis) be defined over the entire model domain. In particular, wind forcing may be averaged to velocity grid points in pkg/exf, so masking your atmospheric data with a land-sea mask may lead to unexpected results near the coasts.
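
A quick way to check the forcing files is a short Python/numpy script like the one below (an untested sketch; 'uwind.bin' and the nx, ny values are placeholders for your own forcing file and grid size, and I am assuming flat big-endian 32-bit binaries, i.e. exf_iprec = 32, the default):

    import numpy as np

    fname = 'uwind.bin'   # placeholder: your wind forcing file
    nx, ny = 100, 100     # placeholder: your horizontal grid dimensions

    # exf forcing files are plain binaries, one nx*ny record per time level
    data = np.fromfile(fname, dtype='>f4')      # big-endian 32-bit reals
    nrec = data.size // (nx * ny)
    data = data[:nrec * nx * ny].reshape(nrec, ny, nx)
    for k in range(nrec):
        nbad = np.count_nonzero(np.isnan(data[k]))
        if nbad > 0:
            print('record %d contains %d NaNs' % (k, nbad))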

Martin

> On 22. Nov 2019, at 16:07, Martin Losch <Martin.Losch at awi.de> wrote:
> 
> Dear Mari,
> 
> if it happens in the first time step, the problem shouldn’t be too hard to find, but without the complete setup, I cannot do much. Can you provide a minimal configuration (not to this list, but somewhere to download) that reproduces your problem?
> In the meantime, can you set debugLevel = 5 to get more output, and also dumpFreq = 1. (or anything smaller than deltaT)? Then you might be able to figure out where exactly the NaNs appear first. Often the sea ice code yields NaNs because it is called at the beginning of the timestep and picks up NaNs from the previous timestep or from the initialization (if there are any).
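> 
> Once you have per-timestep dumps, something like this (an untested sketch, assuming the default mdsio output, i.e. big-endian 32-bit *.data snapshot files in the run directory; the list of field prefixes is just the standard state snapshots) lists every snapshot that already contains NaNs:
> 
>     import glob
>     import numpy as np
> 
>     # check the standard state snapshots for NaNs
>     for pref in ['U', 'V', 'T', 'S', 'Eta']:
>         for fname in sorted(glob.glob(pref + '.*.data')):
>             a = np.fromfile(fname, dtype='>f4')   # big-endian 32-bit reals
>             nbad = np.count_nonzero(np.isnan(a))
>             if nbad > 0:
>                 print('%s: %d NaNs' % (fname, nbad))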
> 
> Martin
> 
> PS. None of the following may be related to your specific problem, but I’ll comment anyway:
> - you use insufficient forcing files (downward radiation and specific humidity are missing, although they are required by the flags in EXF_OPTIONS.h); this can lead to very large heat fluxes.
> 
> precipfile        = 'EmP.bin',
> implies the wrong sign (I don’t know what is in EmP, but I assume E minus P, i.e. evaporation minus precipitation); precipfile expects positive values for precipitation.
> 
> # this turns off useful checks (e.g. whether heat fluxes are ridiculous, which usually leads to a model explosion)
> useExfCheckRange  = .FALSE.,
> 
> # don’t set these unless you know what you are doing; they default to the global “deltaT” anyway
> SEAICE_deltaTtherm = 200.,
> SEAICE_deltaTdyn   = 200.,
> 
> From your parameters I estimate a CFL number of about 0.04 for a u-velocity of 1 m/s, so that should be OK for the first few timesteps, but when you have a large adjustment problem, this may lead to instability.
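> 
> Just to spell out that estimate (a back-of-the-envelope check, with deltaT = 200 s from your SEAICE_deltaT values and dx ~ 4.9 km from the grid spacing discussed below):
> 
>     # advective CFL number, u * deltaT / dx
>     u, deltaT, dx = 1.0, 200.0, 4900.0   # m/s, s, m
>     print(u * deltaT / dx)               # ~0.04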
> 
> # as a rule of thumb, the ratio of two adjacent dz should not exceed 1.4 (for numerical accuracy), i.e. dz(k+1)/dz(k) < 1.4
> delZ=5,10,10,10,25,15*50,4*200,6*400,
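> 
> Your delZ violates this in several places; a quick (untested) Python check, with the 15*50 etc. shorthand expanded:
> 
>     # layer thicknesses from your "data", shorthand expanded
>     delZ = [5, 10, 10, 10, 25] + 15*[50] + 4*[200] + 6*[400]
>     for k in range(len(delZ) - 1):
>         r = delZ[k+1] / delZ[k]
>         if r > 1.4:
>             print('dz(%d)/dz(%d) = %.1f exceeds 1.4' % (k+2, k+1, r))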
> 
> # why do you do that? that can mask problems
> checkIniTemp=.false.,
> 
> The grid you are using will have strongly rectangular (non-square) cells, because dx = 0.08333 * cos(lat) * pi/180 * 6371 km = 4.9 km, while dy = 0.08333 * pi/180 * 6371 km = 9.2 km. It can be difficult to find viscosity parameters that damp enough in the y-direction but are still small enough in the x-direction to avoid numerical instability due to too large a viscosity. I would use a grid with nearly square cells if possible (i.e. have dy also scaled by cos(lat)).
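> 
> For reference, this is how I arrived at those numbers (the latitude of ~58N is my guess, roughly consistent with the 4.9 km above):
> 
>     from math import cos, pi
> 
>     R, ddeg, lat = 6371.0e3, 0.08333, 58.0   # Earth radius [m], grid spacing [deg], latitude [deg]
>     dy = ddeg * pi/180.0 * R                 # ~9.2 km, independent of latitude
>     dx = dy * cos(lat * pi/180.0)            # ~4.9 km at ~58N
>     print(dx/1e3, dy/1e3)                    # in km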
> 
> 
> 
>> On 21. Nov 2019, at 17:58, Mari Fjalstad Jensen <Mari.F.Jensen at uib.no> wrote:
>> 
>> Dear Martin and David (or others)
>> 
>> Did you ever solve the problem below? I am having the same problem: NaNs after one time-step when sea ice dynamics are enabled. The simulation runs fine when sea ice dynamics are turned off, or if there is no wind forcing.
>> 
>> In addition, I recently switched from a Cartesian domain with smooth, simple bathymetry (flat bottom + sloping sides) to a spherical-polar domain with realistic bathymetry. In the previous setup, the sea ice package was working fine in the presence of wind and sea ice dynamics.
>> 
>> Attached are data* and *OPTIONS files
>> 
>> Cheers and thanks,
>> Mari
>> 
>> 
>> 
>> On 24/08/2019 08.55, Martin Losch wrote:
>>> Hi David,
>>> I would need more details to be able to help: the configuration (code directory, namelist files, model version), and what exactly you changed relative to the working configuration.
>>> 
>>> Martin
>>> 
>>>> On 24. Aug 2019, at 00:07, David Vishny <davidvish at gmail.com> wrote:
>>>> 
>>>> To whom it may concern,
>>>> 
>>>> I recently changed the grid configuration in my model setup, and now the model outputs all NaNs after a single time-step if sea ice dynamics are enabled, even though there is no sea ice to begin with. If I disable sea ice dynamics, no problems appear to occur. Thus, my advisor Malte Jansen and I believe the LSR solver is producing NaNs.
>>>> 
>>>> I know one parameter related to the LSR solver is the LSR error. For any given grid configuration, could the solver crash due to an LSR error that is either too high or too low? Are there any other parameters I should be playing around with that are related to the LSR solver?
>>>> 
>>>> 
>>>> Thanks,
>>>> 
>>>> David Vishny
>> <data.seaice><data.pkg><data.exf><data.txt><EXF_OPTIONS.h><CPP_OPTIONS.h>
> 
> _______________________________________________
> MITgcm-support mailing list
> MITgcm-support at mitgcm.org
> http://mailman.mitgcm.org/mailman/listinfo/mitgcm-support


