[MITgcm-support] Weird numerical noise leads to the model blowing up
michael schaferkotter
schaferk at bellsouth.net
Wed Apr 9 09:58:13 EDT 2014
Were your experiments done in hydrostatic mode?
On Apr 9, 2014, at 7:13 AM, Fancer Lancer wrote:
> Good day MITgcmers,
>
> I use linear potential theory to specify a small-amplitude internal wave in a 2D case. I have performed many model runs for waves of different wavelengths and for various pycnocline depths. In short, most of them work fine: the initial wave propagates as predicted. But some experiments suddenly blow up with weird noise around the pycnocline (see the attached animated GIF figure).
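>
> For context, here is a minimal sketch of the kind of initialization I mean, assuming an idealized two-layer configuration; all names and values below are illustrative, not my actual setup:
>
> import numpy as np
>
> # Illustrative two-layer parameters (not the actual experiment values)
> g = 9.81                       # gravity [m/s^2]
> rho1, rho2 = 1020.0, 1028.0    # upper/lower layer densities [kg/m^3]
> h1, h2 = 50.0, 150.0           # upper/lower layer thicknesses [m]
> a = 0.5                        # interface displacement amplitude [m]
> wavelength = 2000.0            # [m]
> k = 2.0 * np.pi / wavelength
>
> # Linear dispersion relation for an interfacial wave between two layers
> # (rigid lid): omega^2 = g k (rho2 - rho1) / (rho1 coth(k h1) + rho2 coth(k h2))
> omega = np.sqrt(g * k * (rho2 - rho1)
>                 / (rho1 / np.tanh(k * h1) + rho2 / np.tanh(k * h2)))
> c = omega / k                  # phase speed
>
> # Interface displacement and the layer-mean horizontal velocities it implies
> # in the long-wave limit (continuity): u1 = c*eta/h1, u2 = -c*eta/h2
> x = np.arange(0.0, 10000.0, 10.0)
> eta = a * np.cos(k * x)
> u1 = c * eta / h1              # upper layer
> u2 = -c * eta / h2             # lower layer (opposite sign)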
>
> Does anyone know what the possible reason for this strange behavior might be?
>
> A few facts about the model setup should be noted:
> 1) Since linear theory is used to set up the velocity field, the transition layer (pycnocline) between the two water layers is supposed to be as thin as possible. Ideally there should be no transition layer at all, only a density jump. All the experiments work fine even with an abrupt density jump (no pycnocline), but some of them suddenly blow up (see the GIF figure).
> 2) Linear theory implies that the fluid-interface perturbations should be infinitesimally small, which is why I use variable vertical resolution that is very fine around the fluid interface. Note that I choose the corresponding time step to satisfy the CFL condition (a rough check of this kind is sketched after this list), and most of the model runs work well.
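>
> A rough illustration of the time-step check, assuming a stretched vertical grid (the numbers below are made up for the example):
>
> import numpy as np
>
> dx = 50.0                                  # horizontal spacing [m]
> dz = np.concatenate([np.full(20, 5.0),     # coarse cells away from the interface
>                      np.full(40, 0.25),    # fine cells around the pycnocline
>                      np.full(20, 5.0)])
> u_max, w_max = 0.3, 0.05                   # expected maximum velocities [m/s]
>
> # Advective CFL estimate: the time step must not exceed min(dx/u_max, dz_min/w_max)
> dt_limit = min(dx / u_max, dz.min() / w_max)
> print("time step should satisfy dt <", dt_limit, "s")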
>
> Any suggestion concerning the problem would be appreciated.
> Many thanks in advance.
> Sincerely,
> -Sergey
>
> <Test3.gif>
> _______________________________________________
> MITgcm-support mailing list
> MITgcm-support at mitgcm.org
> http://mitgcm.org/mailman/listinfo/mitgcm-support