[MITgcm-support] Weird numerical noise leads to the model blowing up
Fancer Lancer
fancer.lancer at gmail.com
Wed Apr 9 08:13:02 EDT 2014
Good day MITgcmers,
I use linear potential theory to specify a small-amplitude internal
wave in a 2D case. I have performed many model runs for waves of different
wavelengths and various pycnocline depths. In short, most of them
work fine: the initial wave propagates as the theory predicts.
But some experiments suddenly blow up with weird noise
around the pycnocline (see the attached animated GIF).
Does anyone know a possible reason for this strange behaviour?
A few facts about the model setup should be noted:
1) Since linear theory is used to set up the velocity field, the
transition layer (pycnocline) between the two water layers should be as
thin as possible. Ideally there would be no transition layer at
all, just a density jump. All the experiments work fine
even with an abrupt density jump (no pycnocline), but some of them
suddenly blow up (see the GIF).
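For context, the interfacial wave speed that linear two-layer theory gives for such a density jump can be sketched as follows (all numbers are hypothetical illustrations, not Sergey's actual configuration):

```python
import math

# Hypothetical two-layer stratification (assumed values for illustration)
g = 9.81                       # gravitational acceleration [m/s^2]
rho1, rho2 = 1020.0, 1025.0    # upper / lower layer densities [kg/m^3]
h1, h2 = 20.0, 80.0            # upper / lower layer thicknesses [m]

# Reduced gravity across the density jump
g_prime = g * (rho2 - rho1) / rho2

# Long-wave phase speed of the interfacial mode (linear two-layer theory):
# c^2 = g' * h1 * h2 / (h1 + h2)
c = math.sqrt(g_prime * h1 * h2 / (h1 + h2))
print(f"interfacial long-wave speed: {c:.3f} m/s")
```

This speed sets the scale for how fast the initialized disturbance should propagate along the interface.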
2) Linear theory assumes that fluid-interface perturbations are
infinitesimally small, which is why I use a variable vertical resolution
that is very fine around the fluid interface. Note that I choose
the corresponding time step to satisfy the CFL condition, and most of
the model runs work well.
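The time-step check described above can be sketched as follows; the grid spacings, velocities, and the Courant number used here are hypothetical placeholders, not the actual setup:

```python
import numpy as np

# Hypothetical variable vertical grid: coarse layers above and below,
# fine 0.25 m layers around the interface (assumed values for illustration)
dz = np.concatenate([np.full(10, 5.0),
                     np.full(40, 0.25),
                     np.full(10, 5.0)])   # vertical spacing [m]
dx = 50.0       # horizontal spacing [m]
u_max = 0.5     # expected max horizontal velocity [m/s]
w_max = 0.05    # expected max vertical velocity [m/s]

# Advective CFL: dt * (u_max/dx + w_max/dz_min) <= C, with a safety margin C = 0.5.
# The thinnest layer near the interface dominates the limit.
C = 0.5
dt_max = C / (u_max / dx + w_max / dz.min())
print(f"thinnest layer: {dz.min()} m, CFL-limited dt <= {dt_max:.2f} s")
```

Note that satisfying the advective CFL limit alone does not rule out other instabilities (e.g. internal-gravity-wave or diffusive limits), which may matter when the vertical spacing is this small.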
Any suggestion concerning the problem would be appreciated.
Many thanks in advance.
Sincerely,
-Sergey
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Test3.gif
Type: image/gif
Size: 3041445 bytes
Desc: not available
URL: <http://mitgcm.org/pipermail/mitgcm-support/attachments/20140409/7812c8c3/attachment-0001.gif>