[MITgcm-support] Domain decompositions affecting simulations outcome
Giordano, Fabio
fgiordano at ogs.it
Mon May 20 10:57:35 EDT 2024
Hi Martin,
Thanks for the very useful reply.
As regards your question ("I am assuming that the dotted lines are the same
domain decompositions with the blank tiles excluded (exch2?)."): yes, the
dotted lines refer to exactly the same setups, but with the EXCH2 package
activated.
I ran some more tests. As you suggested, I tried changing the flags in
eesupp/inc/CPP_EEOPTIONS.h; however, as noted in the documentation (
https://mitgcm.readthedocs.io/en/latest/getting_started/getting_started.html#preprocessor-execution-environment-options),
these changes did not solve the problem (as shown in Figure 1), since my
runs are non-hydrostatic and therefore use the 3D conjugate-gradient solver.
Moreover, as you noticed, the simulation becomes more than 10 times slower,
so this option is not feasible.
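For reference, the kind of change tried above is a sketch along these lines; I am assuming the flag in question is GLOBAL_SUM_SEND_RECV (check the exact set of options in your checkout of eesupp/inc/CPP_EEOPTIONS.h):

```fortran
C  eesupp/inc/CPP_EEOPTIONS.h (sketch, assumed flag name)
C  Use explicit MPI send/receive in a fixed order for global sums,
C  so that the sums do not depend on the number or shape of tiles;
C  this helps the 2-D pressure solver but is much slower:
#define GLOBAL_SUM_SEND_RECV
```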
I also looked at higher-frequency output to better pinpoint when and
where the issue starts. I tested 4 runs, keeping the same number of tiles
along longitude (nPx = 26), as follows:
1: 260 processes, with 10 tiles in latitude and therefore sNy = 30;
2: 390 processes, with 15 tiles in latitude, sNy = 20;
3: 520 processes, with 20 tiles in latitude, sNy = 15;
4: 650 processes, with 25 tiles in latitude, sNy = 12.
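The arithmetic behind these four setups can be checked with a short script (a sketch; it assumes the latitude dimension is Ny = 300 cells, inferred from the products nPy * sNy above being constant):

```python
# Tile-size check for the four decompositions above.
# Assumption: Ny = 300 cells in latitude, since
# 10*30 = 15*20 = 20*15 = 25*12 = 300.
Ny = 300
nPx = 26                               # tiles along longitude, fixed in all runs
setups = {260: 10, 390: 15, 520: 20, 650: 25}  # total processes -> nPy

for nprocs, nPy in setups.items():
    assert nprocs == nPx * nPy         # total MPI processes
    assert Ny % nPy == 0               # tiles must divide the domain evenly
    sNy = Ny // nPy                    # tile size in latitude (SIZE.h sNy)
    print(f"{nprocs} procs: nPy={nPy}, sNy={sNy}")
```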
In Figures 2-4 I plotted maps of the differences between them (hourly
diagnostics).
Figure 2 highlights an issue with the OBCS package: in all the simulations
a sponge layer 15 cells wide is active at the southern boundary, and a
strip of non-zero differences appears at the inner edge of the sponge
layer.
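For context, the sponge in question is configured roughly as follows (a sketch of data.obcs; the parameter names are from the OBCS package, but the relaxation time scales here are placeholders, not my actual values):

```
 &OBCS_PARM01
  useOBCSsponge = .TRUE.,
 &
 &OBCS_PARM03
# sponge width in grid cells at the open boundary
  spongeThickness = 15,
# placeholder relaxation time scales (seconds), inner/outer edge
  Urelaxobcsinner = 864000.,
  Urelaxobcsbound = 43200.,
  Vrelaxobcsinner = 864000.,
  Vrelaxobcsbound = 43200.,
 &
```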
In the simulations where the tile dimension is larger than the sponge layer
thickness, the discrepancies arise in the domain interior instead (Figure
3). In other words, the problem at the boundary originates earlier and
somehow masks the issue in the domain interior, but in the end the result
is qualitatively the same (Figure 4), with substantial differences after a
few days of simulation (e.g. if one wants to compare SST with satellite
observations).
I also ran a year-long test to check the long-term behaviour of the
differences: as expected, they do not diverge, remaining within the same
range observed in the month-long tests.
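To make "the same range" quantitative, one option is an RMS-difference time series between two runs. A minimal sketch with synthetic stand-in fields (reading the actual hourly diagnostics, e.g. via xmitgcm, is omitted; the field shapes and noise levels here are illustrative only):

```python
import numpy as np

def rms_diff(a, b):
    """Root-mean-square difference between two fields, ignoring NaNs (land)."""
    d = a - b
    return float(np.sqrt(np.nanmean(d * d)))

# Synthetic stand-ins for SST from two decompositions; real fields would
# come from the hourly diagnostics of runs 1 and 2.
rng = np.random.default_rng(0)
sst_run1 = 15.0 + rng.normal(0.0, 0.5, size=(300, 780))
sst_run2 = sst_run1 + rng.normal(0.0, 1e-3, size=(300, 780))

print(f"RMS difference: {rms_diff(sst_run1, sst_run2):.2e} degC")
```

Tracking this number per output interval makes it easy to see whether the run-to-run spread stays bounded or grows.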
Please let me know if you have suggestions for further tests, or if we
simply have to accept this intrinsic numerical "variability" (and
therefore prefer longer-term averages when comparing model outputs).
Thank you
Sincerely,
Fabio Giordano
Sezione di Oceanografia
Istituto Nazionale di Oceanografia e di Geofisica Sperimentale - OGS
via Beirut n. 2
34151 Trieste - Italia
-------------- attachments --------------
(An HTML copy of this message and four PNG figures were scrubbed by the list archive.)
Figure1.png: <http://mailman.mitgcm.org/pipermail/mitgcm-support/attachments/20240520/400dca8a/attachment-0005.png>
Figure2.png: <http://mailman.mitgcm.org/pipermail/mitgcm-support/attachments/20240520/400dca8a/attachment-0006.png>
Figure3.png: <http://mailman.mitgcm.org/pipermail/mitgcm-support/attachments/20240520/400dca8a/attachment-0004.png>
Figure4.png: <http://mailman.mitgcm.org/pipermail/mitgcm-support/attachments/20240520/400dca8a/attachment-0007.png>