[MITgcm-support] Domain decompositions affecting simulation outcomes

Giordano, Fabio fgiordano at ogs.it
Mon May 13 12:30:16 EDT 2024


Dear MITgcm community,

I am a PhD student at the National Institute of Oceanography and Applied
Geophysics – OGS (Italy).
I am running a batch of simulations for the northern Adriatic Sea (a
semi-enclosed basin in the Mediterranean) and I am testing several possible
decompositions of the domain (494 cells in the x-direction, 300 in the
y-direction, 1/128° resolution); unexpectedly, they give slightly different
results. I am using the HPC infrastructure G100
(https://www.hpc.cineca.it/systems/hardware/galileo100/).
I did some tests, simulating just 30 days, starting from January 1st 2020,
with atmospheric forcing from a Limited Area Model (2.2 km resolution) and
initial and boundary conditions from the EU Copernicus Marine Service
Mediterranean Reanalysis.

I ran 8 different setups, changing *only* the domain decomposition (same
forcing, parameters, namelists, etc.): I chose 4 different tilings and then
doubled the experiments by also running each of them with the EXCH2 package,
obtaining the following setups:

   1. 95 processors (19 × 5 in x and y respectively) → 72 processors with EXCH2
   2. 190 processors (19 × 10) → 130 processors with EXCH2
   3. 520 processors (26 × 20) → 339 processors with EXCH2
   4. 760 processors (38 × 20) → 474 processors with EXCH2

The domain and the different decompositions are shown in Figures 1–4.
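
For clarity, the per-tile sizes implied by these tilings (i.e. the sNx × sNy
values that end up in SIZE.h) follow directly from the 494 × 300 grid; the
snippet below is purely illustrative and not part of my actual configuration.
(The reduced process counts with EXCH2 come from dropping the all-land tiles.)

# Sketch only: per-tile sizes (sNx, sNy) implied by each tiling of the
# 494 x 300 grid; the tilings are the ones listed above.
NX, NY = 494, 300

tilings = {            # setup label: (nPx, nPy)
    "95p":  (19,  5),
    "190p": (19, 10),
    "520p": (26, 20),
    "760p": (38, 20),
}

for name, (npx, npy) in tilings.items():
    # every tiling must divide the grid evenly
    assert NX % npx == 0 and NY % npy == 0, f"{name} does not divide the grid"
    snx, sny = NX // npx, NY // npy
    print(f"{name}: {npx * npy} tiles of {snx} x {sny} cells each")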

From these setups I plotted in Figure 5 the sea surface temperature time
series, at daily frequency, for a single open-sea cell (upper panel), marked
by the black dot in Figures 1–4. I also plotted the horizontal average over
the whole upper level (lower panel), which seems to smooth out the
differences. The plots clearly show that the outputs diverge after a few days.
To check whether it is a machine-dependent issue, I also ran the last setup
(474 processors) twice and got identical results (identical STDOUTs).
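
As a side note, a simple way to make this kind of comparison is to check the
monitor statistics in the two STDOUT files line by line. This is just a
minimal sketch (placeholder paths, assuming the standard %MON lines written
by the monitor package), not the exact check I ran:

import re

def monitor_lines(path):
    """Return the %MON monitor lines with the (PID.TID ...) prefix stripped."""
    lines = []
    with open(path) as f:
        for line in f:
            if "%MON" in line:
                lines.append(re.sub(r"^\(PID\.TID [0-9.]+\)\s*", "", line).rstrip())
    return lines

# placeholder paths for the two repeated runs
a = monitor_lines("run_A/STDOUT.0000")
b = monitor_lines("run_B/STDOUT.0000")

for i, (la, lb) in enumerate(zip(a, b)):
    if la != lb:
        print(f"first difference at monitor line {i}:")
        print(" ", la)
        print(" ", lb)
        break
else:
    print("monitor statistics are identical")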


My question is: did I mess something up, or is this an expected behaviour
related to the domain decomposition? If so, what is the reason?

Thank you very much for the support.

Kind regards,

Fabio Giordano
Sezione di Oceanografia
Istituto Nazionale di Oceanografia e di Geofisica Sperimentale - OGS
via Beirut n. 2
34151 Trieste - Italia
[Attachments: Figure01_095p.png, Figure02_190p.png, Figure03_520p.png,
Figure04_760p.png, Figure05_8testsComparison.png]

