Hi Fabio,

thanks for your clear illustration of the problem. I am afraid that these differences can really be due to the domain decomposition, because the domain decomposition can change the order of summation of the global sums in the pressure solver.
In eesupp/inc/CPP_EEOPTIONS.h you will find this:

C-- Always cumulate tile local-sum in the same order by applying MPI allreduce
C to array of tiles ; can get slower with large number of tiles (big set-up)
#define GLOBAL_SUM_ORDER_TILES

C-- Alternative way of doing global sum without MPI allreduce call
C but instead, explicit MPI send & recv calls. Expected to be slower.
#undef GLOBAL_SUM_SEND_RECV

C-- Alternative way of doing global sum on a single CPU
C to eliminate tiling-dependent roundoff errors. Note: This is slow.
#undef CG2D_SINGLECPU_SUM

You can try to run with CG2D_SINGLECPU_SUM defined (i.e. change the #undef CG2D_SINGLECPU_SUM above to #define CG2D_SINGLECPU_SUM and recompile) to make sure that the order of summation does not change, but this may significantly slow down your runs.
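To see why the order of summation matters at all, here is a small toy example (plain standard Fortran, nothing MITgcm-specific) that sums the same three numbers in two different orders:

      PROGRAM SUMORD
C     Toy example (not MITgcm code): the same three numbers summed in
C     two different orders give two different double-precision results.
C     A new tiling changes the order in which cg2d accumulates its
C     global sums in just this way.
      IMPLICIT NONE
      DOUBLE PRECISION big, tiny1, tiny2, s1, s2
      big   = 1.0D0
      tiny1 = 1.0D-16
      tiny2 = 1.0D-16
C     Order 1: add the small values to the large one, one at a time
      s1 = (big + tiny1) + tiny2
C     Order 2: combine the small values first
      s2 = big + (tiny1 + tiny2)
      PRINT *, 's1      = ', s1
      PRINT *, 's2      = ', s2
      PRINT *, 's1 - s2 = ', s1 - s2
      END

The two results differ by about 2.e-16. Such tiny per-iteration differences enter the solution and grow in a nonlinear model, which is consistent with the divergence you see after a few days.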
I am assuming that the dotted lines are the same domain decompositions with the blank tiles excluded (exch2?).
I am surprised that this also gives different results; I assume that it, too, changes the order of summation a little bit, although it is all about summing zeros.

Martin

On 13. May 2024, at 18:30, Giordano, Fabio <fgiordano@ogs.it> wrote:
Dear MITgcm community,

I am a PhD student at the National Institute of Oceanography and Applied Geophysics – OGS (Italy).
I am running a batch of simulations for the northern Adriatic Sea (a semi-enclosed basin in the Mediterranean) and I am testing several possible decompositions of the domain (494 cells in the x-direction, 300 in the y-direction, 1/128° resolution), getting, unexpectedly, slightly different results. I am using the HPC infrastructure G100 (https://www.hpc.cineca.it/systems/hardware/galileo100/).
I did some tests, simulating just 30 days, starting from January 1st 2020, with atmospheric forcing from a Limited Area Model (2.2 km resolution) and initial and boundary conditions from the EU Copernicus Marine Service Mediterranean Reanalysis.

I ran 8 different setups, changing only the domain decomposition (same forcing, parameters, namelists etc.): I chose 4 different tilings and then doubled the experiments by running them also with the EXCH2 package, obtaining the following setups (a SIZE.h sketch of the first tiling is given after the list):
1. 95 processors (19 × 5 in x and y respectively) → 72 processors with EXCH2
2. 190 processors (19 × 10 in x and y respectively) → 130 processors with EXCH2
3. 520 processors (26 × 20 in x and y respectively) → 339 processors with EXCH2
4. 760 processors (38 × 20 in x and y respectively) → 474 processors with EXCH2
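For reference, the tiling of setup 1 corresponds to SIZE.h settings along these lines (only the tiling-related parameters; the OLx, OLy and Nr values shown are just indicative, not the ones actually used):

C     Tiling for setup 1: 95 processes, one 26 x 60 tile per process,
C     19 x 5 tiles covering the 494 x 300 grid.
C     OLx, OLy and Nr below are placeholders.
      INTEGER sNx, sNy, OLx, OLy, nSx, nSy, nPx, nPy, Nx, Ny, Nr
      PARAMETER (
     &           sNx =  26,
     &           sNy =  60,
     &           OLx =   3,
     &           OLy =   3,
     &           nSx =   1,
     &           nSy =   1,
     &           nPx =  19,
     &           nPy =   5,
     &           Nx  = sNx*nSx*nPx,
     &           Ny  = sNy*nSy*nPy,
     &           Nr  =  30 )

The other tilings differ only in nPx, nPy and the corresponding tile sizes sNx, sNy.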
The domain and the different decompositions are shown in Figures 1–4.

From these setups I plotted in Figure 5 the sea surface temperature time series, at daily frequency, for a single cell in the open sea (upper panel), shown by the black dot in Figures 1–4. I also plotted the horizontal average over the whole upper level (lower panel), which seems to smooth out the differences. The plots clearly show different outputs after a few days.
To check whether it is a machine-dependent issue, I also ran the last setup (474 processors) twice and got identical results (identical STDOUTs).
My question is: did I mess something up, or is this an expected behaviour related to the domain decomposition? And if so, what is the reason?

Thank you very much for the support.

Kind regards,

Fabio Giordano
Sezione di Oceanografia
Istituto Nazionale di Oceanografia e di Geofisica Sperimentale - OGS
via Beirut n. 2
34151 Trieste - Italia
[Attachments: Figure01_095p.png, Figure02_190p.png, Figure03_520p.png, Figure04_760p.png, Figure05_8testsComparison.png]

_______________________________________________
MITgcm-support mailing list
MITgcm-support@mitgcm.org
http://mailman.mitgcm.org/mailman/listinfo/mitgcm-support