<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<style type="text/css" style="display:none;"><!-- P {margin-top:0;margin-bottom:0;} --></style>
</head>
<body dir="ltr">
<div id="divtagdefaultwrapper" style="font-size:12pt;color:#000000;font-family:Calibri,Helvetica,sans-serif;" dir="ltr">
<p style="margin-top:0;margin-bottom:0">Hi Jean-Michel,</p>
<p style="margin-top:0;margin-bottom:0"><br>
</p>
<p style="margin-top:0;margin-bottom:0">Thanks for your reply. </p>
<p style="margin-top:0;margin-bottom:0"><br>
</p>
<p style="margin-top:0;margin-bottom:0">I tested the MPI commands and found one that worked to run the multiple processes, so I have it up and running. </p>
<p style="margin-top:0;margin-bottom:0"><br>
</p>
<p style="margin-top:0;margin-bottom:0">Do you have a feel for the best settings to optimise the speed-up of the coupled setup, i.e. how many processes to use for the ocean and the atmosphere?</p>
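<p style="margin-top:0;margin-bottom:0"><br>
</p>
<p style="margin-top:0;margin-bottom:0">For reference, the command I ended up running looks roughly like the one below. The process counts here are just placeholders for illustration; as far as I understand, each component's count has to match nPx*nPy in its SIZE.h, so please correct me if I have that wrong:</p>
<p style="margin-top:0;margin-bottom:0"><br>
</p>
<p style="margin-top:0;margin-bottom:0">mpirun -np 1 ./build_cpl/mitgcmuv : -np 2 ./build_ocn/mitgcmuv : -np 2 ./build_atm/mitgcmuv</p>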
<p style="margin-top:0;margin-bottom:0"><br>
</p>
<p style="margin-top:0;margin-bottom:0">Thanks again,</p>
<p style="margin-top:0;margin-bottom:0"><br>
</p>
<p style="margin-top:0;margin-bottom:0">Chris</p>
<p style="margin-top:0;margin-bottom:0"><br>
</p>
</div>
<hr style="display:inline-block;width:98%" tabindex="-1">
<div id="divRplyFwdMsg" dir="ltr"><font face="Calibri, sans-serif" style="font-size:11pt" color="#000000"><b>From:</b> MITgcm-support <mitgcm-support-bounces@mitgcm.org> on behalf of Jean-Michel Campin <jmc@mit.edu><br>
<b>Sent:</b> 28 June 2019 18:48:08<br>
<b>To:</b> mitgcm-support@mitgcm.org<br>
<b>Subject:</b> Re: [MITgcm-support] Issue running cpl_aim+ocn test case</font>
<div> </div>
</div>
<div class="BodyFragment"><font size="2"><span style="font-size:11pt;">
<div class="PlainText">Hi Chris,<br>
<br>
The numbers that are printed out (especially: MPI_GROUP_World= -2013265920)<br>
do not look too good.<br>
<br>
Here is what I would suggest:<br>
<br>
1) Provide more info regarding:<br>
 a) platform/computer/OS<br>
 b) which compiler you are using, and which version<br>
 c) which MPI implementation you are using (+ version?)<br>
This might help in figuring out what is wrong (some example commands for<br>
gathering these details are sketched after point 2).<br>
<br>
2) Did you try running a simple verification experiment with MPI, using the same<br>
compiler and MPI (same "modules" if using modules) and the same optfile?<br>
You can even try with testreport:<br>
> cd verification<br>
> ./testreport -MPI 4 -of {same optfile} -t global_ocean.cs32x15<br>
Just to make sure everything works well in the same environment but without <br>
coupling interfaces.<br>
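<br>
For 1), assuming a fairly standard Linux setup, commands along these lines<br>
should collect the relevant details (swap gfortran for ifort or whichever<br>
compiler you actually use; "module list" only applies if your system uses<br>
environment modules):<br>
&gt; uname -a<br>
&gt; gfortran --version<br>
&gt; mpirun --version<br>
&gt; module list<br>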
<br>
Cheers,<br>
Jean-Michel<br>
<br>
On Wed, Jun 26, 2019 at 04:22:21PM +0000, Christopher O'Reilly wrote:<br>
> Hi,<br>
> <br>
> <br>
> I am trying to get the "cpl_aim+ocn" verification case running.<br>
> <br>
> <br>
> The following step runs and successfully produces the 3 executables (or at least it seems to):<br>
> <br>
> <br>
> ./run_cpl_test 1 -of $OPTFILE<br>
> <br>
> <br>
> I then run step 2 and step 3, but on running step 3 I get the following output:<br>
> <br>
> <br>
> execute 'mpirun -np 1 ./build_cpl/mitgcmuv : -np 1 ./build_ocn/mitgcmuv : -np 1 ./build_atm/mitgcmuv' :<br>
> MITCPLR_init1: 0 Coupler Bcast cbuf=Coupler x<br>
> MITCPLR_init1: 0 Coupler coupler=Coupler 0<br>
> MITCPLR_init1: 0 Coupler MPI_Comm_group MPI_GROUP_World= -2013265920 ierr= 0<br>
> MITCPLR_init1: 0 Coupler component num= 1 MPI_COMM= 1140850688 1140850688<br>
> At line 15 of file mitcplr_initcomp.f<br>
> Fortran runtime error: Actual string length is shorter than the declared one for dummy argument 'carg' (8/32)<br>
> EESET_PARMS: Unable to open parameter file "eedata"<br>
> EESET_PARMS: Unable to open parameter file "eedata"<br>
> EESET_PARMS: Error reading parameter file "eedata"<br>
> EESET_PARMS: Error reading parameter file "eedata"<br>
> EEDIE: earlier error in multi-proc/thread setting<br>
> PROGRAM MAIN: ends with fatal Error<br>
> STOP ABNORMAL END: PROGRAM MAIN<br>
> EEDIE: earlier error in multi-proc/thread setting<br>
> PROGRAM MAIN: ends with fatal Error<br>
> STOP ABNORMAL END: PROGRAM MAIN<br>
> --------------------------------------------------------------------------<br>
> mpirun noticed that the job aborted, but has no info as to the process<br>
> that caused that situation.<br>
> --------------------------------------------------------------------------<br>
> <br>
> I'm sure this is probably something obvious, but I'm not sure why it's not finding the "eedata" files (which seem to be located in the input_atm and input_ocn folders).<br>
> <br>
> Has anyone encountered this before?<br>
> <br>
> Kindest regards,<br>
> <br>
> Chris<br>
> <br>
<br>
<br>
_______________________________________________<br>
MITgcm-support mailing list<br>
MITgcm-support@mitgcm.org<br>
<a href="http://mailman.mitgcm.org/mailman/listinfo/mitgcm-support">http://mailman.mitgcm.org/mailman/listinfo/mitgcm-support</a><br>
</div>
</span></font></div>
</body>
</html>