[MITgcm-support] Independent Tiling

Chris Horvat horvat at fas.harvard.edu
Wed Mar 19 08:09:05 EDT 2014


Oops, replied to the wrong email.

To update this support thread: exp2 appears to output as expected, but even
with a clean build, a clean code directory, and the newest checkout, the
issue persists in my runs, even when using a modified data file from exp2.

Topography/boundary conditions don't appear to be the culprit, as the model
shows them as expected (i.e., depth.* and rbc_* look right). Pretty stumped.

Chris
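
One quick way to confirm that the tiles really are producing byte-identical
output is to checksum the per-tile files; in a healthy decomposed run the
checksums should differ from tile to tile. The field name and iteration
number below are illustrative only, assuming standard per-tile MDS output:

    md5sum Eta.0000000000.*.*.data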

On Mar 19, 2014 8:00 AM, "Chris Horvat" <horvat at fas.harvard.edu> wrote:

> Hey Jean-Michel, can I drop by your office today? I'll be at the noon
> seminar at MIT, but I can meet any other time. I really need to get this
> issue sorted out (i.e., find my dumb mistake).
>
> Chris
> On Mar 17, 2014 12:44 PM, "Jean-Michel Campin" <jmc at ocean.mit.edu> wrote:
>
>> Hi Chris,
>>
>> I would suggest starting with one of the simple verification experiments
>> (e.g., exp2), using a clean, freshly downloaded version of the code.
>> If the problem with MPI shows up there as well, then you could report the
>> full sequence of commands you typed and what you are getting (log files
>> and output).
>> Otherwise, we will know that the problem is in your customised set-up,
>> and we should investigate that.
>>
>> Cheers,
>> Jean-Michel
>>
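
For reference, a minimal sketch of the clean test described above, starting
from a fresh checkout; the optfile and process count are placeholders and
must match your platform and the experiment's SIZE.h:

    cd MITgcm/verification/exp2/build
    ../../../tools/genmake2 -mods=../code -mpi -of=<your_optfile>
    make depend && make
    cd ../run
    ln -s ../input/* .
    mpirun -np <nPx*nPy> ../build/mitgcmuv > output.txt 2>&1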
>> On Mon, Mar 17, 2014 at 12:07:10PM -0400, Chris Horvat wrote:
>> > Hi Jody and Martin,
>> >
>> > I was hoping that you would both be right, but I am in fact compiling
>> > with MPI turned on. STDOUT reports:
>> >
>> > nTiles =    1 ; /* Total no. tiles per process ( = nSx*nSy ) */
>> > (PID.TID 0000.0001)   nProcs =   12 ; /* Total no. processes ( = nPx*nPy ) */
>> > (PID.TID 0000.0001) nThreads =    1 ; /* Total no. threads per process ( = nTx*nTy ) */
>> > (PID.TID 0000.0001) usingMPI =    T ; /* Flag used to control whether MPI is in use */
>> >
>> >
>> > and a few lines later...
>> >
>> >
>> > ======= Starting MPI parallel Run =========
>> >
>> >
>> > I've attached STDOUT.0000 to this email. I'm also submitting jobs with
>> >
>> > mpirun -np 12 ../build/mitgcmuv > xx 2>errors.txt
>> >
>> >
>> > Any thoughts?
>> >
>> >
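
For context, those STDOUT numbers come straight from SIZE.h. An illustrative
excerpt follows; the tile sizes are made up, not Chris's actual values. For a
12-process run, nPx*nPy must equal the -np value passed to mpirun, and the
global grid is Nx = sNx*nSx*nPx by Ny = sNy*nSy*nPy:

      PARAMETER (
     &           sNx =  30,
     &           sNy =  20,
     &           OLx =   3,
     &           OLy =   3,
     &           nSx =   1,
     &           nSy =   1,
     &           nPx =   4,
     &           nPy =   3,
     &           Nx  = sNx*nSx*nPx,
     &           Ny  = sNy*nSy*nPy,
     &           Nr  =  30 )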
>> > On Mar 14, 2014 1:21 AM, "Martin Losch" <Martin.Losch at awi.de> wrote:
>> >
>> > > Chris,
>> > >
>> > > in order to compile with MPI you need to specify "-mpi" at the
>> > > genmake2 step:
>> > >
>> > > genmake2 -mpi (+all the options that you normally use)
>> > >
>> > > Martin
>> > >
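
Switching -mpi on means regenerating the Makefile and rebuilding from
scratch; roughly, with the optfile again a placeholder:

    cd build
    make CLEAN
    ../../../tools/genmake2 -mpi -mods=../code -of=<your_optfile>
    make depend && make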
>> > > On Mar 14, 2014, at 2:34 AM, Klymak Jody <jklymak at uvic.ca> wrote:
>> > >
>> > > > Hi Chris.  Did you compile with MPI turned on?  Cheers, Jody
>> > > >
>> > > > Sent from my iPhone
>> > > >
>> > > > On Mar 13, 2014, at 9:40, Chris Horvat <horvat at fas.harvard.edu> wrote:
>> > > >
>> > > >> Hi everyone,
>> > > >>
>> > > >> I've recently "upgraded" my simulations to a larger domain, and am
>> > > >> now tiling the domain and spreading execution over several CPUs.
>> > > >> At the moment, however, all of the tiles are producing exactly the
>> > > >> same output, so it appears that the model is just simulating each
>> > > >> small tile as its own channel.
>> > > >>
>> > > >> This appears to be the case even when all packages are turned off.
>> > > >> I've attached my data file and SIZE.h to this email; any
>> > > >> suggestions?
>> > > >>
>> > > >>
>> > > >> Chris
>> > > >>
>> > > >> <data>
>> > > >> <SIZE.h>