[MITgcm-support] mpi problem after cvs update

Jean-Michel Campin jmc at ocean.mit.edu
Sat Jan 23 12:49:35 EST 2010


Hi Michael,

One thing that could be useful to know is what was the version of
the code you updated from (or when did you do the previous update).
Otherwise, the code is tested every day, with and without MPI,
and all of last night's tests passed normally.
This could be caused by:
a) something in the build process. Did you try a full "make Clean"
before building the new executable? And could you try to run one of
the simple verification experiments with MPI?
b) something in a piece of code that we don't test (we would need to know
more about the type of set-up/packages/options you are using).

Thanks,
Jean-Michel

On Sat, Jan 23, 2010 at 11:06:58AM -0600, m. r. schaferkotter wrote:
> all;
> i did a cvs update yesterday (Jan 22), and now I'm getting these error
> messages after building and attempting to run my (moments earlier
> successful) job.
>
> schaferk:sapphire01% more r.001.err
> aborting job:
> Fatal error in MPI_Allreduce: Invalid MPI_Op, error stack:
> MPI_Allreduce(714).......: MPI_Allreduce(sbuf=0x7fffffffb28c, rbuf=0x7fffffffb2ec, count=1, dtype=0x4c00081b, MPI_SUM, MPI_COMM_WORLD) failed
> MPIR_SUM_check_dtype(388): MPI_Op MPI_SUM operation not defined for this datatype
> aborting job:
> Fatal error in MPI_Allreduce: Invalid MPI_Op, error stack:
> MPI_Allreduce(714).......: MPI_Allreduce(sbuf=0x7fffffffb28c, rbuf=0x7fffffffb2ec, count=1, dtype=0x4c00081b, MPI_SUM, MPI_COMM_WORLD) failed
>
>
> Fortunately, I moved aside the old executable, and the job runs with
> that.
>
>
> What's up with this?
>
> michael schaferkotter
>
>
> _______________________________________________
> MITgcm-support mailing list
> MITgcm-support at mitgcm.org
> http://mitgcm.org/mailman/listinfo/mitgcm-support
