[MITgcm-support] MPI speed issues

Martin Losch Martin.Losch at awi.de
Sat Jun 15 03:02:26 EDT 2013


Hi Jason,

from my experience, yes, also for non-hydrostatic runs, but it also depends on the overlap. The larger OLx/y, the more overhead, and the scaling may already break down at larger tile sizes. I just ran a scaling test for the ice dynamics solver (just 2D), where I needed OLx/y=5, and there the scaling tapered off around tile sizes of 50 by 50 grid points. (That makes sense to me: the overlap adds (60*60 - 50*50)/(50*50) = 44 % extra grid points.) The same overhead applies to a tile size of 30 by 30 grid points with an overlap of OLx/y=3.
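
To make the arithmetic explicit, here is a small back-of-the-envelope helper (a sketch in Python, not part of MITgcm itself) that computes the fraction of extra grid points added by the overlap (halo) regions around a tile:

    def halo_overhead(nx, ny, olx, oly):
        """Fraction of extra grid points added by overlap regions
        of width olx/oly around an nx-by-ny tile interior."""
        interior = nx * ny
        padded = (nx + 2 * olx) * (ny + 2 * oly)
        return (padded - interior) / interior

    # The two cases above: both come out at 44 % overhead.
    print(halo_overhead(50, 50, 5, 5))  # 0.44
    print(halo_overhead(30, 30, 3, 3))  # 0.44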

Martin
On Jun 14, 2013, at 6:01 PM, Jason Goodman wrote:

> 
> On Jun 14, 2013, at 11:43 AM, Jody Klymak <jklymak at uvic.ca> wrote:
> 
>> 
>> On Jun 14, 2013, at 8:08 AM, Martin Losch <Martin.Losch at awi.de> wrote:
>> 
>>> but usually the model doesn't scale for domain sizes below 30x30 grid points.
>> 
>> That's a useful rule of thumb (says the guy who just ran 128 processors on 47x16)!
> 
> Does that rule of thumb apply for nonhydrostatic simulations with high vertical resolution?