[MITgcm-support] exch2 with lat lon grid
Dimitris Menemenlis
dmenemenlis at gmail.com
Wed Dec 31 13:12:32 EST 2014
Matt, are you writing to individual files for each tile?
Probably not useful in your specific case, but for the record: there is a set of
MPI-IO routines, developed for the hi-res LLC simulations, that are very efficient
on Lustre file systems: MITgcm_contrib/llc_hires/llc_1080/code-async/
Cheers, Dimitris
> On Dec 29, 2014, at 6:27 AM, Jean-Michel Campin <jmc at ocean.mit.edu> wrote:
>
> Hi Matt,
>
> In your case (using pkg/exch2 with blank-tiles), the domain tile mapping
> is set according to tile-size (from SIZE.h) and "dimsFacets=" from data.exch2;
>
> So, if the total number of tiles ( nSx*nPx*nSy*nPy ) is not a prime number,
> maybe you can split it between nSx*nPx and nSy*nPy
> so as to keep both of these numbers < 1000 ?
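Jean-Michel's suggestion amounts to factoring the total tile count into two factors, each below 1000. A small sketch of that search (Python stand-in; `split_tiles` and the 999 limit mirror the I3.3 constraint discussed below, but the helper name is hypothetical):

```python
def split_tiles(n_tiles, limit=999):
    """Find a factor pair (a, b) with a*b == n_tiles and both <= limit.

    'a' would play the role of nSx*nPx and 'b' of nSy*nPy; returns None
    if no such split exists (e.g. n_tiles is a prime larger than limit).
    """
    # Search downward from sqrt(n_tiles) so the pair is as balanced as possible.
    for a in range(int(n_tiles ** 0.5), 0, -1):
        if n_tiles % a == 0:
            b = n_tiles // a
            if a <= limit and b <= limit:
                return a, b
    return None

# Example: 1200 tiles cannot use nPy = 1200 (> 999), but a 30 x 40 split fits.
print(split_tiles(1200))  # -> (30, 40)
```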
>
> Otherwise, we will need to change this "I3.3" to bypass this limitation.
>
> Cheers,
> Jean-Michel
>
> On Sat, Dec 27, 2014 at 10:43:40AM -0800, Matthew Mazloff wrote:
>> Hello
>>
>> EXCH2 package works great!
>>
>> Bringing me to my next question:
>>
>> Is there a trick for using more than 999 cores? I assume setting nPy > 999 is an issue, since the mdsio routines have statements like:
>>
>> WRITE(dataFName,'(2A,I3.3,A,I3.3,A)')
>> & pfName(1:pIL),'.',iG,'.',jG,'.data'
>>
>> Is there an easy fix (e.g. to ini_procs.F) for this issue?
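For reference, Fortran's I3.3 edit descriptor writes a zero-padded 3-digit integer and fills the field with asterisks once the value no longer fits, which is what breaks the filenames past 999. A Python stand-in for the Fortran WRITE above (function name and the I4.4 alternative are illustrative, not the actual fix adopted):

```python
def mds_fname(pfname, ig, jg, width=3):
    """Mimic the Fortran '(2A,I3.3,A,I3.3,A)' format: zero-padded,
    fixed-width tile indices. Fortran emits asterisks when an index
    no longer fits in `width` digits, producing an unusable filename."""
    def fmt(n):
        s = f"{n:0{width}d}"
        # Fortran field-overflow behavior: the whole field becomes '*'s.
        return s if len(s) <= width else "*" * width
    return f"{pfname}.{fmt(ig)}.{fmt(jg)}.data"

print(mds_fname("T", 1, 1000))           # T.001.***.data  (broken past 999)
print(mds_fname("T", 1, 1000, width=4))  # T.0001.1000.data (e.g. with I4.4)
```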
>>
>> Thanks!
>> Matt
>>
>>
>>
>> On Dec 21, 2014, at 1:00 PM, Matthew Mazloff <mmazloff at ucsd.edu> wrote:
>>
>>> Hi Patrick
>>>
>>> I saw that but it seemed too simple -- glad to hear that works! So to confirm: there is no need to set any fancy topology files; the trick is just to set
>>> nPy = Number of cores needed,
>>>
>>> and then in data.exch2
>>> W2_mapIO = 1,
>>> so the code knows it was nPy that was set to "Number of cores needed"
>>> and then:
>>> dimsFacets = NX, NY,
>>> so it can rebuild the grid.
>>>
>>> and that is it?
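As a concrete sketch, the recap above would correspond to a data.exch2 along these lines (namelist group name as in MITgcm's pkg/exch2; NX, NY are placeholders for the facet dimensions, and nPy is set in SIZE.h to the number of cores needed):

```
 &W2_EXCH2_PARM01
  W2_mapIO   = 1,
  dimsFacets = NX, NY,
 &
```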
>>>
>>> I'll give it a try -- sounds great!
>>>
>>> Thanks
>>> Matt
>>>
>>>
>>>
>>>
>>>
>>>
>>> On Dec 21, 2014, at 12:09 PM, Patrick Heimbach <heimbach at mit.edu> wrote:
>>>
>>>> Hi Matt,
>>>>
>>>> yes, a simple example is the parallel version of verification/global_ocean.90x40x15/
>>>> i.e. take a look at
>>>> code/SIZE.h_mpi
>>>> input/data.exch2.mpi
>>>>
>>>> p.
>>>>
>>>> On Dec 21, 2014, at 2:54 PM, Matthew Mazloff <mmazloff at ucsd.edu> wrote:
>>>>
>>>>> Hello
>>>>>
>>>>> I have a lat-lon grid with a good number of blank tiles. Is it possible to use exch2 such that I can request fewer cores? Is there an example of how to implement this for a simple grid?
>>>>>
>>>>> Thanks!
>>>>> Matt
>>>>>
>>>>>
>>>>> _______________________________________________
>>>>> MITgcm-support mailing list
>>>>> MITgcm-support at mitgcm.org
>>>>> http://mitgcm.org/mailman/listinfo/mitgcm-support
>>>>
>>>>
>>>> ---
>>>> Patrick Heimbach | heimbach at mit.edu | http://www.mit.edu/~heimbach
>>>> MIT | EAPS 54-1420 | 77 Massachusetts Ave | Cambridge MA 02139 USA
>>>> FON +1-617-253-5259 | FAX +1-617-253-4464 | SKYPE patrick.heimbach
>>>>
>>>
>>>
>>
>
>
>