[MITgcm-support] segmentation fault

Dimitris Menemenlis menemenlis at jpl.nasa.gov
Fri May 18 09:04:10 EDT 2018


Andreas, I have done something similar quite a few times (i.e., increased 
horizontal and/or vertical resolution in regional domains with obcs, 
cut out from a global set-up) and did not have the same issue.  If helpful I 
can dig out and commit to contrib some examples that you can compare 
your set-up against.  Actually, you remind me that I already promised to 
do this for Gael, but it fell off the bottom of my todo list :-(

Do you have any custom routines in your "code" directory?  Have you 
tried compiling and linking with array-bound checks turned on?
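
With the Intel compilers that would be roughly the following (a sketch only; 
the exact flags and the optfile name depend on your platform, and 
<your_optfile> below is a placeholder):

   cd build
   # add "-g -traceback -check bounds" to the Fortran flags in your optfile,
   # or pass genmake2's -devel switch if your optfile honours it
   ../tools/genmake2 -devel -mods=../code \
       -optfile=../tools/build_options/<your_optfile>
   make depend && make

A run built like that should stop and report which array and which index goes 
out of bounds, instead of just segfaulting.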

Dimitris Menemenlis
On 05/17/2018 11:45 PM, Andreas Klocker wrote:
> Matt,
> 
> I cut all the unnecessary packages and still have the same issue.
> 
> I also checked 'size mitgcmuv' and compared the size to other runs which 
> work fine - this executable asks for only about half as much memory 
> (same machine, same queue, same compiling options, etc.).
> 
> The tiles are already down to 32x32 grid points, and I'm happily running 
> configurations with a tile size almost twice as big and the same number 
> of vertical layers.
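> 
> (For what it's worth, the per-tile memory cost of a single 3-D field works 
> out roughly like this, with hypothetical overlap values:
> 
>    # hypothetical SIZE.h values: sNx = sNy = 32, OLx = OLy = 4, Nr = 150
>    # one real*8 3-D field needs (sNx+2*OLx)*(sNy+2*OLy)*Nr*8 bytes per tile
>    echo $(( (32+8)*(32+8)*150*8 ))    # -> 1920000, i.e. ~1.9 MB
> 
> so going from 42 to 150 levels multiplies the 3-D storage by roughly 3.6.)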
> 
> I will try some different tile sizes, but I think the problem must be 
> somewhere else...
> 
> Andreas
> 
> 
> 
> 
> On 18/05/18 00:35, Matthew Mazloff wrote:
>> Sounds like a memory issue. I think your executable has become too big 
>> for your machine. You will need to reduce the tile size or do something 
>> else (e.g. reduce the number of diagnostics or cut a package)
>>
>> and check
>> size mitgcmuv
>> to get a ballpark idea of how much memory you are requesting
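>>
>> For example, the output looks something like this (numbers made up, just 
>> to show the columns):
>>
>>    $ size mitgcmuv
>>       text    data        bss         dec      hex  filename
>>    1234567   89012 3456789012  3458112591 ce1e9c4f  mitgcmuv
>>
>> Since MITgcm allocates its arrays statically, most of the memory shows up 
>> in the bss column, and that is what grows with extra levels, tiles, or 
>> diagnostics.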
>>
>> Matt
>>
>>
>>
>>> On May 16, 2018, at 11:27 PM, Andreas Klocker 
>>> <andreas.klocker at utas.edu.au> wrote:
>>>
>>> Hi guys,
>>>
>>> I've taken a working 1/24 degree nested simulation (of Drake Passage)
>>> with 42 vertical layers and tried to increase the number of vertical
>>> layers to 150 (without changing anything else apart from my OBCS
>>> boundary files and, obviously, recompiling with 150 vertical layers).
>>> Now I get the following error message:
>>>
>>> forrtl: severe (174): SIGSEGV, segmentation fault occurred
>>> Image              PC                Routine           Line     Source
>>> libirc.so          00002BA1704BC2C9  Unknown           Unknown  Unknown
>>> libirc.so          00002BA1704BAB9E  Unknown           Unknown  Unknown
>>> libifcore.so.5     00002BA1722B5F3F  Unknown           Unknown  Unknown
>>> libifcore.so.5     00002BA17221DD7F  Unknown           Unknown  Unknown
>>> libifcore.so.5     00002BA17222EF43  Unknown           Unknown  Unknown
>>> libpthread.so.0    00002BA1733B27E0  Unknown           Unknown  Unknown
>>> mitgcmuv_drake24_  00000000004E61BC  mom_calc_visc_    3345     mom_calc_visc.f
>>> mitgcmuv_drake24_  0000000000415127  mom_vecinv_       3453     mom_vecinv.f
>>> mitgcmuv_drake24_  0000000000601C33  dynamics_         3426     dynamics.f
>>> mitgcmuv_drake24_  0000000000613C2B  forward_step_     2229     forward_step.f
>>> mitgcmuv_drake24_  000000000064581E  main_do_loop_     1886     main_do_loop.f
>>> mitgcmuv_drake24_  000000000065E500  the_main_loop_    1904     the_main_loop.f
>>> mitgcmuv_drake24_  000000000065E6AE  the_model_main_   2394     the_model_main.f
>>> mitgcmuv_drake24_  00000000005C6439  MAIN__            3870     main.f
>>> mitgcmuv_drake24_  0000000000406776  Unknown           Unknown  Unknown
>>> libc.so.6          00002BA1737E2D1D  Unknown           Unknown  Unknown
>>> mitgcmuv_drake24_  0000000000406669  Unknown           Unknown  Unknown
>>>
>>> At first this error pointed to a line in mom_calc_visc.f on which
>>> the Leith viscosity calculations are done. As a test I then used a
>>> Smagorinsky viscosity instead, and now it crashes with the same error,
>>> but pointing to a line where the Smagorinsky calculations are done. I
>>> assume I must be chasing a more fundamental problem than one related to
>>> these two viscosity choices... but I'm not sure what that might be.
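>>>
>>> (The exact statement it is failing on can be looked up in the
>>> preprocessed source in the build directory, e.g.
>>>
>>>    sed -n '3340,3350p' mom_calc_visc.f
>>>
>>> for the line 3345 in the traceback above.)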
>>>
>>> Has anyone got any idea of what could be going wrong here?
>>>
>>> Thanks in advance!
>>>
>>> Andreas
>>>
>>>
>>>
>>>
>>
>>
>>
> 
> -- 
> ===============================================================
> Dr. Andreas Klocker
> Physical Oceanographer
> 
> ARC Centre of Excellence for Climate System Science
> &
> Institute for Marine and Antarctic Studies
> University of Tasmania
> 20 Castray Esplanade
> Battery Point, TAS
> 7004 Australia
> 
> M:     +61 437 870 182
> W:     http://www.utas.edu.au/profiles/staff/imas/andreas-klocker
> skype: andiklocker
> ===============================================================
> 
> 
> 
> _______________________________________________
> MITgcm-support mailing list
> MITgcm-support at mitgcm.org
> http://mailman.mitgcm.org/mailman/listinfo/mitgcm-support
> 

