[MITgcm-devel] heff_max...more sea ice issues

Matthew Mazloff mmazloff at MIT.EDU
Fri Dec 22 11:26:21 EST 2006


Hi Patrick, thanks for the advice. I will try C-grid plus EVP in
early January (Martin's exp #3). Right now I am running with
dynamics off. The only thing I have tried is C-grid and LSR with
LSR_ERROR = 1e-3 and SEAICE_deltaTdyn = 900 and, as I said before,
this crashed after 6 months. This setup for the LSR dynamics slowed
the model by 17%; perhaps the EVP dynamics will be faster.
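
For the record, the relevant part of my data.seaice for the LSR run
looks roughly like this (the C-grid itself is a compile-time choice,
SEAICE_CGRID in SEAICE_OPTIONS.h):

  &SEAICE_PARM01
   SEAICE_deltaTtherm = 900.,
   SEAICE_deltaTdyn   = 900.,
   LSR_ERROR          = 1.E-3,
  &

Given Martin's deltaT/20 guidance below, the January EVP test would
use SEAICE_deltaTevp of about 45 sec for my deltaT = 900.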

Thanks for the reference, Jinlun, I'll check it out.

Martin, you're right, I am using a spherical grid and my deltaT is
900. Thank you for all the info, your comments were very helpful.
Regarding your "why is the ice model slow? It's a 2D model? I don't
get it!": has anyone profiled the seaice dynamics to see where the
time (17% of the run in my case) is going? Maybe one would find some
surprises...or is it just that there are exchanges at every
iteration?

Thank you all very much for all the info, I really appreciate it,
-Matt

On Dec 22, 2006, at 5:38 AM, Patrick Heimbach wrote:

>
> Hi there,
>
> since Matt is burning up 600 procs at a time,
> there's not too much room for experimenting,
> so I would skip all the stuff having to do with LSR
> and B-grid (i.e. 1 and 2 in Martin's list) and jump
> right to C-grid plus EVP.
> I think Matt has tried SEAICE_deltaTevp = 60 sec,
> which might be too big given his timestep of
> around 1200 sec.
>
> -p.
>
>
>
> Quoting Martin Losch <Martin.Losch at awi.de>:
>
>> Hi Matt,
>>
>> I am sorry, but I will now make life more difficult for you:
>>
>> 1. LSR is an implicit scheme, BUT it does not treat the Coriolis
>> term implicitly (so it is different from the algorithm described in
>> Jinlun's paper, Zhang+Hibler (1998); rather, it uses only the first
>> two parts of that algorithm). From my experience with explicit
>> Coriolis terms, I would guess that you are limited to approximately
>> 1h (at least that's true for the ocean) for SEAICE_deltaTdyn (I
>> usually set this to deltaTmom for asynchronous time stepping); see
>> the back-of-envelope estimate after point 4. I am not sure that one
>> can take 12h time steps, but see the next point:
>>
>> 2. I don't see why changing the time step should save you any time
>> anyway: the seaice model (including the lsr or evp solver) is called
>> at every time step, and changing the time step can only affect the
>> stability (if you do an asynchronous time stepping scheme). We have
>> not (yet) implemented a scheme that calls the seaice model, or parts
>> of it, only every so often (depending on the time step). This is
>> commonly done in other models and seems to be a good way of reducing
>> the computational cost (but if the ice model is slow, why is the ice
>> model slow? It's a 2D model? I don't get it!).
>>
>> 3. The EVP solver requires a very short time step. For example, for
>> my 2deg model I use SEAICE_deltaTevp=60s, which is a 20th of
>> deltaTmom=SEAICE_deltaTdyn. According to the manual of CICE (Hunke's
>> ice model) one should use something like a 30th to a 20th of the
>> physical time step. Now EVP steps the stress evolution equations
>> forward n times (in my case 20 times) within one SEAICE_deltaTdyn
>> time step. For a large domain (and high resolution), this is usually
>> still faster than LSR, because in LSR you call the routine lsr (or
>> seaice_lsr) twice, and in each call you do two iterations, one for
>> Uice and one for Vice, so you have 4 iterations; even if each takes
>> only 5 steps, you still have 20 steps, which are more complicated
>> than the seaice_evp dynamics.
>>
>> 4. Metric terms: the dynamics include only the metric terms for the
>> spherical grid. I know that this is not exact for a general
>> orthogonal coordinate system, but as far as I know Matt is using a
>> spherical grid anyway. Including all metric terms (or finding a
>> "vector invariant" formulation of the stress) is still a "to do".
>>
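>> (A quick back-of-envelope for the 1h guess in point 1, using the
>> usual rule of thumb f*dt <~ 1 for explicitly treated Coriolis terms:
>> f = 2*Omega*sin(lat) is about 2 * 7.3e-5/s * sin(75deg) ~ 1.4e-4/s
>> at high latitudes, so dt <~ 1/f ~ 7000s, about 2h; with a safety
>> factor of 2 you end up at roughly 1h.)
>>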
>> Without knowing the details of your configuration I would recommend
>> these experiments:
>> Use SEAICE_deltaTdyn = deltaT(mom) and SEAICE_deltaTtherm =
>> deltaT(tracer) in any case (your deltaT is probably on the order of
>> 600-900 sec, isn't it?).
>> 1. Try the B-grid code (I expect that it will be even worse, but if
>> not, that's a lot of help for me in terms of debugging; I am praying
>> that this experiment will not help you (o:)
>> 2. Try LSR with higher precision, maybe even better than 1e-4 (this
>> will make your model even slower, but maybe stable).
>> 3. Try EVP with SEAICE_deltaTevp = deltaT(mom)/20 (see the
>> data.seaice sketch after this list).
>> 4. With the C-grid: turn off the ice-ocean stress; for that you'll
>> have to edit seaice_ocean_stress and comment out the lines that say
>> fu=(1-areaw)*fu + areaw*fuice etc., as sketched after this list.
>> (But note: Dimitris has found in his high-res cubed-sphere
>> experiments that using the ice-ocean stress on the C-grid is stable,
>> while on the B-grid it's not, so I personally don't think that
>> turning off the stress will help.)
>> 5. With the C-grid: try the Hibler and Bryan stress; that could be
>> even more stable (and is more physical too, but I have in the back
>> of my mind that Dimitris had problems with this in his high-res
>> runs).
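>>
>> For experiment 3 the data.seaice settings would be roughly (a
>> sketch, assuming deltaT(mom) = 900 sec as above):
>>
>>    &SEAICE_PARM01
>>     SEAICE_deltaTtherm = 900.,
>>     SEAICE_deltaTdyn   = 900.,
>>     SEAICE_deltaTevp   = 45.,
>>    &
>>
>> And for experiment 4, the lines to comment out in
>> seaice_ocean_stress look like this (from memory, so check the actual
>> file; I am writing the fv line and its area weight by analogy with
>> the fu line quoted above):
>>
>>    c     fu = (1-areaw)*fu + areaw*fuice
>>    c     fv = (1-areas)*fv + areas*fvice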
>>
>> Martin
>>
>> On 22 Dec 2006, at 01:30, Jinlun Zhang wrote:
>>
>>> Matt,
>>> This is a paper that gives a clue about the EVP time step:
>>> http://www.mrcc.uqam.ca/V_f/Publications/articles/Saucier2004_ClimDyn.pdf
>>> Jinlun
>>>
>>> Matthew Mazloff wrote:
>>>
>>>> Dimitris and Jinlun,
>>>>
>>>> Thank you very much for the info. I will try some runs (in early
>>>> January) and let you know how things work out.
>>>>
>>>> -Matt
>>>>
>>>> On Dec 21, 2006, at 6:47 PM, Jinlun Zhang wrote:
>>>>
>>>>>
>>>>>
>>>>> Matthew Mazloff wrote:
>>>>>
>>>>>> Thanks for the help...but I am a bit confused. Two things:
>>>>>>
>>>>>> 1) Re model efficiency and time stepping...I see there are 3
>>>>>> parameters. I am guessing SEAICE_deltaTtherm should be the
>>>>>> ocean dynamics time-step, as the forcing comes from this. The
>>>>>> other time stepping parameters are SEAICE_deltaTdyn and
>>>>>> SEAICE_deltaTevp, which I assume are the timesteps for the two
>>>>>> dynamic solvers (LSR and EVP) respectively. And as I understand
>>>>>> it, LSR can use the "large" timestep, but EVP should use the
>>>>>> "small" timestep...is this correct? And I am not using both at
>>>>>> the same time obviously, but you are saying I should try both
>>>>>> independently because it is not obvious which is faster.
>>>>>
>>>>>
>>>>> Correct.
>>>>>
>>>>>>
>>>>>> 2) More important than efficiency (right now anyway) is
>>>>>> stability. Jinlun, your first email seemed to suggest I try LSR
>>>>>> with a half-day time step and LSR_ERROR=1e-4, or try EVP with a
>>>>>> "small" timestep. Are either of these methods likely to be more
>>>>>> stable?
>>>>>
>>>>>
>>>>> Although we may use a half-day time step for LSR, it is better
>>>>> to use the same time step as the ocean dynamics for LSR, for
>>>>> consistency, and particularly when the code blows up. And use
>>>>> 1e-4. I would think, from the heff_max figure, that the problem
>>>>> is most likely due to the surface ocean stress calculation
>>>>> causing instability. However, you might also want to try EVP. I
>>>>> don't have much experience with EVP, but people have been
>>>>> telling me that very small time steps should be used for
>>>>> stability and for getting rid of unphysical elastic waves. I
>>>>> read one paper about a high-res (~10km) Hudson Bay simulation
>>>>> where the time step is as small as a few seconds.
>>>>>
>>>>> Jinlun
>>>>>
>>> -- 
>>>
>>> Jinlun Zhang
>>> Polar Science Center, Applied Physics Laboratory
>>> University of Washington, 1013 NE 40th St, Seattle, WA 98105-6698
>>>
>>> Phone: (206)-543-5569;  Fax: (206)-616-3142
>>> zhang at apl.washington.edu
>>> http://psc.apl.washington.edu/pscweb2002/Staff/zhang/zhang.html
>>
>
>
>
> --------------------------------------------------------
> Patrick Heimbach   Massachusetts Institute of Technology
> FON: +1/617/253-5259                  EAPS, Room 54-1518
> FAX: +1/617/253-4464             77 Massachusetts Avenue
> mailto:heimbach at mit.edu               Cambridge MA 02139
> http://www.mit.edu/~heimbach/                        USA
>