[MITgcm-support] The way to accelerate the calculation speed or to expand deltaT of MITgcm

Matthew Mazloff mmazloff at ucsd.edu
Thu Oct 28 17:01:15 EDT 2010


Hi Christopher,

My smallest level is 10m too. I do on rare occasion get
advcfl_wvel_max that reaches 1, yet my model stays stable.  And for my
domain advcfl_wvel_max is usually order 1e-2.  I suppose the set-up
and the level of stratification play a large role in suppressing
vertical motions.
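For reference, the advcfl number quoted above is just w * deltaT / dz. A minimal sketch with illustrative values (the function name and the 4e-3 m/s velocity are hypothetical, not from this thread):

```python
def vertical_cfl(w_max, delta_t, dz_min):
    """Vertical advective CFL number: w * dt / dz (stability needs ~< 1)."""
    return w_max * delta_t / dz_min

# Illustrative: 4 mm/s max vertical velocity, 500 s step, 10 m thinnest level
print(vertical_cfl(4e-3, 500.0, 10.0))  # -> 0.2
```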

I believe I am using explicit vertical advection...this is the
default, correct?
I am using implicit diffusion and viscosity...
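In the data file these choices would look roughly like the PARM01 fragment below (a sketch, not the actual file from this thread; implicitDiffusion/implicitViscosity switch on implicit vertical mixing, while the *ImplVertAdv flags, off by default, would make vertical advection implicit):

```fortran
 &PARM01
#- implicit vertical diffusion and viscosity (as used above)
 implicitDiffusion=.TRUE.,
 implicitViscosity=.TRUE.,
#- vertical advection is explicit by default; to make it implicit:
#momImplVertAdv=.TRUE.,
#tempImplVertAdv=.TRUE.,
#saltImplVertAdv=.TRUE.,
 &
```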

-Matt

On Oct 28, 2010, at 1:38 PM, Christopher L. Wolfe wrote:

>
> Hi Matt,
>
> With 72 vertical levels, how do you avoid blowing your vertical CFL
> condition? I've got only 20 levels (smallest ~10m) and a time step
> of 500 s and my vertical CFL numbers hover around 0.2. Are you using
> implicit vertical advection?
>
> Cheers,
> Christopher
>
> On Oct 28, 2010, at 11:11 AM, Matthew Mazloff wrote:
>
>> Hi Liang,
>>
>> There are several things to try.  First, though, did you try any
>> scaling experiments?  48 CPUs may actually be too many.  It obviously
>> depends on your architecture, but because of the tile overlaps, the
>> most efficient tile size on most machines is about 30 x 30.  I would
>> therefore guess that your setup would run better on ~12 CPUs, not
>> 48...
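The tile arithmetic behind that guess can be sketched as follows (the 4 x 3 and 8 x 6 process layouts are illustrative choices, not from this thread):

```python
def tile_size(nx, ny, npx, npy):
    """Per-process tile dimensions when an nx-by-ny grid is split
    across npx*npy processes (MITgcm's SIZE.h needs an even split)."""
    assert nx % npx == 0 and ny % npy == 0, "grid must divide evenly"
    return nx // npx, ny // npy

# 12 CPUs as a 4 x 3 layout: tiles close to the ~30 x 30 sweet spot
print(tile_size(104, 114, 4, 3))   # -> (26, 38)
# 48 CPUs as an 8 x 6 layout: small tiles, overlap overhead dominates
print(tile_size(104, 114, 8, 6))   # -> (13, 19)
```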
>>
>> Other things to try: limit output by setting debugLevel=0 in
>> PARM01, and change the way I/O is done by setting
>> useSingleCpuIO=.TRUE.,
>> again in PARM01.
>>
>> Also -- I have a setup running at 6 km resolution with 72 vertical
>> levels -- very similar to yours -- and I use a timestep of 1200.0
>> seconds.  Perhaps increasing your diffusivity and viscosity will
>> give you a more stable solution -- I use:
>>  viscAz =1.E-4,
>>  viscAh =1.E2,
>>  viscA4 =1.E9,
>>  diffKhT=1.E0,
>>  diffKzT=1.E-5,
>>  diffKhS=1.E0,
>>  diffKzS=1.E-5,
>>  bottomDragQuadratic=1.E-3,
>>  tempAdvScheme=30,
>>  saltAdvScheme=30,
>>
>> Still -- 30 years is a long run -- it will likely take significant  
>> time to run...
>>
>> good luck,
>> -Matt
>>
>>
>>
>>
>> On Oct 28, 2010, at 5:27 AM, Liang wrote:
>>
>>> Dear Sir:
>>>         I'm trying to use MITgcm to run an experiment. To simulate
>>> it successfully, the grid spacing must be very small. The
>>> horizontal grid is 114 x 104 points and there are 139 vertical
>>> levels. The minimum horizontal grid spacing is 5 km, and the
>>> minimum vertical spacing is 10 m. I find that the maximum deltaT
>>> (in the data file) that keeps the model stable is 60 s. The minimum
>>> simulation length needed is 30 years, so the minimum number of
>>> timesteps (nTimeSteps in the data file) is 16000000. I am now
>>> running the model on 48 CPUs, and it would need 210 days to finish
>>> the calculation. That is too long! Can you tell me how to set the
>>> model parameters, or modify the model code, to speed up the
>>> calculation or to increase deltaT?
>>> Thank You,
>>> With Regards,
>>> Liang
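The step count and runtime above follow from simple arithmetic (a sketch using only the figures quoted in this email):

```python
SECONDS_PER_DAY = 86400

delta_t = 60.0         # s, the largest stable timestep found
n_steps = 16_000_000   # nTimeSteps requested
wall_days = 210        # estimated wall-clock time on 48 CPUs

sim_years = n_steps * delta_t / (365 * SECONDS_PER_DAY)
sec_per_step = wall_days * SECONDS_PER_DAY / n_steps

print(round(sim_years, 1))     # -> 30.4 simulated years
print(round(sec_per_step, 2))  # -> 1.13 wall-clock seconds per step
```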
>>>     Some of the parameters I set are as follows:
>>>     The x and y grids are as follows:
>>>     nx=104;
>>>     ny=114;
>>>     dx(1:20)=50;
>>>     dx(21:27)=[40:-5:10];
>>>     dx(28:67)=5;
>>>     dx(68:74)=[10:5:40];
>>>     dx(75:nx)=50;
>>>     dx=dx*1000;
>>>     dy(1:38)=25;
>>>     dy(39:42)=[21:-4:9];
>>>     dy(43:92)=5;
>>>     dy(93:96)=[9:4:21];
>>>     dy(97:ny)=25;
>>>     dy=dy*1000;
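The grid arrays above are MATLAB-style; an equivalent Python sketch (added here only as a check, not part of the original setup) confirms they match the stated 114 x 104 grid:

```python
# dx: 50 km outer cells telescoping down to 5 km in the interior (km)
dx = [50]*20 + list(range(40, 9, -5)) + [5]*40 + list(range(10, 41, 5)) + [50]*30
# dy: 25 km outer cells telescoping down to 5 km in the interior (km)
dy = [25]*38 + list(range(21, 8, -4)) + [5]*50 + list(range(9, 22, 4)) + [25]*18
dx = [d * 1000 for d in dx]  # km -> m
dy = [d * 1000 for d in dy]

print(len(dx), len(dy))      # -> 104 114
print(min(dx) / 1000, "km")  # -> 5.0 km minimum spacing
```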
>>>     The settings of the data file are as follows:
>>>       &PARM01
>>>   tRef=139*20.,
>>>  sRef=139*35.,
>>>  viscAh=0.,
>>>  viscAhGrid=0.,
>>>  viscC2Leith=0.,
>>>  viscA4=0.,
>>>  useFullLeith=.TRUE.,
>>>  viscC4Leith=2.0,
>>>  viscC4Leithd=2.0,
>>>  viscAz=1.E-3,
>>>  bottomDragLinear=0.E-4,
>>>  bottomDragQuadratic=0.003,
>>>  no_slip_sides=.TRUE.,
>>>  no_slip_bottom=.TRUE.,
>>>  diffK4T=0.E4,
>>>  diffKhT=0.E-2,
>>>  diffKzT=0.E-3,
>>>  diffK4S=0.E4,
>>>  diffKhS=0.E-2,
>>>  diffKzS=0.E-5,
>>>  f0=4.97e-5,
>>>  beta=2.30e-11,
>>>  rhonil=1035.,
>>>  rhoConstFresh=1000.,
>>>  eosType = 'JMD95Z',
>>>  rigidLid=.FALSE.,
>>>  implicitFreeSurface=.TRUE.,
>>>  hFacMin=0.05,
>>>  nonHydrostatic=.TRUE.,
>>>  readBinaryPrec=64,
>>> #- not safe to use globalFiles in multi-processors runs
>>> #globalFiles=.TRUE.,
>>>  tempAdvScheme=33,
>>>  &
>>> # Elliptic solver parameters
>>>  &PARM02
>>>  cg2dMaxIters=300,
>>>  cg2dTargetResidual=1.E-13,
>>>  cg3dMaxIters=20,
>>>  cg3dTargetResidual=1.E-8,
>>>  &
>>> # Time stepping parameters
>>>  &PARM03
>>>  nIter0=0,
>>>  nTimeSteps=16000000,
>>>  deltaT=60.0,
>>>  abEps=0.01,
>>>  pChkptFreq=600000.0,
>>>  chkptFreq=600000.0,
>>>  dumpFreq=6000000.0,
>>>  tauThetaClimRelax=100000.0,
>>>  monitorFreq=60000.,
>>>  &
>>> # Gridding parameters
>>>  &PARM04
>>>  usingCartesianGrid=.TRUE.,
>>>  delXfile='dx.bin',
>>>  delYfile='dy.bin',
>>>  delZ=10*100,90,80,70,60,50,40,30,20,102*10,
>>>       20,30,40,50,60,70,80,90,11*100,
>>>  &
>>> # Input datasets
>>>  &PARM05
>>>  bathyFile='topog.slope',
>>>  hydrogThetaFile='T.init',
>>>  surfQfile='Qnet.forcing',
>>>  thetaClimFile='SST.bin',
>>>  diffKrFile='diffK.bin',
>>>  &
>>>
>>
>> _______________________________________________
>> MITgcm-support mailing list
>> MITgcm-support at mitgcm.org
>> http://mitgcm.org/mailman/listinfo/mitgcm-support
>
