[MITgcm-support] SSH blow-up at Orlanski BC with small deltaT

Yilang Xu yxu at whoi.edu
Mon Jul 22 12:09:41 EDT 2019


Hi Martin, 

Thanks very much for your reply. Yes, I solved the problem.

I think the issue lies in viscAhGrid. For the cases with initial stratification, I kept a relatively large time step but increased viscAhGrid slightly, and SSH then behaved fine. However, this fix did not work for all cases, especially when the initial stratification was stronger.

Because the viscosity derived from viscAhGrid depends on both the time step and the grid spacing, changing deltaT is not a clean way to test stability. I therefore switched to viscC2smag, and after adjusting its value (which ended up smaller than the suggested range), I could reproduce the previous results and the runs became more stable.
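
To make the deltaT dependence concrete, here is the scaling with the numbers from the setup quoted below (viscAhGrid = 1.E-4; taking L = 1 km only as an order-of-magnitude assumption):

  viscAh = 0.25 * L**2 * viscAhGrid / deltaT
         ~ 0.25 * (1000 m)**2 * 1.E-4 / deltaT = 25 / deltaT   [m^2/s]

So deltaT = 60 s implies viscAh ~ 0.4 m^2/s, while deltaT = 2 s implies viscAh ~ 12.5 m^2/s.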

Why viscAhGrid did not work well is still a mystery to me, but for now I simply use viscC2smag. Also, given that Orlanski is not fully compatible with some other packages, a sponge layer can be a good alternative.
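
For reference, viscC2smag is the non-dimensional coefficient of the harmonic Smagorinsky viscosity, and the change amounts to something like this in &PARM01 of the data file (the value below is only a placeholder, not the one I finally settled on):

# viscAhGrid=1.E-4,
viscC2smag=2.0,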

Best,
Yilang


On 7/22/19, 10:32, "MITgcm-support on behalf of Martin Losch" <mitgcm-support-bounces at mitgcm.org on behalf of Martin.Losch at awi.de> wrote:

    Hi Yilang,
    
    did you get an answer to this? Could you figure something out yourself?
    
    The usual procedure is to turn off the suspicious code (in this case Orlanski) and see if the problem persists. With a grid spacing of 1 km you should be able to get away with time steps larger than 2 s, at least after the initial adjustment, unless you excite fast waves.
    
    Maybe you can use a sponge layer (either the obcs version or with rbcs) to damp the waves that you may have.
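
    If you go the obcs route, it would look roughly like this (I am writing the parameter names from memory and the values are only placeholders, so check the documentation): define ALLOW_OBCS_SPONGE in OBCS_OPTIONS.h and then, in data.obcs, add to your existing settings something like

    # in the existing &OBCS_PARM01 block:
     useOBCSsponge=.TRUE.,
    # and a new &OBCS_PARM03 block:
    &OBCS_PARM03
    # sponge width in grid points
     spongeThickness=10,
    # relaxation time scales in seconds (inner edge / boundary)
     Urelaxobcsinner=432000.,
     Urelaxobcsbound=43200.,
     Vrelaxobcsinner=432000.,
     Vrelaxobcsbound=43200.,
    &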
    
    M.
    
    > On 7. Jun 2019, at 00:52, Yilang Xu <yxu at whoi.edu> wrote:
    > 
    > Hi everyone, 
    >  
    > The subject line sounds crazy, but I did encounter this problem. I set up a high-resolution experiment (1-2 km grid spacing in x-y and 10 m in z) and initialized it with linear stratification. The boundaries are Orlanski, with the net flow balanced. 
    >  
    > Initially I used deltaT=60s, but the code blew up with NaNs due to the CFL condition. When I decreased deltaT to 30s, the run finished normally, but SSH increased/decreased to unreasonable values at the Orlanski boundaries. I then used deltaT=5s; SSH showed smaller anomalies but still blew up at the boundaries. 
    > It seemed that decreasing deltaT was the way to keep SSH well behaved, and this did work for some cases with other initial stratifications. So I decreased deltaT further to 2s, but then the SSH blow-ups at the boundaries were much larger than with deltaT=5s. 
    >  
    > According to this old post, http://mailman.mitgcm.org/pipermail/mitgcm-support/2013-January/008122.html , this might be related to viscAhGrid, which is already used in my model; the corresponding viscosity is calculated as: 
    > viscAh = 0.25*L**2*viscAhGrid/deltaT
    > The formula above shows that decreasing deltaT increases viscAh. What confuses me is that decreasing deltaT can give either better or worse results. I assume this is related to viscAhGrid, but other sources of SSH instability (Orlanski, etc.) may also matter. 
    >  
    > I have included the data file below. I would really appreciate any help with this problem.
    >  
    > Thanks,
    > Yilang
    >  
    >  
    > # ====================
    > # | Model parameters |
    > # ====================
    > #
    > # Continuous equation parameters
    > &PARM01
    > Tref = 50*-1.9,
    > Sref = 34.0000, 34.0612, 34.1224, 34.1837, 34.2449, 
    > 34.3061, 34.3673, 34.4286, 34.4898, 34.5510, 
    > 34.6122, 34.6735, 34.7347, 34.7959, 34.8571, 
    > 34.9184, 34.9796, 35.0408, 35.1020, 35.1633, 
    > 35.2245, 35.2857, 35.3469, 35.4082, 35.4694, 
    > 35.5306, 35.5918, 35.6531, 35.7143, 35.7755, 
    > 35.8367, 35.8980, 35.9592, 36.0204, 36.0816, 
    > 36.1429, 36.2041, 36.2653, 36.3265, 36.3878, 
    > 36.4490, 36.5102, 36.5714, 36.6327, 36.6939, 
    > 36.7551, 36.8163, 36.8776, 36.9388, 37.0000,
    > viscAz=1.E-3,
    > # viscAh=10.0,
    > viscAhGrid=1.E-4,
    > no_slip_sides=.FALSE.,
    > no_slip_bottom=.FALSE.,
    > diffKhT=0.0,
    > diffKzT=1.E-6,
    > diffKhS=0.0,
    > diffKzS=1.E-6,
    > bottomDragQuadratic=2.5E-3,
    > eosType='JMD95Z', 
    >  HeatCapacity_cp = 3974.0,
    > rhoConst=1030.,
    > rhoNil=1030.,
    > gravity=9.81,
    > convertFW2Salt = 33.4,
    > rigidLid=.FALSE.,
    > implicitFreeSurface=.TRUE.,
    > exactConserv=.TRUE.,
    > hFacMin=0.05,
    > nonHydrostatic=.FALSE.,
    > readBinaryPrec=64,
    > implicitDiffusion  = .TRUE.,
    > implicitViscosity  = .TRUE.,
    > saltAdvScheme=33,
    > saltVertAdvScheme =33,
    > tempAdvScheme=33,
    > tempVertAdvScheme=33,
    > staggerTimeStep=.TRUE.,
    > # useCDScheme = .TRUE.,
    > &
    >  
    > # Elliptic solver parameters
    > &PARM02
    > cg2dMaxIters=1000,
    > cg2dTargetResidual=1.E-13,
    > cg3dMaxIters=400,
    > cg3dTargetResidual=1.E-13,
    > &
    >  
    > # Time stepping parameters
    > &PARM03
    > nIter0=0,
    > nTimeSteps=1728000,
    > deltaT=2.0,
    > abEps=0.1,
    > cAdjFreq = 0.,
    > # tauCD = 400000.,
    > pChkptFreq=1728000.0,
    > chkptFreq=0.0,
    > dumpFreq=432000.0,
    > taveFreq=0.0,
    > monitorFreq=36000.,
    > monitorSelect=2,
    > &
    >  
    > # Gridding parameters
    > &PARM04
    > usingSphericalPolarGrid=.TRUE.,
    > ygOrigin = -80.0,
    > delXfile='dxfile.bin',
    > delYfile='dyfile.bin',
    > delZ=50*10.0,
    > &
    >  
    > # Input datasets
    > &PARM05
    > bathyFile='bathy.box',
    > # hydrogThetaFile='temp_3d_end_state.bin',
    > # hydrogSaltFile='salt_3d_end_state.bin',
    > # uVelInitFile='u_3d_end_state.bin',
    > # vVelInitFile='v_3d_end_state.bin',
    > &
    >  
    > _______________________________________________
    > MITgcm-support mailing list
    > MITgcm-support at mitgcm.org
    > http://mailman.mitgcm.org/mailman/listinfo/mitgcm-support
    