[MITgcm-support] seaice model

Chris Hill cnh at mit.edu
Wed Apr 28 09:52:03 EDT 2004


Martin,

 You could try starting from cs32x15 and then:

  1 - change the vertical levels in cs32x15 to match 32x30 (see the sketch
below), but leave everything else as for 32x15.
  2 - use the bathymetry from 32x30
  3 - use your flat hydrography to start
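
 For illustration, a minimal sketch of what step 1 might look like in the
 PARM04 namelist of the "data" file. The 30 delR values below are
 placeholders only, not the actual 32x30 layer thicknesses:

  &PARM04
   delR = 10., 10., 15., 20., 20., 25., 35., 50., 75., 100.,
          150., 200., 275., 350., 415., 450., 500., 500., 500., 500.,
          500., 500., 500., 500., 500., 500., 500., 500., 500., 500.,
  &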

 That could save you a bit of effort, and it should not be broken. I'm not
sure about the pedigree of the bathymetry in 32x30 though, so it may not
save much time.

Chris 

> -----Original Message-----
> From: mitgcm-support-bounces at mitgcm.org 
> [mailto:mitgcm-support-bounces at mitgcm.org] On Behalf Of Martin Losch
> Sent: Wednesday, April 28, 2004 9:45 AM
> To: mitgcm-support at mitgcm.org
> Subject: Re: [MITgcm-support] seaice model
> 
> Hi again,
> 
> I also tried cs32x15, which in principle was fine, but I 
> needed something with a higher vertical resolution. I had 
> hoped that I could be lazy, but I guess I'll have to create 
> something on my own )o:
> 
> Martin
> 
> On Wednesday, April 28, 2004, at 03:10 PM, Chris Hill wrote:
> 
> > Hi Martin
> >
> >  There is something weird; however, don't worry, you broke the wrong
> > experiment. cs32x32x30 is a dormant experiment, and evidently we need
> > to tidy it up or remove it.
> >
> >  Can you please try and break cs32x15 instead! That is the one that is
> > in active use by myself, Jean-Michel, Alistair, Dimitris, etc. It
> > includes input hydrography.
> >
> > Thanks,
> >
> > Chris
> >
> >> -----Original Message-----
> >> From: mitgcm-support-bounces at mitgcm.org
> >> [mailto:mitgcm-support-bounces at mitgcm.org] On Behalf Of Martin Losch
> >> Sent: Wednesday, April 28, 2004 3:56 AM
> >> To: mitgcm-support at mitgcm.org
> >> Subject: Re: [MITgcm-support] seaice model
> >>
> >> Dimitris, Jinlun,
> >>
> >> thanks for your comments, here's my reply:
> >>
> >> On Wednesday, April 28, 2004, at 02:40 AM, Dimitris Menemenlis wrote:
> >>
> >>> Hi Martin, just going over your list of questions.  It will take some
> >>> time to address them all adequately, but here are some suggestions.
> >>> Don't feel obliged to try all of them.  Just to get a discussion going.
> >>>
> >>>> I realize that this may not be a perfect domain, because it ends at
> >>>> 80 degrees N and S, and it is very coarse (4x4 degrees, 15 layers,
> >>>> 50 to 690 m), but it is just to try it out.
> >>>
> >>> There is a coarse-resolution set-up on the cubed sphere that includes
> >>> the Arctic Ocean that you can use.  It's 32x32 x 6 faces x 15 levels,
> >>> so the number of horizontal grid cells is 6144, as opposed to 3600
> >>> for the 4x4.
> >> Last night, instead of getting a good night's sleep, I tried (and
> >> managed) to get global_ocean.cs32x15 going with the seaice package.
> >> But the problems I have there are the same; plus, with the cubed
> >> sphere grid I no longer know what I am doing (o: Essentially, the
> >> same thing happens as with the lat-lon grid: ice thicknesses become
> >> very large, there is a huge freshwater flux out of the ocean, strong
> >> horizontal divergences in the ocean, large vertical velocities, and
> >> violation of the vertical CFL criterion. If I reduce the timestep
> >> even further, the model runs longer, but eventually it blows up, too
> >> (all after less than a year of integration).
> >>>
> >>> There are two flavors of ice model on this.  One is a
> >>> thermodynamics-only, Winton formulation.  That's the default package
> >>> of verification/global_ocean.cs32x15, and Jean-Michel is the expert.
> >> Haven't tried the thSIce package, yet.
> >>>
> >>> The other is the dynamic/thermodynamic pkg/seaice, which is
> >>> essentially Hibler (79-80) based, plus snow.  An example
> >>> configuration for this lives under
> >>> MITgcm_contrib/high_res_cube/README_ice; this is the package you
> >>> have been using.
> >>>
> >>> It'd be interesting to compare the two thermodynamic formulations
> >>> sometime, and also the impact, if any, of dynamics ... but this
> >>> hasn't been done in detail yet.
> >>>
> >>>> - I have to reduce the time step from 2 days (without seaice) to
> >>>> 12 h to make the model run more than a few time steps (why is that?).
> >>>
> >>> What happens if you turn the sea-ice package off and just use the
> >>> pkg/exf bulk formulae with your forcing?  Do you still need to
> >>> reduce the time step?
> >> No, in fact I started by learning how to use the exf package and was
> >> able to reproduce global_ocean.90x40x15 with it (almost) exactly,
> >> with a timestep of 172800 s (2 days). So that's fine. As soon as I
> >> start using the bulk formulae in that package, I have strong flux
> >> imbalances and my ocean warms considerably (everywhere near the
> >> surface), but it still runs for at least 1000 years. So I might have
> >> a serious problem with the fluxes even without seaice, but because
> >> they result in a strong net warming, I don't think that this should
> >> cause excessive sea-ice growth later on. Only when I use the seaice
> >> package (that is, turn it on in data.pkg) do I have to reduce the
> >> timestep to 12 h!
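> >>
> >> For reference, the package switches live in data.pkg; a minimal
> >> sketch of the relevant entries (other packages omitted, names as in
> >> the standard verification setups):
> >>
> >>  &PACKAGES
> >>   useEXF=.TRUE.,
> >>   useSEAICE=.TRUE.,
> >>  &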
> >>
> >> I have one suspicion: because of the coarse resolution, also in the
> >> vertical, vertical fluxes are poorly represented, and the dense
> >> (cold, salty) water in the top 50 m cannot be replaced by warmer,
> >> fresher water from below quickly enough, so that there is no
> >> (warming) ocean heat flux that can balance the atmospheric fluxes.
> >> The fact that implicit diffusion as the convective scheme, with a
> >> huge ivdc_kappa=100, makes the model last longest supports this
> >> suspicion. Also, the lack of convective flux may be compensated by
> >> unrealistically high vertical velocities, which then lead to the
> >> violation of the CFL criterion.
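> >>
> >> For concreteness, the implicit-diffusion convection scheme referred
> >> to above is selected in PARM01 of the "data" file, roughly like this
> >> (a sketch, not my exact settings):
> >>
> >>  &PARM01
> >>   implicitDiffusion=.TRUE.,
> >>   ivdc_kappa=100.,
> >>  &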
> >>
> >> I also tried to run the global_ocean.cs32x32x30 experiment (because
> >> of the higher vertical resolution), but that doesn't have any
> >> reasonable T/S initial-condition files, so I am not sure whether
> >> initial adjustment processes will mess up the whole thing. It will
> >> take a little while to create initial conditions from Levitus
> >> because, as I said before, with the cubed sphere I no longer know
> >> what I am doing. The other thing I might try is to design a
> >> configuration that is horizontally based on the global_ocean.90x40x15
> >> experiment but has a vertical resolution of 30 layers.
> >>
> >>>
> >>>> - the model blows up after 6-12 months because of a sudden violation
> >>>> of the vertical CFL criterion, depending on the convection scheme I
> >>>> use.
> >>>> - with KPP the model blows up fastest; convective adjustment or
> >>>> implicit diffusion make it last longer.
> >>>> - ice builds up and melts properly according to the seasons, but ice
> >>>> thicknesses are huge (?): HEFF goes up to 20 m in isolated places.
> >>>> - this is my preliminary analysis of what's going on: in winter sea
> >>>> ice builds up and there is a net freshwater flux out of the ocean
> >>>> which is huge (up to 8 m/day in monthly averages!); as a consequence,
> >>>> surface salinities go up to 500 PSU, which leads to strong horizontal
> >>>> pressure gradients, strong horizontal velocity divergences, and
> >>>> therefore to strong vertical velocities, so that eventually the
> >>>> vertical CFL criterion is violated and ... booooom. The preferred
> >>>> areas for this process are the eastern Weddell Sea (whatever is
> >>>> resolved), the Norwegian Sea, if you can call it that, and Hudson Bay
> >>>> (yes, it's in there). I guess the problem is that these areas are
> >>>> quite shallow in the model (only a few grid cells), so that the high
> >>>> salinities cannot be convected or advected away. But the main problem
> >>>> is probably related to the strong freezing rates of a few meters a
> >>>> day.
> >>>
> >>> Freezing rates of a few meters a day don't sound right.
> >>> Is your ncep_tair.bin file in deg C or K?
> >>> If the former, then you would need to add
> >>>  exf_offset_atemp=273.15
> >>> in the data.exf file.
> >>> More generally, can you check that your forcing files are roughly in
> >>> the ranges defined in pkg/exf/exf_fields.h?
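> >>>
> >>> For illustration, that offset would sit in data.exf roughly like this
> >>> (the namelist group name below is an assumption; it differs between
> >>> code versions):
> >>>
> >>>  &EXF_NML_02
> >>>   exf_offset_atemp=273.15,
> >>>  &
> >>>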
> >> I have done that before and did it again now. These are the values
> >> taken directly from the forcing field files; maybe I am not aware of
> >> a problem:
> >>
> >> air temperature (K):
> >>   Min 230.56  Max 307.53  Mean 286.28  SD 13.549
> >> specific humidity (kg/kg):
> >>   Min 0  Max 0.021874  Mean 0.0066202  SD 0.0068824
> >> downward longwave radiation (W/m^2):
> >>   Min 103.75  Max 438.37  Mean 333.2  SD 69.426
> >> downward shortwave radiation (W/m^2):
> >>   Min 8.3916e-05  Max 434.69  Mean 191.38  SD 82.57
> >> evaporation (m/s, estimated from the latent heat flux):
> >>   Min -6.5648e-09  Max 1.2229e-07  Mean 3.3601e-08  SD 2.0769e-08
> >> precipitation (m/s, corrected so that E-P = 0 over one year):
> >>   Min 2.9253e-09  Max 2.377e-07  Mean 3.5973e-08  SD 2.3988e-08
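> >>
> >> (For scale, assuming those E/P fields are indeed in m/s: a mean of
> >> 3.6e-08 m/s x 3.15e+07 s/yr is roughly 1.1 m/yr, which is a plausible
> >> global-mean precipitation.)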
> >>
> >> The freshwater flux is a bit funny; maybe I'll have to redo that
> >> again. But in last night's cubed-sphere runs, evaporation was computed
> >> by the bulk formulae and precipitation was not corrected, and I had
> >> the same problems.
> >>
> >> Jinlun wrote:
> >>> Dynamical instability may also cause 20 m thick ice. You might want
> >>> to run a thermodynamics-only case to see if the 20 m thick ice is due
> >>> to ice dynamics or thermodynamics.
> >> I am pretty sure that it is all thermodynamic growth, because 1. I
> >> have tried to turn off the dynamics (useSEAICEdynamics=.false.) and
> >> got the same problems, and 2. at 4 degrees resolution, the motion is
> >> very sluggish.
> >>
> >>>
> >>>> PS. I also tried the identical configuration on my Mac with g77 and
> >>>> got a segmentation fault in the seaice model, in one of the loops of
> >>>> the LSR solver. Clearly a compiler optimization problem, because
> >>>> reducing the optimization from -O3 to -O1 (essentially no
> >>>> optimization; the documentation on g77 is sparse) made the problem
> >>>> go away. I had a look at the code and saw a lot of divisions. I
> >>>> replaced all occurrences of /CSUICE(i,j,bi,bj) in the code by
> >>>> *RECIP_CSUICE(i,j,bi,bj) (a new field initialized to 1/CSUICE in
> >>>> seaice_init) and the segmentation fault went away. It may be worth
> >>>> thinking about changing this throughout the code (CSTICE is another
> >>>> candidate). What do you think about this? I can volunteer to do
> >>>> this, but I am afraid that it might break the results in the
> >>>> verification experiments, because the optimization is affected.
> >>>
> >>> I think that's a good idea.  Could you implement this and check it
> >>> in, or send me the modifications and I'll check them in and update
> >>> the output files accordingly, if needed.
> >> I'll do that as soon as I get around to it (probably this week).
> >> I'll introduce new arrays recip_cstice/recip_csuice and use those to
> >> replace all divisions by cstice/csuice.
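> >>
> >> Roughly what I have in mind, as a sketch (loop bounds and the
> >> zero-guard are illustrative, not the final code):
> >>
> >> C     in seaice_init.F, precompute the reciprocal once:
> >>       DO bj = myByLo(myThid), myByHi(myThid)
> >>        DO bi = myBxLo(myThid), myBxHi(myThid)
> >>         DO j = 1-OLy, sNy+OLy
> >>          DO i = 1-OLx, sNx+OLx
> >>           IF ( CSUICE(i,j,bi,bj) .NE. 0. _d 0 ) THEN
> >>            RECIP_CSUICE(i,j,bi,bj) = 1. _d 0 / CSUICE(i,j,bi,bj)
> >>           ELSE
> >>            RECIP_CSUICE(i,j,bi,bj) = 0. _d 0
> >>           ENDIF
> >>          ENDDO
> >>         ENDDO
> >>        ENDDO
> >>       ENDDO
> >> C     then, e.g. in lsr.F, every   X/CSUICE(i,j,bi,bj)
> >> C     becomes                      X*RECIP_CSUICE(i,j,bi,bj)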
> >>>
> >>>> What are the policies regarding the sea-ice model? Did you aim at
> >>>> using as little information from the main code as possible, e.g.
> >>>> grid and masking information? I noticed that the masks maskC are not
> >>>> used anywhere, so that, for example, after calling the seaice model,
> >>>> theta is no longer zero over land (it doesn't matter, but it is
> >>>> ugly), because it is not masked in growth.F.
> >>>
> >>> I agree that it's ugly.  I need to fix that.  What I really would
> >>> like to see is some value over land, like -9999, so that the fields
> >>> can be used for plotting purposes without having to load a mask
> >>> separately.  What do you think?
> >> Well, special values are nice, but I would think that, for
> >> consistency, it would be best to stick with the main model's
> >> convention of zero over land. Otherwise there's going to be a lot of
> >> confusion.
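> >>
> >> If it helps, the fix would amount to something like the following at
> >> the end of growth.F (a sketch only; masking the surface level, with
> >> illustrative loop bounds and assuming bi,bj are in scope):
> >>
> >>       DO j = 1-OLy, sNy+OLy
> >>        DO i = 1-OLx, sNx+OLx
> >> C       re-apply the ocean land mask to the surface temperature
> >>         theta(i,j,1,bi,bj) = theta(i,j,1,bi,bj)*maskC(i,j,1,bi,bj)
> >>        ENDDO
> >>       ENDDO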
> >>>
> >>>> All grid parameters are computed in seaice_init.F from the grid
> >>>> parameters of the ocean model, instead of adopting the ocean model
> >>>> grid parameters directly (I know it's a different grid, but all
> >>>> fields could still be applied properly). Is this intentional, in the
> >>>> sense that you didn't want to change the sea ice model too much or
> >>>> didn't want any interference, or is it just because it was quicker
> >>>> and less error-prone that way (which I could perfectly understand,
> >>>> because adi and lsr are difficult to read, and I wouldn't like to
> >>>> have to modify those routines)?
> >>>
> >>> A bit of both.  This is work in progress.  The eventual objective is
> >>> to rewrite the ice model for generalized curvilinear coordinates on
> >>> the C-grid, so that it matches the ocean model exactly.  This is one
> >>> of the things that Jinlun is working on.
> >> I guess that would make reading the code much simpler, especially
> >> when one is accustomed to the MITgcm conventions/naming.
> >>
> >>>
> >>> P.S. Don't use adi.  Only lsr works properly.  Maybe we should get
> >>> rid of adi altogether.
> >>>
> >>>> in subroutine growth.F: what does the variable
> >>>> WATR(1-Olx:Snx+Olx,1-Oly:Sny+Oly,Nsx,Nsy),
> >>>> defined in SEAICE.h and part of the common block /SALT_WATER/, do?
> >>>> It's an identical copy of SEAICE_SALT, as far as I can see, and it's
> >>>> not used anywhere else in the code.
> >>>
> >>> Very good point, I will remove it.
> >>
> >>
> >> Again, if anyone wants to try out the configuration based on
> >> global_ocean.90x40x15 (not that anyone should feel obliged to do it),
> >> there is an archived tar file (~4 MB) with the model configuration
> >> and forcing field files at
> >> http://mitgcm.org/~mlosch/global_with_seaice.tgz
> >>
> >> Cheers,
> >> Martin
> >>
> >>
> >>
> >> Martin Losch // mailto:mlosch at awi-bremerhaven.de 
> >> Alfred-Wegener-Institut für Polar- und Meeresforschung Postfach 
> >> 120161, 27515 Bremerhaven, Germany
> >> Tel./Fax: ++49(471)4831-1872
> >> http://www.awi-bremerhaven.de/People/show?mlosch
> >>
> >
> >
> 
> 
> _______________________________________________
> MITgcm-support mailing list
> MITgcm-support at mitgcm.org
> http://dev.mitgcm.org/mailman/listinfo/mitgcm-support
> 




