[MITgcm-devel] exf: u/v-wind interpolation and cubed sphere

Patrick Heimbach heimbach at MIT.EDU
Fri Jun 23 10:49:03 EDT 2006


Martin,

I haven't had time to look into this,
but I second your suspicion that something is wrong at the pole.

The Arctic adjoint shows weird behaviour right over the pole;
everywhere else the gradients are well behaved
(unfortunately this eventually propagates "outwards").
I haven't found time to look into the routines yet.
I had a similar thought re. discarding the wind velocities near the pole.

More l8r...

-p.



On Fri, 2006-06-23 at 10:29, Martin Losch wrote:
> Since I seem to be the only one in this discussion, I have to make
> sure that I keep it alive (o:
> 
> The more I think about the code in exf_set_uv.F that deals with the
> north pole, the more I think it's wrong (I need to be a bit provocative
> here in order to get more participants). I have a better (but more
> expensive) suggestion for dealing with the North Pole. Its only
> restriction is that the data grid is a regular lat-lon grid with no
> wind-velocity grid point at 90N:
> 
> 1. Do the interpolation and rotation of vectors as is (without the
> offset business, that is, offset=0 always, or file version 1.11).
> 2. Discard all wind velocities on the model grid that are within the
> northernmost latitude circle of the data grid (yG/C > data_lat
> (end)), leaving a gap of radius 90deg-data_lat(end) around the north
> pole.
> 3. Exchange the u/vwind fields on the model grid.
> 4. Interpolate the discarded values from the model grid, e.g. using
> some radius of influence that is large enough that values from all
> sides of the gap are used, e.g. twice or 2.5 times the radius of the gap.
> The advantage is that we don't have to worry about the direction and
> magnitude of the wind field, because outside this gap everything is
> okay anyway. The disadvantage is that for 4. we need global indices
> to access grid points from neighboring tiles, but that's not a
> problem in principle, is it? E.g., for the high-res cubed sphere the
> north pole is at the corner of 4 adjacent tiles.
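[Editorial sketch: step 4 above, the gap fill, could look roughly like the following minimal Python illustration. This is not MITgcm Fortran; the function names, the great-circle helper, and the 2.5x radius factor are assumptions made up for illustration only.]

```python
import numpy as np

def great_circle(lat1, lon1, lat2, lon2):
    """Great-circle distance in degrees of arc between two points."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    c = (np.sin(lat1) * np.sin(lat2)
         + np.cos(lat1) * np.cos(lat2) * np.cos(lon1 - lon2))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def fill_polar_gap(lat, lon, u, valid, data_lat_max, factor=2.5):
    """Replace u at invalid (gap) points by an inverse-distance
    average of valid points within factor*(90-data_lat_max) degrees."""
    radius = factor * (90.0 - data_lat_max)
    u = u.copy()
    for i in np.where(~valid)[0]:
        d = great_circle(lat[i], lon[i], lat[valid], lon[valid])
        use = d <= radius
        w = 1.0 / np.maximum(d[use], 1e-12)   # inverse-distance weights
        u[i] = np.sum(w * u[valid][use]) / np.sum(w)
    return u

# Toy example: four valid points at 85N surround one gap point at 89N.
lat = np.array([89.0, 85.0, 85.0, 85.0, 85.0])
lon = np.array([0.0, 0.0, 90.0, 180.0, 270.0])
u = np.array([0.0, 1.0, 1.0, 1.0, 1.0])
valid = np.array([False, True, True, True, True])
u_filled = fill_polar_gap(lat, lon, u, valid, data_lat_max=86.0)
```

Since the valid values surrounding the gap here are all 1.0, the filled point comes out as 1.0 as well; in the real cubed-sphere case the scattered points would come from the neighboring tiles around the corner point.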
> 
> Does this sound feasible?
> 
> Martin
> 
> On Jun 22, 2006, at 2:52 PM, Martin Losch wrote:
> 
> > Hi again,
> >
> > to follow up on this:
> >
> > I commented out the lines that reset the variable "offset=2" in the
> > case when yC is within dyC of the North Pole, so that offset=0
> > always (effectively undoing the latest changes to this file), and I
> > don't get these complaints from exf_check_range.
> >
> > I have to admit that I don't understand how this offset thing
> > works, but why is the offset 2 when the wind is rotated near the
> > pole? What's happening is that the wind at (i,j) is replaced by the
> > wind at (i-offset,j-offset), which in at least one case means that
> > all the information about wind speed and direction is taken from
> > across the pole. Is that correct/intended?
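[Editorial sketch: one way to see why taking u/v from across the pole is suspect. The same physical wind vector, expressed in zonal/meridional components, flips sign between two points facing each other across the pole, so copying components without re-rotating them reverses the apparent wind direction. Minimal Python illustration, not MITgcm code; all names are made up.]

```python
import numpy as np

def local_uv(V, lat_deg, lon_deg):
    """Project a fixed Cartesian wind V onto the local east/north
    unit vectors at (lat, lon)."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    e_east = np.array([-np.sin(lon), np.cos(lon), 0.0])
    e_north = np.array([-np.sin(lat) * np.cos(lon),
                        -np.sin(lat) * np.sin(lon),
                        np.cos(lat)])
    return V @ e_east, V @ e_north

V = np.array([0.0, 1.0, 0.0])       # one fixed physical wind vector
u0, v0 = local_uv(V, 89.9, 0.0)     # just short of the pole, lon 0
u1, v1 = local_uv(V, 89.9, 180.0)   # facing point across the pole
# u0 = +1 while u1 = -1: the zonal component flips sign across the pole.
```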
> >
> > Martin
> >
> > On Jun 22, 2006, at 11:28 AM, Martin Losch wrote:
> >
> >> Hi Dimitris,
> >>
> >> for benchmarking I successfully ran the high_res cubed sphere on
> >> our XD1 with 27 CPUs (it's the ss216t_85x85 configuration), with
> >> monthly forcing (the *194_92_12.bin files). We have moved the
> >> entire configuration to an IBM p690, reran the identical
> >> experiment, and got this:
> >>> STOP in S/R exf_check_range
> >>> STOP in S/R exf_check_range
> >>> ERROR: 0032-184 MPI was not finalized  in routine unknown, task 10
> >>> ERROR: 0032-184 MPI was not finalized  in routine unknown, task 11
> >> We didn't save the output in STDOUT.0021 and STDOUT.0023, but
> >> apparently the u/vwinds are wrong (order 10e11) at i=85,j=1 for
> >> one tile (I think it was tile 96), and consequently wrong at
> >> i=85,j=86 for another tile because of the exchanges, and also at
> >> i=0,j=1 and i=1,j=1 on different tiles; this is reproducible (on
> >> the p690 in Hannover). Task 10 owns tiles 81-88, task 11 tiles
> >> 89-96; I cannot be sure, but I suspect that the grid points that
> >> cause the problem are the four points around the north pole?
> >>
> >> [BTW, should I change exf_check_range so that the STOP command is
> >> executed *after* the bi,bj-i,j-loops have finished? That way we
> >> would get the numbers of more than one grid point before the
> >> model stops. Works nicely for me.]
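[Editorial sketch: the suggested report-all-then-stop pattern, in a minimal Python illustration. This is not the actual exf_check_range code; the function name and the toy field are made up.]

```python
import numpy as np

def check_range(field, lo, hi, name="field"):
    """Collect every out-of-range point instead of stopping at the
    first one; the caller aborts once, after the full sweep."""
    bad = [(idx, float(v)) for idx, v in np.ndenumerate(field)
           if not (lo <= v <= hi)]
    for idx, v in bad:
        print(f"{name}{idx} = {v:g} outside [{lo:g}, {hi:g}]")
    return bad

# Toy field with two bad points; both are reported before any stop.
field = np.array([[0.5, 2.0e11], [-3.0e11, 0.1]])
bad = check_range(field, -100.0, 100.0, name="uwind")
# The model would issue its single STOP here, if bad is non-empty.
```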
> >>
> >> When we do not read uwind/vwind (that is, when these fields are  
> >> zero), the problem goes away.
> >>
> >> I suspect a problem in the interpolation routines, in particular  
> >> lines such as
> >>>                     if ( yC(i,j,bi,bj) .gt.
> >>>      &                  (90-dyC(i,j,bi,bj)*8.9946e-06) )
> >>>      &                  offset=2
> >> But that's just a wild guess. Do you have any ideas?
> >>
> >> Martin
> >>
> >> PS: Unfortunately I cannot reproduce this with a simpler cs-setup.
> >>
> >>
> >> _______________________________________________
> >> MITgcm-devel mailing list
> >> MITgcm-devel at mitgcm.org
> >> http://mitgcm.org/mailman/listinfo/mitgcm-devel
> >
-- 
--------------------------------------------------------
Patrick Heimbach   Massachusetts Institute of Technology
FON: +1/617/253-5259                  EAPS, Room 54-1518
FAX: +1/617/253-4464             77 Massachusetts Avenue
mailto:heimbach at mit.edu               Cambridge MA 02139
http://www.mit.edu/~heimbach/                        USA



