[MITgcm-devel] time interpolation in exf_set_gen

Jean-Michel Campin jmc at ocean.mit.edu
Thu May 17 21:14:39 EDT 2007


Hi Patrick,

I think I found the source of the differences between
a) runoffperiod = 0, no time interpolation, and
b) runoffperiod = 1 month, with time interpolation and 12
 identical records in runoffFile.

It's a machine truncation problem, and I think it might be
worth changing the current version:
 genfld = fac * gen0 + (exf_one - fac) * gen1
to:
 genfld = gen1 + fac * ( gen0 - gen1 )

The reason is that fac + (exf_one - fac) need not be exactly equal
to 1 in floating-point arithmetic, and for many fields gen0 and gen1
at a given point have the same magnitude (think of SWdown, LWdown,
airTemp), so the difference gen0 - gen1 is often much smaller than
gen0 or gen1; the second form therefore minimizes the truncation
error that we get on "fac".
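
As a quick illustration, here is a small stand-alone Fortran sketch
(not MITgcm code; the program, the variable names borrowed from
exf_set_gen, and the test value 287.65 are just made up for the demo).
It compares the two forms in the degenerate case gen0 = gen1, where
the exact answer is gen0 itself:

      PROGRAM INTERPDEMO
C     Compare the two time-interpolation forms when both records are
C     identical (the 12-identical-months case); the exact result is
C     then gen0 itself, whatever the weight fac is.
      IMPLICIT NONE
      INTEGER i
      REAL*8 gen0, gen1, fac, exf_one, fldA, fldB, errA, errB
      exf_one = 1.D0
C     arbitrary test value, standing in for any forcing field
      gen0 = 287.65D0
      gen1 = gen0
      errA = 0.D0
      errB = 0.D0
      DO i = 1, 999
        fac  = DBLE(i)/1000.D0
C       current form
        fldA = fac*gen0 + (exf_one - fac)*gen1
C       proposed form
        fldB = gen1 + fac*( gen0 - gen1 )
        errA = MAX( errA, ABS(fldA-gen0) )
        errB = MAX( errB, ABS(fldB-gen0) )
      ENDDO
      WRITE(*,*) 'max error, current  form:', errA
      WRITE(*,*) 'max error, proposed form:', errB
      END

The proposed form reproduces gen0 exactly because gen0 - gen1 is
exactly zero, while the current form typically picks up a rounding
error of a few ULPs for some values of fac.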

For this runoff problem (and I guess it's not specific to the runoff
but applies to any field when 12 identical months are compared
to a single-record case, period = 0), this change removes the differences.
It might also reduce the platform/compiler sensitivity of the output?

Cheers,
Jean-Michel


