[MITgcm-devel] JFNK

Martin Losch Martin.Losch at awi.de
Wed Oct 17 12:15:58 EDT 2012


Hi Jean-Michel,

my check-ins should not affect the adjoint; the code is not even shown to TAF (in my adjoint testreport, lab_sea/build/seaice_dynsolver.f does not call seaice_jfnk, as it should not), because I do not think that it will ever be adjointable (maybe with some approximation).

Let me know if I can help. I can see recomputation problems, but in parts of the code that appear (superficially) unrelated to my modifications: dwnslp_apply, salt_plume_frac, gmredi_slope_limit, advect, seaice_lsr, exf_bulkformulae

The problem manifests itself in seaice_lsr.F, but simply reverting my recent change (the printing of ilcall) does not change anything.

Martin

On Oct 17, 2012, at 4:41 PM, Jean-Michel Campin wrote:

> Hi Martin,
> 
> I have a few (at least 4) adjoint tests that started this morning 
> but do not finish when running lab_sea (only the standard test; 
> the additional evp, noseaice and noseaicedyn tests are running fine).
> 
> Since it looks like you are the only one who checked in things to MITgcm
> yesterday, it might well be related.
> 
> The 4 tests are the 3 adjoint tests run on the old aces cluster (32-bit,
> with g77, ifort+mpi & open64) and the one adjoint test run on the new
> aces cluster (64-bit, using pgi+mpi).
> 
> I will try to narrow down the problem first.
> 
> Cheers,
> Jean-Michel
> 
> On Tue, Oct 16, 2012 at 11:37:05AM +0200, Martin Losch wrote:
>> Hi there,
>> 
>> I checked in what I have, and I am working on a better parallel version of fgmres. In the meantime, the preconditioner is giving me a headache, so the entire system is not really in the best state ... 
>> 
>> M.
>> 
>> On Oct 15, 2012, at 5:18 PM, Jean-Michel Campin wrote:
>> 
>>> Hi Martin,
>>> 
>>> This is good news. Please check in the version you have.
>>> We can figure out the multi-threaded issues later (if there are any).
>>> I don't know much about alternative parallel algorithms, but maybe others do.
>>> 
>>> Cheers,
>>> Jean-Michel
>>> 
>>> On Fri, Oct 12, 2012 at 04:43:37PM +0200, Martin Losch wrote:
>>>> Hi Jean-Michel, and others
>>>> 
>>>> I have implemented a JFNK (Jacobian-free Newton-Krylov) solver for the sea ice dynamics (following Lemieux et al., 2010, 2012). I have not run many tests yet, but I think that the serial version works.
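
The core of a JFNK solver is that the Krylov method never needs the Jacobian matrix itself, only Jacobian-times-vector products, which can be approximated by a finite difference of the residual. A minimal Python sketch of that idea (a toy residual, not the MITgcm sea-ice code; residual(), eps and the tolerances below are purely illustrative):

    # Illustrative Jacobian-free Newton-Krylov solve (toy residual, not sea ice).
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    def residual(u):
        # hypothetical nonlinear system F(u) = 0; its solution is u = (1, 2)
        return np.array([u[0]**2 + u[1] - 3.0,
                         u[0] + u[1]**2 - 5.0])

    def jfnk(u, tol=1.e-10, max_newton=20, eps=1.e-7):
        for _ in range(max_newton):
            F = residual(u)
            if np.linalg.norm(F) < tol:
                break
            # Jacobian-free product: J*v is approximated by
            # (F(u + eps*v) - F(u)) / eps, so no Jacobian matrix is ever formed.
            matvec = lambda v: (residual(u + eps*v) - F) / eps
            J = LinearOperator((u.size, u.size), matvec=matvec)
            du, info = gmres(J, -F)       # Krylov solve of J*du = -F
            u = u + du
        return u

    print(jfnk(np.array([1.0, 1.0])))     # -> approximately [1. 2.]

In the solver discussed here, the residual is the sea-ice momentum equation (Lemieux et al., 2010), and FGMRES is used instead of plain GMRES so that the preconditioner may vary between Krylov iterations.
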
>>>> 
>>>> Now I need to work on the parallel version and for that I need some help:
>>>> 1. I have an MPI version running (almost) that gives similar results for 1 or 2 CPUs, but I have not yet done the multithreading (I hope to have moved all the computation to myThid=1).
>>>> 2. The JFNK requires a (F)GMRES implementation (with reverse communication). In the routine that I have, the orthogonalization of the Krylov subspace is done with a modified Gram-Schmidt method that requires many scalar products and thus global sums. This will break any advantage that JFNK might bring. The solution appears to be to use a different orthogonalization (I have seen Householder reflections mentioned), but I have absolutely no idea about these things. So far my search for available parallel FGMRES routines has not been successful.
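
To make the cost in item 2 concrete: in modified Gram-Schmidt every scalar product over a domain-decomposed vector is one global sum, so orthogonalizing the j-th Krylov vector takes j+1 reductions. A small mpi4py sketch (an illustration, not the actual FGMRES routine; the vector sizes and the gdot() helper are invented):

    # Illustration only: modified Gram-Schmidt on distributed vectors,
    # where each scalar product becomes a global reduction (MPI_Allreduce).
    # Run e.g. with: mpirun -n 2 python mgs_demo.py
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD

    def gdot(x, y):
        # local dot product followed by a global sum over all processes
        return comm.allreduce(np.dot(x, y), op=MPI.SUM)

    def mgs(vectors):
        # orthonormalize distributed vectors; the j-th one costs j+1 global sums
        q = []
        for v in vectors:
            w = v.copy()
            for qi in q:
                w -= gdot(qi, w) * qi        # one global sum per projection
            w /= np.sqrt(gdot(w, w))         # and one more for the norm
            q.append(w)
        return q

    if __name__ == "__main__":
        rng = np.random.default_rng(seed=comm.Get_rank())
        local = [rng.standard_normal(1000) for _ in range(3)]  # local chunks
        q = mgs(local)
        check = gdot(q[0], q[1])             # collective call on every rank
        if comm.Get_rank() == 0:
            print("q0 . q1 =", check)        # ~0: orthogonal across all ranks

Classical Gram-Schmidt can bundle the projections of one step into a single reduction (usually with reorthogonalization to keep it stable), and Householder-based orthogonalization, mentioned above, is another way to reduce the number of synchronizations.
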
>>>> 
>>>> In order to continue (especially with 1.), I would like to check the present version in, but since it is not fully functional, I will probably put a stop somewhere. Are you OK with that? Or should I wait until I find a good parallel FGMRES (maybe with your help)?
>>>> 
>>>> Martin