[MITgcm-support] seaice leakage at the cpu domain boundaries

Martin Losch Martin.Losch at awi.de
Tue Jul 22 05:42:46 EDT 2014


Hi Georgy,

These stripes frequently appear with the sea ice dynamics, because the parallelization of the LSR solver is not ideal: the parallel result depends on the prescribed accuracy of the linear solve. There are a few solutions:
1. You crank up the solver accuracy (the default is 2e-4) in data.seaice:
 LSR_ERROR = 1.e-6,
2. You use a different solver. There is the EVP solver, which is usually very inaccurate and produces some noise, especially at high resolution (and very weak ice), but may be sufficient for your problem (for a discussion see Losch and Danilov, 2012, and Lemieux et al., 2012). There’s also the JFNK solver, which can be very accurate but expensive. At very high resolution it may not converge, making it even more expensive, so I am not sure if I want to recommend it (see Lemieux et al., 2012, or Losch et al., 2014 for a description). The documentation also describes these solvers: <http://mitgcm.org/public/r2_manual/latest/online_documents/node253.html>. A combined data.seaice sketch for both options follows below.
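
As a minimal sketch, assuming the standard SEAICE_PARM01 namelist group: the solver switches SEAICE_deltaTevp (EVP sub-cycling time step) and SEAICEuseJFNK are written here from memory, and each needs the corresponding CPP flag (SEAICE_ALLOW_EVP, SEAICE_ALLOW_JFNK) in SEAICE_OPTIONS.h, so please check the manual. A data.seaice along these lines:

 &SEAICE_PARM01
# Option 1: tighten the LSR convergence criterion (default is 2e-4)
 LSR_ERROR = 1.e-6,
# Option 2a: select EVP instead by setting a sub-cycling time step
# (in seconds); uncomment to use in place of LSR
# SEAICE_deltaTevp = 60.,
# Option 2b: select the JFNK solver instead
# SEAICEuseJFNK = .TRUE.,
 &

Only one of these options should be active at a time; the commented lines show where the alternatives would go.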

Martin



Jean-François Lemieux, Dana Knoll, Bruno Tremblay, David M. Holland, and Martin Losch. A comparison of the Jacobian-free Newton-Krylov method and the EVP model for solving the sea ice momentum equation with a viscous-plastic formulation: a serial algorithm study. Journal of Computational Physics, 231(17), 5926-5944, doi:10.1016/j.jcp.2012.05.024, 2012.

Martin Losch and Sergey Danilov. On solving the momentum equations of dynamic sea ice models with implicit solvers and the Elastic Viscous-Plastic technique. Ocean Modelling, 41, 42-52, doi:10.1016/j.ocemod.2011.10.002, 2012.

Martin Losch, Annika Fuchs, Jean-François Lemieux, and Anna Vanselow. A parallel Jacobian-free Newton-Krylov solver for a coupled sea ice-ocean model. Journal of Computational Physics, 257(A), 901-911, doi:10.1016/j.jcp.2013.09.026, 2014.

On Jul 21, 2014, at 9:36 PM, Georgy Manucharyan <gmanucharyan at whoi.edu> wrote:

> Hi all,
> 
> I’m starting to work with the SEAICE package with the MITgcm_c64x. In the spin-down problem that I simulate, I see melting of ice at the boundaries of the computational domains assigned to each CPU. See a snapshot of the sea-ice area here: https://www.whoi.edu/fileserver.do?id=189124&pt=2&p=197829; in the lower half of the domain there are clear ’scars’ in the sea-ice area which are collocated with the CPU-domain boundaries in the x-direction (surprisingly, there are many more of these scars in the y-direction). I’m using 6x4 CPUs with 100x100 grid boxes in each. Initial conditions were created and saved as in the gendata.m file provided with the code. I also tried different overlaps (4, 5); the issue is still there. The files used to generate an executable are here: http://www.whoi.edu/fileserver.do?id=189144&pt=2&p=197829.
> 
> I’d appreciate it if someone from the developer team could take a look at this.
> 
> Thanks in advance for your help,
> Georgy
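
For reference, the decomposition described above (a 6x4 process grid with 100x100 points per process and overlap 4) would correspond to a SIZE.h along the lines of the sketch below; the one-tile-per-process layout (nSx = nSy = 1) and the number of vertical levels Nr are assumptions, so check the linked files for the actual values:

C     Sketch of SIZE.h for 24 processes (6x4), 100x100 points each;
C     nSx, nSy, and Nr are assumed, not taken from the linked setup.
      PARAMETER (
     &           sNx = 100,
     &           sNy = 100,
     &           OLx =   4,
     &           OLy =   4,
     &           nSx =   1,
     &           nSy =   1,
     &           nPx =   6,
     &           nPy =   4,
     &           Nx  = sNx*nSx*nPx,
     &           Ny  = sNy*nSy*nPy,
     &           Nr  =  50 )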



