[MITgcm-support] Fw: Raw binary output error when using MPI
Yu-Kun Qian
qianyk at mail3.sysu.edu.cn
Mon Aug 5 23:53:56 EDT 2024
Hi Yanhong,
You may want to try 'useSingleCpuIO=.FALSE.,' in your data file. Each core will then write its output to a separate tile file, as you described.
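For reference, the flag sits in the &PARM01 namelist of your "data" file; a minimal sketch of the change (only this one line differs from your current setup) would be:

   &PARM01
   # ... all other PARM01 entries unchanged ...
    useSingleCpuIO=.FALSE.,
   &

With nPx=48, nPy=1 and nSx=nSy=1 in your SIZE.h, each field should then be written as one file per tile, e.g. U.0000000000.001.001.data/meta through U.0000000000.048.001.data/meta, and the standard rdmds.m call rdmds('U.0000000000') should stitch those tiles back into the global array.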
Hope this helps.
—————————
Yu-Kun Qian
SCSIO, CAS
Guangzhou, China
qianyukun at scsio.ac.cn
qianyk at mail3.sysu.edu.cn
> On Aug 6, 2024, at 11:24, 赖燕红 <yhlai at pku.edu.cn> wrote:
>
>
> Dear MITgcm community,
> The newly attached files (U.*.data/meta) are example outputs, which cannot be read with rdmds.m.
> I hope these files help to track down the problem.
>
> Thanks again.
>
>
> -----Original Message-----
> From: 赖燕红 <yhlai at pku.edu.cn>
> Sent: 2024-08-06 11:05:29 (Tuesday)
> To: mitgcm-support at mitgcm.org
> Subject: Raw binary output error when using MPI
>
> Dear MITgcm community,
> I'm trying to run MITgcm on a new machine, but something seems to be wrong with the raw binary output.
> I ran my case with MPI on 48 cores. However, the output was not separated into 48 per-tile files such as *.001.001.data/meta.
> Instead, each field is written as a single *.data/meta file, and these binary output files cannot be read successfully by the provided "rdmds" code.
> I'm not sure what causes this problem. Could you help me check my configuration (presented below and also attached)? Thanks very much for your time!
> Best,
> Yanhong
> ##### SIZE.h file in code/
> & sNx = 8,
> & sNy = 16,
> & OLx = 4,
> & OLy = 4,
> & nSx = 1,
> & nSy = 1,
> & nPx = 48,
> & nPy = 1,
> & Nx = sNx*nSx*nPx,
> & Ny = sNy*nSy*nPy,
> & Nr = 53)
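> For context, a hedged reading of the values above (the filenames are the usual MITgcm tiled-output convention, not taken from this message) gives
>
>    Nx = sNx*nSx*nPx = 8*1*48 = 384
>    Ny = sNy*nSy*nPy = 16*1*1 = 16
>
> i.e. 48 tiles of 8x16 points, one per MPI rank, so a tiled run would normally write files named *.001.001.data/meta through *.048.001.data/meta.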
> ##### data file in input/
> # ====================
> # | Model parameters |
> # ====================
> # 6/8/2007: Hot Jupiter cubed sphere. Uses Adams Bashforth-3.
> # Based on the "data" file in Held-Suarez cubed-sphere test case
> # under directory input.impIGW/.
> #
> # This is for the UBERGRID
> #
> # Continuous equation parameters
> &PARM01
> tRef= 483.68 487.02 494.08 509.26
>       537.78 581.62 638.67 706.36
>       783.43 869.84 966.15 1073.26
>       1192.31 1316.09 1439.61 1574.72
>       1722.52 1884.20 2061.04 2254.49
>       2466.09 2697.55 2950.74 3227.69
>       3530.64 3862.02 4224.50 4621.00
>       5054.72 5529.15 6048.11 6615.77
>       7236.72 7915.94 8658.92 9550.06
>       10656.76 11891.71 13269.77 14807.52
>       16523.48 18438.28 20574.99 22959.30
>       25619.91 28588.85 31901.84 35598.75
>       39724.08 44327.47 49464.31 55196.43
>       61592.81 ,
> debugLevel=-1,
> sRef=53*0.0,
> viscAr=0.E1,
> viscAh=0.E6,
> viscA4=0.E17,
> no_slip_sides=.FALSE.,
> no_slip_bottom=.FALSE.,
> diffKhT=0.E3,
> diffKrT=0.,
> diffK4T=0.E17,
> diffKrS=0.E2,
> diffKhS=0.E3,
> diffK4S=0.E17,
> buoyancyRelation='ATMOSPHERIC',
> eosType='IDEALGAS',
> atm_Cp=13000.,
> rotationPeriod=66259.0,
> gravity=35.0,
> implicitFreeSurface=.true.,
> exactConserv=.true.,
> # rigidLid = .true.,
> nonlinFreeSurf=4,
> select_rStar=2,
> hFacInf=0.2,
> hFacSup=2.0,
> momForcing=.true.,
> tempForcing=.true.,
> saltStepping=.false.,
> # tempAdvScheme=77.,
> saltAdvection=.FALSE.,
> saltForcing=.FALSE.,
> saltStepping=.FALSE.,
> momViscosity=.FALSE.,
> useSingleCpuIO=.TRUE.,
> uniformLin_PhiSurf=.FALSE.,
> vectorInvariantMomentum=.TRUE.,
> useAbsVorticity=.TRUE.,
> SadournyCoriolis=.TRUE.,
> selectKEscheme=3,
> staggerTimeStep=.TRUE.,
> implicitIntGravWave=.FALSE.,
> readBinaryPrec=64,
> writeBinaryPrec=64,
> &
>
> # Elliptic solver parameters
> &PARM02
> cg2dMaxIters=200,
> #cg2dTargetResidual=1.E-12,
> cg2dTargetResWunit=1.E-17,
> cg3dMaxIters=40,
> cg3dTargetResidual=1.E-4,
> &
>
> # Time stepping parameters
> &PARM03
> deltaT=25,
> nIter0=0,
> nTimeSteps=10000000,
> tracforcingoutab = 1,
> abeps = 0.1,
> #forcing_In_AB=.FALSE.,
> #alph_AB=0.6,
> #beta_AB=0.,
> # alph_AB=0.5,
> # beta_AB=0.281105,
> # doAB_onGtGs=.FALSE.,
> #startFromPickupAB2=.TRUE.,
> pChkptFreq=5.d6,
> chkptFreq =1.d8,
> dumpFreq =5.d6,
> monitorFreq=864000000.,
> taveFreq=864000000.,
> &
>
> # Gridding parameters
> &PARM04
> usingCurvilinearGrid=.TRUE.,
> delR= 18484121.57857943 12789742.40305100 8849623.17743921 6123331.33183253
>       4236924.65177391 2931660.81206276 2028507.89744068 1403588.12078398
>       971186.56293696 671994.38785680 464973.95510233 321730.03648593
>       222615.08465448 134736.42286678 98428.61557291 71904.77643286
>       52528.39170566 38373.41651093 28032.82276321 20478.73823926
>       14960.27436890 10928.88666176 7983.84847231 5832.41810457
>       4260.73980043 3112.58612148 2273.82868173 1661.09359615
>       1213.47397777 886.47569176 647.59456442 473.08541427
>       345.60174141 252.47145666 184.43725477 159.35396547
>       108.56659285 73.96555867 50.39214851 34.33177112
>       23.38996338 15.93539655 10.85665928 7.39655587
>       5.03921485 3.43317711 2.33899634 1.59353965
>       1.08566593 0.73965559 0.50392149 0.34331771
>       0.23389963 ,
> radius_fromHorizGrid=6370.E3,
> Ro_SeaLevel=6.d7,
> rSphere=93326848,
> &
>
> # Input datasets
> &PARM05
> &
>
> <data>
> <SIZE.h>
> <U.0000000000.data>
> <U.0000000000.meta>
> _______________________________________________
> MITgcm-support mailing list
> MITgcm-support at mitgcm.org
> http://mailman.mitgcm.org/mailman/listinfo/mitgcm-support