Originally Posted by mabinty
(Post 224013)
Dear all!!
I changed the procedure for parallelising chtMultiRegionFoam described in the previous post a bit:
3) use the 0.001/air directory as the initial directory 0/
4) run "decomposePar" => each processor<n> now has a 0/ and a constant/ directory
5) put processor<n>/constant/polyMesh into processor<n>/0/air and heater/ into processor<n>/0/, and copy the system/ directory into each processor<n>/ directory
6) run "mpirun.openmpi -np 4 chtMultiRegionFoam -parallel"
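For reference, steps 3)-6) can be sketched as a shell session. The region names (air, heater) and time directory (0.001) are taken from my case; decomposePar and chtMultiRegionFoam are OpenFOAM tools, so the toy version below only fakes the layout they would produce and exercises the file shuffling of steps 3) and 5). Where exactly heater/ comes from is an assumption, see the comments:

```shell
#!/bin/sh
# Sketch of steps 3)-6) above. decomposePar and the solver are not run here;
# their directory layout is faked so the file moves can be tried standalone.
set -e
case_dir=$(mktemp -d)
cd "$case_dir"

# stand-in for the serial result and for what "decomposePar" (step 4) leaves:
mkdir -p 0.001/air system
for n in 0 1 2 3; do
    mkdir -p processor$n/0 processor$n/constant/polyMesh processor$n/constant/heater
done

# step 3: use the last serial time directory as the initial 0/ directory
# (copied as 0/air here -- an assumption about the intended layout)
mkdir -p 0
cp -r 0.001/air 0/air

# step 5: move the decomposed mesh under 0/air, the solid region under 0/
# (heater/ is assumed to come from processor<n>/constant/), and give every
# processor directory its own copy of system/
for proc in processor*; do
    mkdir -p "$proc/0/air"
    mv "$proc/constant/polyMesh" "$proc/0/air/polyMesh"
    mv "$proc/constant/heater"   "$proc/0/heater"
    cp -r system "$proc/system"
done

# step 6 (not run in this sketch):
# mpirun.openmpi -np 4 chtMultiRegionFoam -parallel
```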
The simulation runs but aborts during the first time step of the solid calculation:
*******************************************************
Region: air Courant Number mean: 0 max: 0
Region: air Courant Number mean: 0 max: 0
deltaT = 0.001199041
Time = 0.00119904
Solving for fluid region air
diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for h, Initial residual = 1, Final residual = 3.867589e-07, No Iterations 15
Min/max T:min(T) [0 0 0 1 0 0 0] 100 max(T) [0 0 0 1 0 0 0] 632.0841
GAMG: Solving for pd, Initial residual = 1, Final residual = 0.08695774, No Iterations 3
GAMG: Solving for pd, Initial residual = 0.03317856, Final residual = 0.002205762, No Iterations 3
diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors (air): sum local = 0.0002709278, global = -2.734944e-05, cumulative = -2.734944e-05
GAMG: Solving for pd, Initial residual = 0.6863633, Final residual = 0.03345041, No Iterations 3
GAMG: Solving for pd, Initial residual = 0.0922989, Final residual = 8.175851e-07, No Iterations 13
diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors (air): sum local = 6.847878e-06, global = 9.906124e-08, cumulative = -2.725038e-05
Solving for solid region heater
DICPCG: Solving for T, Initial residual = 1, Final residual = 3.015545e-07, No Iterations 1
3 additional processes aborted (not shown)
******************************************************************
with the following error message:
*******************************************************************
aa@lws16:~/OpenFOAM/aa-1.5.x/run/chtMultiRegionFoam/simpleRegionHeaterParall02$ mpirun.openmpi -np 4 chtMultiRegionFoam -parallel > log.chtMultiRegionFoam
[0] #0 Foam::error::printStack(Foam::Ostream&)[3] #0 Foam::error::printStack(Foam::Ostream&) in "/home/aa/OpenFOAM/O in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/lib/linux64GccDPOpt/libOpenFOAM.so"
[0] #1 Foam::sigSegv::sigSegvHandler(int)penFOAM-1.5.x/lib/linux64GccDPOpt/libOpenFOAM.so"
[3] #1 Foam::sigSegv::sigSegvHandler(int) in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/lib/linux64GccDPOpt/libOpenFOAM.so"
[0] #2 in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/lib/linux64GccDPOpt/libOpenFOAM.so"
[3] #2 ???? in "/lib/l in "/lib/libc.so.6"
[0] #3 ibc.so.6"
[3] #3 Foam::tmp<Foam::Field<Foam::typeOfSum<double, double>::type> > Foam::operator+<double, double>(Foam::UList<double> const&, Foam::UList<double> const&) in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[3] #4 Foam::tmp<Foam::Field<Foam::typeOfSum<double, double>::type> > Foam::operator+<double, double>(Foam::UList<double> const&, Foam::UList<double> const&) in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[0] #4 Foam::solidWallMixedTemperatureCoupledFvPatchScalarField::evaluate(Foam::Pstream::commsTypes) in "/home/aa/OpenFOAMFoam::solidWallMixedTemperatureCoupledFvPatchScalarField::evaluate(Foam::Pstream::commsTypes)/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[3] #5 Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>::GeometricBoundaryField::evaluate() in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[0] #5 Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>::GeometricBoundaryField::evaluate() in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[3] #6 Foam::fvMatrix<double>::solve(Foam::Istream&) in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[0] #6 Foam::fvMatrix<double>::solve(Foam::Istream&) in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/lib/linux64GccDPOpt/libfiniteVolume.so"
[0] #7 in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/lib/linux64GccDPOpt/libfiniteVolume.so"
[3] #7 Foam::lduMatrix::solverPerformance Foam::solve<double>(Foam::tmp<Foam::fvMatrix<double> > const&) in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[0] #8 Foam::lduMatrix::solverPerformance Foam::solve<double>(Foam::tmp<Foam::fvMatrix<double> > const&)main in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[3] #8 in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[0] #9 __libc_start_main in "/lib/libc.so.6"
[0] #10 main in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[3] #9 __libc_start_main in "/lib/libc.so.6"
[3] #10 _start in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[lws16:06857] *** Process received signal ***
[lws16:06857] Signal: Segmentation fault (11)
[lws16:06857] Signal code: (-6)
[lws16:06857] Failing at address: 0x3e800001ac9
[lws16:06857] [ 0] /lib/libc.so.6 [0x7f5b913ab0a0]
[lws16:06857] [ 1] /lib/libc.so.6(gsignal+0x35) [0x7f5b913ab015]
[lws16:06857] [ 2] /lib/libc.so.6 [0x7f5b913ab0a0]
[lws16:06857] [ 3] chtMultiRegionFoam(_ZN4FoamplIddEENS_3tmpINS_5FieldINS_9typeOfSumIT_T0_E4typeEEEEERKNS_5UListIS4_EERKNSA_IS5_EE+0x68) [0x435108]
[lws16:06857] [ 4] chtMultiRegionFoam [0x431f78]
[lws16:06857] [ 5] chtMultiRegionFoam(_ZN4Foam14GeometricFieldIdNS_12fvPatchFieldENS_7volMeshEE22GeometricBoundaryField8evaluateEv+0xc6) [0x44ed26]
[lws16:06857] [ 6] /home/aa/OpenFOAM/OpenFOAM-1.5.x/lib/linux64GccDPOpt/libfiniteVolume.so(_ZN4Foam8fvMatrixIdE5solveERNS_7IstreamE+0x19f) [0x7f5b933a1dcf]
[lws16:06857] [ 7] chtMultiRegionFoam(_ZN4Foam5solveIdEENS_9lduMatrix17solverPerformanceERKNS_3tmpINS_8fvMatrixIT_EEEE+0x50) [0x44a2e0]
[lws16:06857] [ 8] chtMultiRegionFoam [0x440eec]
[lws16:06857] [ 9] /lib/libc.so.6(__libc_start_main+0xe6) [0x7f5b91396466]
[lws16:06857] [10] chtMultiRegionFoam [0x41c019]
[lws16:06857] *** End of error message ***
mpirun.openmpi noticed that job rank 0 with PID 6854 on node lws16 exited on signal 15 (Terminated).
******************************************************
When I decompose the domain into 4 subdomains, subdomains 0 and 3 have no interface with the solid region. With 2 subdomains, where both share patches with the solid region, the calculation aborts at the same time as before, but the error message looks different:
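As a quick check of which subdomains actually own faces of the coupled patch after decomposition, the nFaces entry in each processor's boundary file can be grepped. The patch name heater_to_air and the file locations below are assumptions, substitute the names from your own case; the snippet writes two mock boundary files so it can be run outside OpenFOAM:

```shell
#!/bin/sh
# Report how many faces of the solid/fluid coupled patch each processor owns.
# "heater_to_air" and the boundary-file paths are assumed names; a processor
# showing "nFaces 0" has no interface with the solid region. Mock files stand
# in for the real decomposed case here.
set -e
cd "$(mktemp -d)"
mkdir -p processor0/constant/polyMesh processor1/constant/polyMesh
printf 'heater_to_air\n{\n    nFaces 12;\n}\n' > processor0/constant/polyMesh/boundary
printf 'heater_to_air\n{\n    nFaces 0;\n}\n'  > processor1/constant/polyMesh/boundary

# print the nFaces entry of the coupled patch per processor directory
for b in processor*/constant/polyMesh/boundary; do
    printf '%s: ' "${b%%/*}"
    grep -A2 'heater_to_air' "$b" | grep 'nFaces'
done
```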
********************************************************
aa@lws16:~/OpenFOAM/aa-1.5.x/run/chtMultiRegionFoam/simpleRegionHeaterParall$ mpirun.openmpi -np 2 chtMultiRegionFoam -parallel > log.chtMultiRegionFoam
[0] #0 Foam::error::printStack(Foam::Ostream&) in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/lib/linux64GccDPOpt/libOpenFOAM.so"
[0] #1 Foam::sigFpe::sigFpeHandler(int) in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/lib/linux64GccDPOpt/libOpenFOAM.so"
[0] #2 ?? in "/lib/libc.so.6"
[0] #3 double Foam::max<Foam::fvPatchField, double>(Foam::FieldField<Foam::fvPatchField, double> const&) in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[0] #4 Foam::dimensioned<double> Foam::max<double, Foam::fvPatchField, Foam::volMesh>(Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&) in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[0] #5 main in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[0] #6 __libc_start_main in "/lib/libc.so.6"
[0] #7 _start in "/home/aa/OpenFOAM/OpenFOAM-1.5.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
[lws16:05764] *** Process received signal ***
[lws16:05764] Signal: Floating point exception (8)
[lws16:05764] Signal code: (-6)
[lws16:05764] Failing at address: 0x3e800001684
[lws16:05764] [ 0] /lib/libc.so.6 [0x7f4208af70a0]
[lws16:05764] [ 1] /lib/libc.so.6(gsignal+0x35) [0x7f4208af7015]
[lws16:05764] [ 2] /lib/libc.so.6 [0x7f4208af70a0]
[lws16:05764] [ 3] chtMultiRegionFoam(_ZN4Foam3maxINS_12fvPatchFieldEdEET0_RKNS_10FieldFieldIT_S2_EE+0x160) [0x44f7c0]
[lws16:05764] [ 4] chtMultiRegionFoam(_ZN4Foam3maxIdNS_12fvPatchFieldENS_7volMeshEEENS_11dimensionedIT_EERKNS_14GeometricFieldIS4_T0_T1_EE+0x20) [0x4806e0]
[lws16:05764] [ 5] chtMultiRegionFoam [0x441761]
[lws16:05764] [ 6] /lib/libc.so.6(__libc_start_main+0xe6) [0x7f4208ae2466]
[lws16:05764] [ 7] chtMultiRegionFoam [0x41c019]
[lws16:05764] *** End of error message ***
mpirun.openmpi noticed that job rank 0 with PID 5764 on node lws16 exited on signal 8 (Floating point exception).
******************************************************************
I greatly appreciate your comments!! Thanks in advance,
Aram