rdrr84
September 28, 2011 07:08
Cyclic BC with parallel running
Hi,
we're using the latest version of OpenFOAM 2.0.x to run a compressible LES with rhoCentralFoam. We're facing a problem reading the mesh when the case has cyclic BCs and is decomposed with Scotch (the only decomposition method available to us in 2.0). The solver runs fine in serial, and also with the case decomposed onto 3 processors. With more processors (40 or 60) the code stops while reading the mesh (I attach the error below). With empty BCs instead of cyclics the case works fine even on 40 processors.
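In case it helps, the decomposition is set up with a plain scotch dictionary, roughly like this (just a sketch; only the subdomain count changes between runs):
Code:
// system/decomposeParDict (sketch)
numberOfSubdomains  40;      // 3 runs fine, 40 and 60 crash

method              scotch;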
Thanks for the help :rolleyes:
Code:
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time
Create mesh for time = 0
[29] #0 Foam::error:printStack(Foam::Ostream&)--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process. Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption. The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.
The process that invoked fork was:
Local host: node023 (PID 5648)
MPI_COMM_WORLD rank: 29
If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
addr2line failed
[29] #1 Foam::sigSegv::sigHandler(int) addr2line failed
[29] #2 Uninterpreted: /lib64/libc.so.6 addr2line failed
[29] #3 Foam::processorPolyPatch::updateMesh(Foam::PstreamBuffers&) addr2line failed
[29] #4 Foam::polyBoundaryMesh::updateMesh()[13] #0 Foam::error::printStack(Foam::Ostream&) addr2line failed
[29] #5 Foam::polyMesh::polyMesh(Foam::IOobject const&) addr2line failed
[29] #6 Foam::fvMesh::fvMesh(Foam::IOobject const&) addr2line failed
[29] #7 Uninterpreted: NS_Compressible
[29] #8 __libc_start_main addr2line failed
[13] #1 Foam::sigSegv::sigHandler(int) addr2line failed
[29] #9 addr2line failed
[13] #2 Uninterpreted: Foam::UOPstream::write(char)
/lib64/libc.so.6 [node023:05648] *** Process received signal ***
[node023:05648] Signal: Segmentation fault (11)
[node023:05648] Signal code: (-6)
[node023:05648] Failing at address: 0x20700001610
[node023:05648] [ 0] /lib64/libc.so.6 [0x2aaaae5262d0]
[node023:05648] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x2aaaae526265]
[node023:05648] [ 2] /lib64/libc.so.6 [0x2aaaae5262d0]
[node023:05648] [ 3] /home/ioba/data/software/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64Gcc44DPOpt/lib/libOpenFOAM.so(_ZN4Foam18processorPolyPatch10updateMeshERNS_14PstreamBuffersE+0x2da) [0x2aaaad6db47a]
[node023:05648] [ 4] /home/ioba/data/software/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64Gcc44DPOpt/lib/libOpenFOAM.so(_ZN4Foam16polyBoundaryMesh10updateMeshEv+0x2b1) [0x2aaaad6e24b1]
[node023:05648] [ 5] /home/ioba/data/software/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64Gcc44DPOpt/lib/libOpenFOAM.so(_ZN4Foam8polyMeshC2ERKNS_8IOobjectE+0x10ea) [0x2aaaad73347a]
[node023:05648] [ 6] /home/ioba/data/software/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64Gcc44DPOpt/lib/libfiniteVolume.so(_ZN4Foam6fvMeshC1ERKNS_8IOobjectE+0x19) [0x2aaaab069fb9]
[node023:05648] [ 7] NS_Compressible [0x420881]
[node023:05648] [ 8] /lib64/libc.so.6(__libc_start_main+0xf4) [0x2aaaae513994]
[node023:05648] [ 9] NS_Compressible(_ZN4Foam9UOPstream5writeEc+0xa9) [0x41d599]
[node023:05648] *** End of error message ***
addr2line failed
[13] #3 Foam::processorPolyPatch::updateMesh(Foam::PstreamBuffers&) addr2line failed
[13] #4 Foam::polyBoundaryMesh::updateMesh() addr2line failed
[13] #5 Foam::polyMesh::polyMesh(Foam::IOobject const&) addr2line failed
[13] #6 Foam::fvMesh::fvMesh(Foam::IOobject const&)--------------------------------------------------------------------------
mpirun noticed that process rank 29 with PID 5648 on node node023.cm.cluster exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
[node015:31786] 1 more process has sent help message help-mpi-runtime.txt / mpi_init:warn-fork
[node015:31786] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages