Serial OK, parallel fails: mesh conversion problem
September 27, 2007, 08:28
#1
Member
Radu Mustata
Join Date: Mar 2009
Location: Zaragoza, Spain
Posts: 99
Rep Power: 17
Hi All,

I am having a problem running a case of mine in parallel, whilst the serial version is fine (so far). The mesh is imported from Fluent (with the new fluent3DMeshToFoam utility) and has an internal wall. As I said, this doesn't seem to bother the serial run much, but after decomposing, the parallel run invariably finishes with an MPI error message like:

Create mesh for time = 0
[oct11:07921] *** An error occurred in MPI_Recv
[oct11:07921] *** on communicator MPI_COMM_WORLD
[oct11:07921] *** MPI_ERR_TRUNCATE: message truncated
[oct11:07921] *** MPI_ERRORS_ARE_FATAL (goodbye)
[1]
[1] --> FOAM FATAL IO ERROR : Expected a ')' or a '}' while reading List, found on line 0 an error
[1]
[1] file: IOstream at line 0.
[1]
[1] From function Istream::readEndList(const char*)
[1] in file db/IOstreams/IOstreams/Istream.C at line 159.
[1]
[1] FOAM parallel run exiting
[1]
[0] ?? in "/lib/libc.so.6"
[2] #3 ?? at pml_ob1_recvfrag.c:0
[2] #4 mca_btl_sm_component_progress in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/src/openmpi-1.2.3/platforms/linux64GccDPOpt/lib/openmpi/mca_btl_sm.so"
[2] #5 mca_bml_r2_progress in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/src/openmpi-1.2.3/platforms/linux64GccDPOpt/lib/openmpi/mca_bml_r2.so"
[2] #6 opal_progress in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/src/openmpi-1.2.3/platforms/linux64GccDPOpt/lib/libopen-pal.so.0"
[2] #7 mca_pml_ob1_probe in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/src/openmpi-1.2.3/platforms/linux64GccDPOpt/lib/openmpi/mca_pml_ob1.so"
[2] #8 MPI_Probe in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/src/openmpi-1.2.3/platforms/linux64GccDPOpt/lib/libmpi.so.0"
[2] #9 Foam::IPstream::IPstream(int, int, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/openmpi-1.2.3/libPstream.so"
[2] #10 Foam::globalPoints::receivePatchPoints(Foam::HashSet<int, Foam::Hash<int> >&) in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libOpenFOAM.so"
[2] #11 Foam::globalPoints::globalPoints(Foam::polyMesh const&) in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libOpenFOAM.so"
[2] #12 Foam::globalMeshData::updateMesh() in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libOpenFOAM.so"
[2] #13 Foam::globalMeshData::globalMeshData(Foam::polyMesh const&) in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libOpenFOAM.so"
[2] #14 Foam::polyMesh::globalData() const in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libOpenFOAM.so"
[2] #15 Foam::polyMesh::polyMesh(Foam::IOobject const&) in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libOpenFOAM.so"
[2] #16 Foam::fvMesh::fvMesh(Foam::IOobject const&) in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libfiniteVolume.so"
[2] #17 main in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/applications/bin/linux64GccDPOpt/icoFoam"
[2] #18 __libc_start_main in "/lib/libc.so.6"
[2] #19 Foam::regIOobject::readIfModified() in "/home/radu/OpenFOAM/OpenFOAM-1.4.1/applications/bin/linux64GccDPOpt/icoFoam"
[oct11:07971] *** Process received signal ***
[oct11:07971] Signal: Segmentation fault (11)
[oct11:07971] Signal code: (-6)
[oct11:07971] Failing at address: 0x47300001f23
[oct11:07971] [ 0] /lib/libc.so.6 [0x2aaaac61c110]
[oct11:07971] [ 1] /lib/libc.so.6(gsignal+0x3b) [0x2aaaac61c07b]
[oct11:07971] [ 2] /lib/libc.so.6 [0x2aaaac61c110]
[oct11:07971] [ 3] /home/radu/OpenFOAM/OpenFOAM-1.4.1/src/openmpi-1.2.3/platforms/linux64GccDPOpt/lib/openmpi/mca_pml_ob1.so [0x2aaab26b8c17]
[oct11:07971] [ 4] /home/radu/OpenFOAM/OpenFOAM-1.4.1/src/openmpi-1.2.3/platforms/linux64GccDPOpt/lib/openmpi/mca_btl_sm.so(mca_btl_sm_component_progress+0x1db) [0x2aaab2cd07cb]
[oct11:07971] [ 5] /home/radu/OpenFOAM/OpenFOAM-1.4.1/src/openmpi-1.2.3/platforms/linux64GccDPOpt/lib/openmpi/mca_bml_r2.so(mca_bml_r2_progress+0x2a) [0x2aaab28c426a]
[oct11:07971] [ 6] /home/radu/OpenFOAM/OpenFOAM-1.4.1/src/openmpi-1.2.3/platforms/linux64GccDPOpt/lib/libopen-pal.so.0(opal_progress+0x4a) [0x2aaaad93495a]
[oct11:07971] [ 7] /home/radu/OpenFOAM/OpenFOAM-1.4.1/src/openmpi-1.2.3/platforms/linux64GccDPOpt/lib/openmpi/mca_pml_ob1.so(mca_pml_ob1_probe+0x3c5) [0x2aaab26b61a5]
[oct11:07971] [ 8] /home/radu/OpenFOAM/OpenFOAM-1.4.1/src/openmpi-1.2.3/platforms/linux64GccDPOpt/lib/libmpi.so.0(MPI_Probe+0xf6) [0x2aaaad28fda6]
[oct11:07971] [ 9] /home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/openmpi-1.2.3/libPstream.so(_ZN4Foam8IPstreamC1EiiNS_8IOstream12streamFormatENS1_13versionNumberE+0xee) [0x2aaaac82f24e]
[oct11:07971] [10] /home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libOpenFOAM.so(_ZN4Foam12globalPoints18receivePatchPointsERNS_7HashSetIiNS_4HashIiEEEE+0x22c) [0x2aaaababc50c]
[oct11:07971] [11] /home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libOpenFOAM.so(_ZN4Foam12globalPointsC1ERKNS_8polyMeshE+0x24f) [0x2aaaababccaf]
[oct11:07971] [12] /home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libOpenFOAM.so(_ZN4Foam14globalMeshData10updateMeshEv+0x110) [0x2aaaabaae890]
[oct11:07971] [13] /home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libOpenFOAM.so(_ZN4Foam14globalMeshDataC1ERKNS_8polyMeshE+0xe4) [0x2aaaabaaff64]
[oct11:07971] [14] /home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libOpenFOAM.so(_ZNK4Foam8polyMesh10globalDataEv+0x55) [0x2aaaabad07f5]
[oct11:07971] [15] /home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libOpenFOAM.so(_ZN4Foam8polyMeshC2ERKNS_8IOobjectE+0x1c02) [0x2aaaabad6f12]
[oct11:07971] [16] /home/radu/OpenFOAM/OpenFOAM-1.4.1/lib/linux64GccDPOpt/libfiniteVolume.so(_ZN4Foam6fvMeshC1ERKNS_8IOobjectE+0x19) [0x2aaaaae3cae9]
[oct11:07971] [17] /home/radu/OpenFOAM/OpenFOAM-1.4.1/applications/bin/linux64GccDPOpt/icoFoam [0x412e07]
[oct11:07971] [18] /lib/libc.so.6(__libc_start_main+0xda) [0x2aaaac6094ca]
[oct11:07971] [19] /home/radu/OpenFOAM/OpenFOAM-1.4.1/applications/bin/linux64GccDPOpt/icoFoam(_ZN4Foam11regIOobject14readIfModifiedEv+0x1a9) [0x412979]
[oct11:07971] *** End of error message ***
mpirun noticed that job rank 0 with PID 7969 on node oct11 exited on signal 15 (Terminated).
3 additional processes aborted (not shown)

checkMesh does say:

Mesh stats
    points:           4079936
    edges:            12145862
    faces:            12129187
    internal faces:   11788349
    cells:            3986256
    boundary patches: 4
    point zones:      0
    face zones:       0
    cell zones:       3

Number of cells of each type:
    hexahedra:  3986256
    prisms:     0
    wedges:     0
    pyramids:   0
    tet wedges: 0
    tetrahedra: 0
    polyhedra:  0

Checking topology...
    Boundary definition OK.
    Point usage OK.
    Upper triangular ordering OK.
    Topological cell zip-up check OK.
    Face vertices OK.
    Number of identical duplicate faces (baffle faces): 77004
    Face-face connectivity OK.
    Number of regions: 1 (OK).

Checking patch topology for multiply connected surfaces ...
    Patch              Faces    Points   Surface
    pared              182407   182695   ok (not multiply connected)
    inflow_top_lid     1836     1963     ok (not multiply connected)
    outflow_top_lid    2587     2750     ok (not multiply connected)
    pared_interior     154008   77742    multiply connected surface (shared edge)
  <<Writing 77718 conflicting points to set nonManifoldPoints

Checking geometry...
    Domain bounding box: (-0.04 -0.04 -1.42109e-17) (0.04 0.04 0.08)
    Boundary openness (-2.89631e-16 -8.26677e-16 -8.0705e-16) OK.
    Max cell openness = 8.55581e-16 OK.
    Max aspect ratio = 323.357 OK.
    Minumum face area = 8.39926e-10. Maximum face area = 8.16213e-06. Face area magnitudes OK.
    Min volume = 6.33172e-14. Max volume = 1.06746e-08. Total volume = 0.000402107. Cell volumes OK.
    Mesh non-orthogonality Max: 32.6604 average: 5.2825
    Non-orthogonality check OK.
    Face pyramids OK.
    Max skewness = 0.594768 OK.
    Min/max edge length = 2.04497e-05 0.00509539 OK.
    All angles in faces OK.
    Face flatness (1 = flat, 0 = butterfly) : average = 1 min = 0.999999
    All face flatness OK.
Mesh OK.

Is it that multiply connected surface (the internal wall) that is causing the trouble? Or should I look elsewhere?
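[Editor's note: for readers reproducing this, a minimal sketch of the kind of system/decomposeParDict such a run uses. This is not the poster's actual dictionary: the subdomain count and decomposition method are illustrative assumptions; only the patch name pared_interior is taken from the checkMesh output above, and the preservePatches entry exists only in some OpenFOAM versions.]

```
// system/decomposeParDict -- illustrative sketch, not the poster's file

numberOfSubdomains  4;          // assumed: log shows ranks 0-2 plus 3 more aborting

method              simple;     // assumed geometric decomposition

simpleCoeffs
{
    n               (2 2 1);    // splits in x, y, z; product must equal numberOfSubdomains
    delta           0.001;
}

// In OpenFOAM versions that support it, keeping the internal-wall patch on a
// single processor avoids splitting the baffle across processor boundaries:
// preservePatches (pared_interior);
```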
I ask because I also did the import with the old fluentMeshToFoam utility, using the procedure described by Bernhard for meshes with internal walls; having two patches instead of one did get rid of the "multiply connected surface" label, but the parallel run failed again. Sorry for the long post...
Cheers,
Radu
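[Editor's note: for context, the workflow that triggers the error is roughly the following. A sketch only: the case name and processor count are illustrative assumptions, and the "root case" argument style shown is the OpenFOAM 1.4.x convention (later versions use -case instead).]

```shell
# Serial vs. parallel workflow sketch (case name "myCase" is hypothetical).
NPROCS=4   # must match numberOfSubdomains in system/decomposeParDict

# Serial run (works for the poster):
#   icoFoam . myCase
#
# Parallel run (fails during mesh construction):
#   decomposePar . myCase
#   mpirun -np $NPROCS icoFoam . myCase -parallel

# Helper that assembles the parallel command line (OpenFOAM 1.4.x style):
build_parallel_cmd() {
    echo "mpirun -np $1 icoFoam . $2 -parallel"
}

build_parallel_cmd "$NPROCS" myCase
# prints: mpirun -np 4 icoFoam . myCase -parallel
```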
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post |
More DPM incompletes in parallel than in serial | Paul | FLUENT | 0 | December 16, 2008 09:27 |
Serial vs parallel different results | luca | OpenFOAM Bugs | 2 | December 3, 2008 10:12 |
Problem with Parallel not with Serial | iyer_arvind | OpenFOAM Running, Solving & CFD | 0 | September 18, 2006 06:03 |
Serial run OK parallel one fails | r2d2 | OpenFOAM Running, Solving & CFD | 2 | November 16, 2005 12:44 |
parallel Vs. serial | co2 | FLUENT | 1 | December 31, 2003 02:19 |