[cfMesh] Fatal error when running in parallel with MPI

August 2, 2019, 17:29  |  #1
Junting Chen (chen112p), Member
Join Date: Feb 2016 | Location: Ontario, Canada | Posts: 37
Hello all, I have made sure the test cases run correctly with cartesianMesh in serial.

I keep getting a segmentation fault when I run them with MPI:

mpirun -np 32 cartesianMesh -parallel

The fault appears at a seemingly random point each time:

...
Total number of cells 693956
Finished extracting polyMesh
Checking for irregular surface connections
Checking cells connected to surface vertices
Found 40289 boundary faces
Found 0 problematic vertices
Finished checking cells connected to surface vertices
Checking for non-manifold surface edges
--------------------------------------------------------------------------
A process has executed an operation involving a call to the
"fork()" system call to create a child process. Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your job may hang, crash, or produce silent
data corruption. The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

Local host: [[17085,1],27] (PID 18387)

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 27 with PID 18387 on node nnode9 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
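For reference, the fork() warning above names Open MPI's mpi_warn_on_fork MCA parameter. A minimal sketch of silencing just that warning, based only on the warning text itself; this is cosmetic and does not address the rank-27 segmentation fault:

# hides the fork() warning only; the segfault still needs to be diagnosed
mpirun --mca mpi_warn_on_fork 0 -np 32 cartesianMesh -parallel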


I looked through some other threads, but it seems nobody has reported this issue. Thanks


Junting

August 5, 2019, 03:55  |  #2
Kmeti Rao (Krao), Senior Member
Join Date: May 2019 | Posts: 145
Hi Junting Chen,

I hope the following link helps you run your case in parallel.

How to run CfMesh in parallel?

August 5, 2019, 08:06  |  #3
Junting Chen (chen112p), Member
Join Date: Feb 2016 | Location: Ontario, Canada | Posts: 37
Hello Krao,

I am able to run it on one PC with OpenMP; the issue is with MPI.
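For context, the shared-memory run that works here looks roughly like the sketch below; the assumption is that cfMesh honours the standard OMP_NUM_THREADS environment variable, and the thread count of 8 is a hypothetical value:

# shared-memory (OpenMP) run on a single machine: no mpirun, no -parallel flag
export OMP_NUM_THREADS=8   # hypothetical thread count
cartesianMesh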

Thanks,

Junting

August 5, 2019, 09:33  |  #4
Kmeti Rao (Krao), Senior Member
Join Date: May 2019 | Posts: 145
Did you run preparePar before running cartesianMesh in parallel?
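For completeness, the sequence being asked about, pieced together from the commands already quoted in this thread (32 subdomains simply mirrors the number used above):

preparePar                              # splits the case into processor directories per system/decomposeParDict
mpirun -np 32 cartesianMesh -parallel   # runs the mesher on the decomposed case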

Krao

August 5, 2019, 11:46  |  #5
Junting Chen (chen112p), Member
Join Date: Feb 2016 | Location: Ontario, Canada | Posts: 37
Yes, I did. I ran preparePar, so I got 32 processor directories.
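A quick, hedged sanity check of that decomposition; processor directories are what preparePar is expected to leave behind for a 32-way split:

ls -d processor*           # expect processor0 ... processor31
ls -d processor* | wc -l   # should print 32, matching numberOfSubdomains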

Junting

August 5, 2019, 12:29  |  #6
Junting Chen (chen112p), Member
Join Date: Feb 2016 | Location: Ontario, Canada | Posts: 37
So my workflow is:
1. put aaa.stl in the folder
2. run 'surfaceGenerateBoundingBox aaa.stl aaa_boundingbox.stl xMin xMax yMin yMax zMin zMax'
3. run 'surfaceFeatureEdges aaa_boundingbox.stl aaa_FeatureEdges.fms -angle 5'
4. run 'surfaceFeatureEdges aaa_boundingbox.stl aaa_FeatureEdges.vtk -angle 5'
5. write system/meshDict:
- surfaceFile "aaa_FeatureEdges.fms";
- edgeFile "aaa_FeatureEdges.vtk";
6. write system/decomposeParDict (see the dictionary sketch after this list):
- numberOfSubdomains 32;
7. run 'preparePar'
8. run 'mpirun -np 32 cartesianMesh -parallel > 01.cfMesh.log'
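A minimal sketch of the two dictionaries named in steps 5 and 6, assuming default cfMesh settings otherwise; maxCellSize and the decomposition method are hypothetical values not taken from this post:

// system/meshDict (sketch)
FoamFile
{
    version 2.0;
    format  ascii;
    class   dictionary;
    object  meshDict;
}

surfaceFile "aaa_FeatureEdges.fms";
edgeFile    "aaa_FeatureEdges.vtk";
maxCellSize 0.1;   // hypothetical global cell size, required by cartesianMesh

// system/decomposeParDict (sketch)
FoamFile
{
    version 2.0;
    format  ascii;
    class   dictionary;
    object  decomposeParDict;
}

numberOfSubdomains 32;
method scotch;     // assumption: preparePar mainly needs numberOfSubdomains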

The output log shows the following at the end:

Extracting edges
Starting topological adjustment of patches
No topological adjustment was needed
Starting geometrical adjustment of patches
Found 8 corners at the surface of the volume mesh
Found 649 edge points at the surface of the volume mesh
12 edge groups found!
--> FOAM Warning :
From function void edgeExtractor::extractEdges()
in file utilities/surfaceTools/edgeExtraction/edgeExtractor/edgeExtractor.C at line 2115
Found 0 points with inverted surface normals. Getting rid of them...
Starting untangling the surface of the volume mesh
Number of inverted boundary faces is 14
[nnode7:23282] *** An error occurred in MPI_Recv
[nnode7:23282] *** reported by process [1091829761,12]
[nnode7:23282] *** on communicator MPI_COMM_WORLD
[nnode7:23282] *** MPI_ERR_TRUNCATE: message truncated
[nnode7:23282] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[nnode7:23282] *** and potentially your MPI job)

Thanks a lot for your patience!

Junting Chen

August 6, 2019, 03:10  |  #7
Kmeti Rao (Krao), Senior Member
Join Date: May 2019 | Posts: 145
Hi Junting Chen,

I followed something similar to what you describe and was able to run the case with the steps below; can you try them out once?

1. put aaa.stl in the folder
2. run 'surfaceGenerateBoundingBox aaa.stl aaa_boundingbox.stl xMin xMax yMin yMax zMin zMax'
3. run 'surfaceFeatureEdges aaa_boundingbox.stl aaa_FeatureEdges.fms -angle 5'
4. run 'FMSToSurface -exportFeatureEdges aaa_FeatureEdges.fms new_aaa_FeatureEdges.fms'

Step 4 above generates a .vtk file along with the new .fms file. Reference this .vtk file under the edge refinement entry of system/meshDict.

5. write system/meshDict, under edge refinement:
- outputof4thstep.vtk
6. write system/decomposeParDict:
- numberOfSubdomains 32;
7. run 'preparePar'
8. run 'mpirun -n 32 cartesianMesh -parallel > 01.cfMesh.log'

I followed this workflow on OpenFOAM 6 and got no errors; please let me know if you have any issues with the above steps.
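A hedged sketch of how the .vtk from step 4 might be referenced, assuming that "edge refinement" here corresponds to the edgeFile keyword used earlier in this thread; the surfaceFile choice and maxCellSize are assumptions, not stated in the post:

// system/meshDict (relevant entries only)
surfaceFile "new_aaa_FeatureEdges.fms";   // surface written by FMSToSurface in step 4 (assumption)
edgeFile    "outputof4thstep.vtk";        // the feature-edge .vtk exported in step 4
maxCellSize 0.1;                          // hypothetical value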

Krao
