
[mesh manipulation] multiple calls to refineMesh parallel w/ dict failing

April 27, 2015, 16:30 | #1
New Member
 
Regis
Join Date: Jan 2012
Posts: 24
I'm creating a simple mesh on a Cartesian domain through blockMesh and then I'm trying to do local refinements in sub-regions of the domain using refineMesh (w/ dictionary). The refinements are being done in parallel.
Below is part of the script I set up to create the mesh:

Code:
# Refine the local region $1 times, with all mesh utilities run in parallel.
# Assumes $cores has been set earlier in the script.
refineMeshLocal()
{
   i=1
   while [ $i -le $1 ]
   do
      # select the cells to refine in this pass
      cp system/topoSetDict.local.$i system/topoSetDict
      mpirun -np $cores topoSet -parallel > log.topoSet.local.$i 2>&1

      cp system/refineMeshDict.local system/refineMeshDict

      # refine the selected cellSet in place and verify the result
      mpirun -np $cores refineMesh -parallel -dict -overwrite > log.refineMesh.local.$i 2>&1
      mpirun -np $cores checkMesh -parallel > log.checkMesh.local.$i 2>&1
      i=$((i+1))
   done
}

blockMesh > log.blockMesh 2>&1
checkMesh > log.checkMesh.background 2>&1

# Decompose the mesh
cp system/decomposeParDict.$cores system/decomposeParDict
decomposePar -cellDist -force > log.decomposePar 2>&1

# Perform local refinement (parallel)
refineMeshLocal 2
Quick remarks:
- the refinement region is specified through a cellSet created with topoSet (a sketch of such a dictionary is below);
- the region specified in topoSet lies inside the domain.
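
For reference, here is a stripped-down sketch of one of the topoSetDict.local.N files; the set name and the box coordinates are placeholders, not my actual values:

Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      topoSetDict;
}

actions
(
    {
        name    refineSet;      // cellSet later picked up by refineMesh
        type    cellSet;
        action  new;            // create the set from scratch
        source  boxToCell;      // select every cell inside a bounding box
        sourceInfo
        {
            box (0 0 0) (0.5 0.5 0.5);  // placeholder bounds, inside the domain
        }
    }
);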

So if I do just one refinement, everything works fine. However, whenever I try to do more than a single refinement, I get either warnings or fatal errors in the second/third/etc. refineMesh log files.

This is a typical error (using 64 processors, but the same happens with more processors; ranks 35 and 61 aborted with identical backtraces, de-interleaved below for readability):

Code:
[35] processorPolyPatch::order : Dumping neighbour faceCentres to "/home/rus284/OpenFOAM_Run-2.0.1/DemoCases_Jan15/ST0g2l64c/processor35/procBoundary35to34_nbrFaceCentres.obj"
[61] processorPolyPatch::order : Dumping neighbour faceCentres to "/home/rus284/OpenFOAM_Run-2.0.1/DemoCases_Jan15/ST0g2l64c/processor61/procBoundary61to60_nbrFaceCentres.obj"
[35] 
[35] 
[61] 
[61] 
[61] --> FOAM FATAL ERROR: 
[61] in patch:procBoundary61to60 : Local size of patch is 48 (faces).
Received from neighbour 47 faceCentres!
[61] 
[61]     From function processorPolyPatch::order(const primitivePatch&, labelList&, labelList&) const
[61]     in file meshes/polyMesh/polyPatches/constraint/processor/processorPolyPatch.C at line 574.
[61] 
FOAM parallel run aborting
[61] 
[35] --> FOAM FATAL ERROR: 
[35] in patch:procBoundary35to34 : Local size of patch is 47 (faces).
Received from neighbour 45 faceCentres!
[35] 
[35]     From function processorPolyPatch::order(const primitivePatch&, labelList&, labelList&) const
[35]     in file meshes/polyMesh/polyPatches/constraint/processor/processorPolyPatch.C at line 574.
[35] 
FOAM parallel run aborting
[35] 
--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process.  Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption.  The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          node41.cocoa5 (PID 9388)
  MPI_COMM_WORLD rank: 61

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
[61] #0  Foam::error::printStack(Foam::Ostream&) in "/home/rus284/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[61] #1  Foam::error::abort() in "/home/rus284/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[61] #2  Foam::Ostream& Foam::operator<< <Foam::error>(Foam::Ostream&, Foam::errorManip<Foam::error>) in "/home/rus284/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/bin/refineMesh"
[61] #3  Foam::processorPolyPatch::order(Foam::PstreamBuffers&, Foam::PrimitivePatch<Foam::face, Foam::SubList, Foam::Field<Foam::Vector<double> > const&, Foam::Vector<double> > const&, Foam::List<int>&, Foam::List<int>&) const in "/home/rus284/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[61] #4  Foam::polyTopoChange::reorderCoupledFaces(bool, Foam::polyBoundaryMesh const&, Foam::List<int> const&, Foam::List<int> const&, Foam::Field<Foam::Vector<double> > const&) in "/home/rus284/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libdynamicMesh.so"
[61] #5  Foam::polyTopoChange::compactAndReorder(Foam::polyMesh const&, bool, bool, bool, int&, Foam::Field<Foam::Vector<double> >&, Foam::List<int>&, Foam::List<int>&, Foam::List<Foam::objectMap>&, Foam::List<Foam::objectMap>&, Foam::List<Foam::objectMap>&, Foam::List<Foam::objectMap>&, Foam::List<Foam::objectMap>&, Foam::List<Foam::objectMap>&, Foam::List<Foam::objectMap>&, Foam::List<Foam::objectMap>&, Foam::List<Foam::Map<int> >&, Foam::List<int>&, Foam::List<int>&, Foam::List<Foam::Map<int> >&) in "/home/rus284/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libdynamicMesh.so"
[61] #6  Foam::polyTopoChange::changeMesh(Foam::polyMesh&, bool, bool, bool, bool) in "/home/rus284/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libdynamicMesh.so"
[61] #7  Foam::refinementIterator::setRefinement(Foam::List<Foam::refineCell> const&) in "/home/rus284/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libdynamicMesh.so"
[61] #8  Foam::multiDirRefinement::refineAllDirs(Foam::polyMesh&, Foam::List<Foam::Field<Foam::Vector<double> > >&, Foam::cellLooper const&, Foam::undoableMeshCutter&, bool) in "/home/rus284/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libdynamicMesh.so"
[61] #9  Foam::multiDirRefinement::refineFromDict(Foam::polyMesh&, Foam::List<Foam::Field<Foam::Vector<double> > >&, Foam::dictionary const&, bool) in "/home/rus284/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libdynamicMesh.so"
[61] #10  Foam::multiDirRefinement::multiDirRefinement(Foam::polyMesh&, Foam::List<int> const&, Foam::dictionary const&) in "/home/rus284/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libdynamicMesh.so"
[61] #11  in "/home/rus284/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/bin/refineMesh"
[61] #12  __libc_start_main in "/lib64/libc.so.6"
[61] #13  in "/home/rus284/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/bin/refineMesh"
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 35 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 61 with PID 9388 on
node node41.cocoa5 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[node42.cocoa5:05595] 1 more process has sent help message help-mpi-runtime.txt / mpi_init:warn-fork
[node42.cocoa5:05595] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
[node42.cocoa5:05595] 1 more process has sent help message help-mpi-api.txt / mpi-abort
And this is a typical warning that sometimes shows up (64 processors):

Code:
--> FOAM Warning : 
    From function refinementIterator
    in file meshCut/meshModifiers/refinementIterator/refinementIterator.C at line 272
    stopped refining.Did not manage to refine a single cell
Wanted :0
Besides the error and the warning, the second/third/etc. refineMesh log files contain extra information that does not appear in the first refineMesh log file:

Code:
[5] Global Coordinate system:
[5]      normal : (0 0 1)
[5]      tan1   : (1 0 0)
[5]      tan2   : (0 1 0)
This shows up for each processor. I just found it weird that such information is not printed in the first call to refineMesh.
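
Those normal/tan1/tan2 vectors are the coordinate system that refineMesh reads from refineMeshDict when run with -dict. For reference, a representative sketch of such a dictionary (FoamFile header omitted; the entries are illustrative rather than my exact settings, and the set name must match the cellSet created by topoSet):

Code:
set             refineSet;          // cellSet to refine (from topoSet)

coordinateSystem global;            // one coordinate system for the whole mesh
globalCoeffs
{
    tan1    (1 0 0);
    tan2    (0 1 0);                // normal follows as tan1 ^ tan2
}

directions      (tan1 tan2 normal); // refine in all three directions

useHexTopology  yes;                // split hexes into hexes
geometricCut    no;
writeMesh       no;                 // do not write intermediate meshes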

Also, everything goes fine if I do global refinements instead, that is, leaving only refineMesh (without -dict) and checkMesh in the function called in the last line.

Bottom line is that I wasn't able to consistently get just errors or just warnings. I'm kind of clueless now. Any thoughts? Let me know if you want to see any of the dictionaries.

Regis


April 30, 2015, 14:12 | #2
Retired Super Moderator
 
Bruno Santos (wyldckat)
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Quick answer: according to the output, you're using OpenFOAM 2.0.1. That is a considerably old version of OpenFOAM, and this is probably a bug that has already been fixed in 2.0.x, 2.1.x, 2.2.x or even 2.3.x; I don't know in which one. If you provide a test case, I or anyone else could test it with any of these versions of OpenFOAM.

June 4, 2015, 14:44 | #3
New Member
 
Regis
Join Date: Jan 2012
Posts: 24
Thanks for your reply, Bruno, and sorry for taking so long to get back. I understand that the version I'm using is old. I'm using some other libraries written for this version, and I will "port" them to a more recent version as soon as I have time.

So I figured out part of my problem. For the sake of future reference, here it is: I had a very coarse background mesh generated via blockMesh, and calling refineMesh several times with slightly larger refinement regions caused the issue, because there was less than one cell between the boundaries of those regions.

This bug report is directly related to my problem, and the solution I just described came from there: http://www.openfoam.org/mantisbt/view.php?id=465.
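
To put numbers on it (made up for illustration): the boundaries of two successive refinement regions need at least one cell of the current mesh between them, for example:

Code:
// background mesh: uniform cell size 0.05
// topoSetDict.local.1:  box (0 0 0) (0.50 0.50 0.50);
// topoSetDict.local.2:  box (0 0 0) (0.40 0.40 0.40);
// the two box faces are 0.10 apart, i.e. two background cells,
// leaving at least one full cell between the refinement regions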

Cheers!


Tags
mesh, parallel, refinemesh
