[mesh manipulation] Cannot get refineMesh to run in parallel |
|
June 1, 2014, 09:01
Cannot get refineMesh to run in parallel
#1
New Member
Join Date: Nov 2010
Posts: 18
Rep Power: 15
Greetings everyone,
I am relatively new to OpenFOAM (2.1.0) and have been struggling to get refineMesh to run in parallel. I have a portion of my mesh, containing approximately 12,000,000 cells, that I would like to refine further.

Here is some background. I started by importing a Fluent mesh from ICEM-CFD. I ran checkMesh with -allTopology and -allGeometry and everything looked good. I then decomposed the mesh with the decomposePar command (I am running on 120 cores). Next, I ran topoSet, and then tried the command refineMesh -parallel. However, I get the error message below. Based on the first line of the error message, it looks like I am attempting to run in parallel on 1 processor. Does anyone know why I am receiving this message and how I can run refineMesh in parallel? Why doesn't it recognize that I decomposed the mesh onto 120 cores? I also tried the command refineMesh -np 120 -parallel, and that didn't work either. Any help would be very much appreciated.

NOTE 1: I get the same error message when I try to run topoSet in parallel.
NOTE 2: I can get refineMesh to work in serial, but only on a much smaller number of cells. When I run refineMesh in serial on the 12,000,000 cells I get an abort message, which I believe is due to a lack of memory.

Thank you!

Best regards,
Scott

Code:
--> FOAM FATAL ERROR:
bool IPstream::init(int& argc, char**& argv) : attempt to run parallel on 1 processor

    From function UPstream::init(int& argc, char**& argv)
    in file UPstream.C at line 81.

FOAM aborting

#0  Foam::error::printStack(Foam::Ostream&)
--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process. Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption. The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          hpc-ar-07 (PID 5123)
  MPI_COMM_WORLD rank: 0

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
 in "/apps/openfoam/gnu/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
#1  Foam::error::abort() in "/apps/openfoam/gnu/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
#2  Foam::UPstream::init(int&, char**&) in "/apps/openfoam/gnu/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/openmpi-system/libPstream.so"
#3  Foam::argList::argList(int&, char**&, bool, bool) in "/apps/openfoam/gnu/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
#4  in "/apps/openfoam/gnu/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/bin/refineMesh"
#5  __libc_start_main in "/lib64/libc.so.6"
#6  in "/apps/openfoam/gnu/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/bin/refineMesh"
[hpc-ar-07:05123] *** Process received signal ***
[hpc-ar-07:05123] Signal: Aborted (6)
[hpc-ar-07:05123] Signal code:  (-6)
[hpc-ar-07:05123] [ 0] /lib64/libc.so.6(+0x329a0) [0x2aaaac74a9a0]
[hpc-ar-07:05123] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x2aaaac74a925]
[hpc-ar-07:05123] [ 2] /lib64/libc.so.6(abort+0x175) [0x2aaaac74c105]
[hpc-ar-07:05123] [ 3] /apps/openfoam/gnu/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam5error5abortEv+0x23b) [0x2aaaab7c9e3b]
[hpc-ar-07:05123] [ 4] /apps/openfoam/gnu/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/openmpi-system/libPstream.so(_ZN4Foam8UPstream4initERiRPPc+0x251) [0x2aaaadd77341]
[hpc-ar-07:05123] [ 5] /apps/openfoam/gnu/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam7argListC1ERiRPPcbb+0x2a88) [0x2aaaab7dec18]
[hpc-ar-07:05123] [ 6] refineMesh() [0x407a20]
[hpc-ar-07:05123] [ 7] /lib64/libc.so.6(__libc_start_main+0xfd) [0x2aaaac736d1d]
[hpc-ar-07:05123] [ 8] refineMesh() [0x406779]
[hpc-ar-07:05123] *** End of error message ***
Broken pipe

Last edited by wyldckat; June 1, 2014 at 09:10. Reason: Added [CODE][/CODE]
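For reference, the steps described above amount to roughly the following command sequence. This is only a sketch: the converter name (fluent3DMeshToFoam) and the mesh file name (fluentMesh.msh) are assumptions, since the post only says the mesh came from ICEM-CFD in Fluent format. Code:
fluent3DMeshToFoam fluentMesh.msh     # assumed converter and file name for the Fluent-format mesh
checkMesh -allTopology -allGeometry   # mesh checks; these reported no problems
decomposePar                          # split the mesh into 120 subdomains per system/decomposeParDict
topoSet                               # build the cellSet for the region to refine (serial here)
refineMesh -parallel                  # this is the call that aborts with the error shown above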
June 1, 2014, 09:13
#2
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,980
Blog Entries: 45
Rep Power: 128
Greetings Scott,
Even if OpenFOAM doesn't seem very user friendly, it does often (not always) give elucidating messages as to why something didn't work. In this case, it tells you right in the 2nd line:

Quote:
attempt to run parallel on 1 processor

Always keep in mind that the "-parallel" option usually requires that you launch with mpirun or foamJob.

Oh, and when in doubt, use "-help" to get more information from the application/script in question, for example:

Code:
foamJob -help

Bruno
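In practice, a parallel launch of the utilities from the first post would look something like the sketch below. The core count of 120 comes from the first post; the foamJob option names can differ slightly between versions, so confirm them with foamJob -help. Code:
# launch on the decomposed case, one MPI rank per processor* directory
mpirun -np 120 topoSet -parallel
mpirun -np 120 refineMesh -parallel

# or let the foamJob wrapper pick up the processor count from decomposeParDict
foamJob -parallel refineMesh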
June 3, 2014, 11:20
#3
New Member
Join Date: Nov 2010
Posts: 18
Rep Power: 15
Thank you so much for your help, Bruno. I was able to get it to work by using mpirun.

Thanks again,
Scott
Tags |
refinemesh -parallel |