CFD Online Discussion Forums > OpenFOAM Community Contributions
[PyFoam] running pyFoam(Plot)Runner.py in parallel
(https://www.cfd-online.com/Forums/openfoam-community-contributions/115952-running-pyfoam-plot-runner-py-parallel.html)

Studi April 10, 2013 03:44

running pyFoam(Plot)Runner.py in parallel
 
Hello everybody!

I'm having trouble starting a case in parallel (only locally, on one machine with multiple cores). If I type
Code:

pyFoamRunner.py --procnr=2 simpleFoam
within my case I receive this error message:
Code:

PyFoam WARNING on line 144 of file /home/fem/OpenFOAM/PyFOAM-0.6.0/lib/python2.7/site-packages/PyFoam/Execution/ParallelExecution.py : which can not find a match for simpleFoam . Hoping for the best
[M21556:06485] Warning: could not find environment variable "FOAM_MPI_LIBBIN"
[M21556:06485] Warning: could not find environment variable "MPI_B"
--------------------------------------------------------------------------
mpirun was unable to launch the specified application as it could not find an executable:

Executable: UFFER_SIZE
Node: M21556

while attempting to start process rank 0.
--------------------------------------------------------------------------
2 total processes failed to start
Killing PID 6485

It doesn't work with any of the possible options (--autosense-parallel, --procnr=N).

The weird thing is: without the parallel options (thus on a single core), pyFoamRunner.py works properly!
And on top of that: if I start a parallel run "manually" with
Code:

mpirun -np 2 simpleFoam -parallel
it works without any problems, too!

Does anyone know what to do? I really like PyFoam and would like to keep using it. Thanks in advance.


Regards
Sebastian

gschaider April 10, 2013 04:45

Quote:

Originally Posted by Studi (Post 419552)
Hello everybody!

I'm having trouble starting a case in parallel (only locally, on one machine with multiple cores). If I type
Code:

pyFoamRunner.py --procnr=2 simpleFoam
within my case I receive this error message:
Code:

PyFoam WARNING on line 144 of file /home/fem/OpenFOAM/PyFOAM-0.6.0/lib/python2.7/site-packages/PyFoam/Execution/ParallelExecution.py : which can not find a match for simpleFoam . Hoping for the best
[M21556:06485] Warning: could not find environment variable "FOAM_MPI_LIBBIN"
[M21556:06485] Warning: could not find environment variable "MPI_B"
--------------------------------------------------------------------------
mpirun was unable to launch the specified application as it could not find an executable:

Executable: UFFER_SIZE
Node: M21556

while attempting to start process rank 0.
--------------------------------------------------------------------------
2 total processes failed to start
Killing PID 6485

It doesn't work with any of the possible options (--autosense-parallel, --procnr=N).

The weird thing is: without the parallel options (thus on a single core), pyFoamRunner.py works properly!
And on top of that: if I start a parallel run "manually" with
Code:

mpirun -np 2 simpleFoam -parallel
it works without any problems, too!

Does anyone know what to do? I really like PyFoam and would like to keep using it. Thanks in advance.


Regards
Sebastian

Hm. That is strange. The first warning says that "which simpleFoam" doesn't find an executable. And then it says that there is also no environment variable FOAM_MPI_LIBBIN (which may be OK for newer OF-installations). But then comes the weird part: the missing MPI_B and UFFER_SIZE (which it thinks is the executable) should be one string, MPI_BUFFER_SIZE.

I think there is something problematic with the settings:
http://openfoamwiki.net/index.php/Co...yFoam#Settings

Check with pyFoamDumpConfiguration.py and look for the [MPI] section. It should look something like this:
Code:

options_openmpi_post: ["-x","PATH","-x","LD_LIBRARY_PATH","-x","WM_PROJECT_DIR","-x","PYTHONPATH","-x","FOAM_MPI_LIBBIN","-x","MPI_BUFFER_SIZE","-x","MPI_ARCH_PATH"]
openmpi_add_prefix: False
options_openmpi_pre: ["--mca","pls","rsh","--mca","pls_rsh_agent","rsh"]

(especially MPI_BUFFER_SIZE should be one string)
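
If you'd rather check this from a script, here is a minimal sketch (assumptions: pyFoamDumpConfiguration.py is on the PATH and prints the options in the "key: value" form shown above; modern Python 3):
Code:

import subprocess

# Dump the effective PyFoam configuration and pull out the openmpi option lists.
# MPI_BUFFER_SIZE has to show up as one unbroken string in options_openmpi_post.
dump = subprocess.run(["pyFoamDumpConfiguration.py"],
                      capture_output=True, text=True).stdout
for line in dump.splitlines():
    if "options_openmpi" in line:
        print(line)
        if "options_openmpi_post" in line:
            print("MPI_BUFFER_SIZE intact:", "MPI_BUFFER_SIZE" in line)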

To check which call to mpirun is actually used, you can add this configuration option (it will also print a lot of other things):
Code:

[Debug]
ParallelExecution: True

If MPI_BUFFER_SIZE looks OK in your configuration then I'm a bit surprised. What shell ("echo $SHELL") do you use?

JR22 April 10, 2013 06:59

This is working for me with OF-2.2 and PyFoam-0.6.0:
Code:

pyFoamPlotRunner.py mpirun -np 12 simpleFoam -parallel | tee log/simpleFoam.log
I even have the pipe-tee in there working to get the log. The 12 is because I am hyperthreading on an i7-3930K, and for some reason (maybe my specific setup) it works better than just 6 cores.

gschaider April 10, 2013 07:39

Quote:

Originally Posted by JR22 (Post 419597)
This is working for me with OF-2.2 and PyFoam-0.6.0:
Code:

pyFoamPlotRunner.py mpirun -np 12 simpleFoam -parallel | tee log/simpleFoam.log
I even have the pipe-tee in there working to get the log. The 12 is because I am hyperthreading on an i7-3930K, and for some reason (maybe my specific setup) it works better than just 6 cores.

The tee is superfluous: pyFoamPlotRunner.py will automatically generate a file PyFoamRunner.mpirun.logfile. And usually the --proc=X options work quite fine and can easily be adapted with the configuration options if they're not working on your system. The problem Studi has is that one of the options automatically passed to mpirun is "broken" (the options can be quite useful when mpirun starts the run on two different physical machines, as some environment variables are otherwise not passed to "the other side").
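
For completeness, the same parallel start can also be driven from a Python script through the application class rather than the shell wrapper. A minimal sketch, assuming the PlotRunner class under PyFoam.Applications accepts the same options as pyFoamPlotRunner.py, the case is already decomposed for two processors, and the case path is a placeholder:
Code:

# Minimal sketch: drive the same "--procnr" parallel start from Python instead
# of the pyFoamPlotRunner.py shell wrapper. The runner writes its own logfile
# automatically, as described in the reply above.
from PyFoam.Applications.PlotRunner import PlotRunner

PlotRunner(args=["--procnr=2",
                 "simpleFoam",
                 "-case", "/path/to/case"])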

Studi April 10, 2013 09:18

Hello gschaider!

Direct hit with the first shot! Yesterday I edited the pyfoamrc file to get things working on different machines over the network. I deleted those changes again and now it works.
There are still two warnings (see the appended log), but the solver starts nonetheless.
I've read about the deprecated parameter on the OpenMPI homepage, but as long as things are working this won't be any of my concerns. The same applies to FOAM_MPI_LIBBIN.
Of course any suggestions for improvement are very welcome nevertheless!

Thanks a lot for helping me with this issue!

Speaking of parallel computing with different machines: is it necessary to distribute the corresponding data to every node (by copying the data onto every node or via NFS)? I've read that MPI can't "push" the data to every node automatically...


Regards
Sebastian

Code:

PyFoam WARNING on line 144 of file /home/fem/OpenFOAM/PyFOAM-0.6.0/lib/python2.7/site-packages/PyFoam/Execution/ParallelExecution.py : which can not find a match for simpleFoam . Hoping for the best
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
[M21556:04881] Warning: could not find environment variable "FOAM_MPI_LIBBIN"
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
/*---------------------------------------------------------------------------*\
| =========                |                                                |
| \\      /  F ield        | OpenFOAM: The Open Source CFD Toolbox          |
|  \\    /  O peration    | Version:  2.1.1                                |
|  \\  /    A nd          | Web:      www.OpenFOAM.org                      |
|    \\/    M anipulation  |                                                |
\*---------------------------------------------------------------------------*/
Build  : 2.1.1-221db2718bbb
Exec  : simpleFoam -parallel
Date  : Apr 10 2013
Time  : 14:55:32
Host  : "M21556"
PID    : 4884
Case  : /home/fem/Berechnung/tetraMesh/tetraMesh10pyF
nProcs : 2
Slaves :
1
(
"M21556.4885"
)

Pstream initialized with:
floatTransfer    : 0
nProcsSimpleSum  : 0
commsType        : nonBlocking
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading field p

Reading field U

Reading/calculating face flux field phi

Selecting incompressible transport model powerLaw
Selecting RAS turbulence model laminar
No field sources present


Starting time loop


gschaider April 10, 2013 12:20

Quote:

Originally Posted by Studi (Post 419632)
Hello gschaider!

Direct hit with the first shot! Yesterday I edited the pyfoamrc file to get things working on different machines over the network. I deleted those changes again and now it works.
There are still two warnings (see the appended log), but the solver starts nonetheless.
I've read about the deprecated parameter on the OpenMPI homepage, but as long as things are working this won't be any of my concerns. The same applies to FOAM_MPI_LIBBIN.
Of course any suggestions for improvement are very welcome nevertheless!

The hardcoded options are not necessarily the best, only the ones that worked for me on most machines. But you can easily override them (even on a per-OF-version basis if you have different MPIs for different versions).

Quote:

Originally Posted by Studi (Post 419632)
Thanks a lot for helping me with this issue!

Speaking of parallel computing with different machines: is it necessary to distribute the corresponding data to every node (by copying the data onto every node or via NFS)? I've read that MPI can't "push" the data to every node automatically...

Every processor has to be able to "see" its processorX directory. If they're all in the same NFS directory and every node can access that, then you're fine. The problem is that for a large number of processors NFS might become the bottleneck, and then you'll want to distribute these directories onto multiple machines. But I haven't done that myself and would suggest you ask elsewhere on the board.
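
A minimal sketch of the check this implies (plain standard-library Python; the case path and processor count are placeholders): every node that takes part in the run should be able to see its processorN directory.
Code:

import os

def missing_processor_dirs(case_dir, n_procs):
    """Return the processorN directories that are not visible from this node."""
    missing = []
    for i in range(n_procs):
        d = os.path.join(case_dir, "processor%d" % i)
        if not os.path.isdir(d):
            missing.append(d)
    return missing

# Placeholder usage: run this on every node that participates in the MPI job.
print(missing_processor_dirs("/path/to/case", 4))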

ripudaman March 31, 2014 14:15

which can't find my solver
 
I have created my own solver which is based on solidDisplacementFoam in 2.3.x. I get the first warning that has been shown above:
Code:

PyFoam WARNING on line 144 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/ParallelExecution.py : which can not find a match for convergeFracWidthFoam . Hoping for the best
I am concerned regarding the implications of this warning.

Thanks in advance.

gschaider March 31, 2014 19:51

Quote:

Originally Posted by ripudaman (Post 483074)
I have created my own solver which is based on solidDisplacementFoam in 2.3.x. I get the first warning that has been shown above:
Code:

PyFoam WARNING on line 144 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/ParallelExecution.py : which can not find a match for convergeFracWidthFoam . Hoping for the best
I am concerned regarding the implications of this warning.

Thanks in advance.

If it runs, then everything is fine. The main purpose of this warning is to give a hint if the solver really is not found.
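
The warning corresponds to a plain "which <solver>" lookup, as mentioned earlier in this thread. A standalone sketch of that check (standard library only, solver name as a placeholder):
Code:

import shutil

solver = "convergeFracWidthFoam"   # placeholder: the name of your own solver
path = shutil.which(solver)
# If this prints None, the PyFoam warning above fires, but as noted the run
# itself can still be fine.
print(path)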

ripudaman March 31, 2014 22:56

The code is able to find my solver. However, it does not run as it should. I tried replacing my modified solver (convergeFracWidthFoam) with solidDisplacementFoam and the code worked using this command:
Code:

run=BasicRunner(argv=["solidDisplacementFoam","-case",work.name],lam=machine)
run.start()

as well as this code:
Code:

run=AnalyzedRunner(CONVERGED,silent=True,argv=["solidDisplacementFoam","-case",work.name],lam=machine)
run.start()

where the object CONVERGED is a custom LogAnalyzer object.
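
For reference, a self-contained version of the calls above; the import locations are assumed to be the usual PyFoam 0.6.x ones, and the case path and processor count are placeholders.
Code:

from PyFoam.Execution.BasicRunner import BasicRunner
from PyFoam.Execution.ParallelExecution import LAMMachine
from PyFoam.RunDictionary.SolutionDirectory import SolutionDirectory

# 'work' and 'machine' in the snippets above are assumed to be set up like this:
work = SolutionDirectory("/path/to/decomposed/case", archive=None)
machine = LAMMachine(nr=4)   # local parallel run on 4 cores

run = BasicRunner(argv=["solidDisplacementFoam", "-case", work.name],
                  lam=machine)
run.start()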

However, when I replace solidDisplacementFoam with convergeFracWidthFoam in either of the above options, the code does not go through the iterations. In fact, for the BasicRunner case it gives me the following error:
Code:

PyFoam WARNING on line 144 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/ParallelExecution.py : which can not find a match for convergeFracWidthFoam . Hoping for the best
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
[ubuntu:08918] Warning: could not find environment variable "PYTHONPATH"
[ubuntu:08918] Warning: could not find environment variable "FOAM_MPI_LIBBIN"
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
/*---------------------------------------------------------------------------*\
| =========                |                                                |
| \\      /  F ield        | OpenFOAM: The Open Source CFD Toolbox          |
|  \\    /  O peration    | Version:  2.3.x                                |
|  \\  /    A nd          | Web:      www.OpenFOAM.org                      |
|    \\/    M anipulation  |                                                |
\*---------------------------------------------------------------------------*/
Build  : 2.3.x-e0d5f5a218ab
Exec  : convergeFracWidthFoam -case /home/ripuvm/OpenFOAM/ripuvm-2.3.x/multiFrac/cases/noFracTraj/try2 -parallel
Date  : Mar 31 2014
Time  : 21:47:12
Host  : "ubuntu"
PID    : 8921
Case  : /home/ripuvm/OpenFOAM/ripuvm-2.3.x/multiFrac/cases/noFracTraj/try2
nProcs : 4
Slaves :
3
(
"ubuntu.8922"
"ubuntu.8923"
"ubuntu.8924"
)

Pstream initialized with:
floatTransfer      : 0
nProcsSimpleSum    : 0
commsType          : nonBlocking
polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading mechanical properties

Normalising E : E/rho

Calculating Lame's coefficients

Plane Strain

Reading thermal properties

Reading field D

Calculating stress field sigmaDex

Calculating stress field sigmaD

Calculating explicit part of div(sigma) divSigmaExp


Calculating displacement field

Iteration: 1

Time = 1


One = 0  Two = 0

One = 0  Two = 0

One = 0  Two = 0
[1] #0  [2] #0  Foam::error::printStack(Foam::Ostream&)Foam::error::printStack(Foam::Ostream&)[3] #0  Foam::error::printStack(Foam::Ostream&) in "/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #1  Foam::sigFpe::sigHandler(int) in "/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #1  Foam::sigFpe::sigHandler(int) in "/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[3] #1  Foam::sigFpe::sigHandler(int) in "/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] # in "/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[3] #2  2  in "/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #2  in "/lib/x86_64-linux-gnu/libc.so.6"
[1] #3  in "/lib/x86_64-linux-gnu/libc.so.6"
[3] #3  in "/lib/x86_64-linux-gnu/libc.so.6"
[2] #3


[3]  in "/home/ripuvm/OpenFOAM/ripuvm-2.3.x/platforms/linux64GccDPOpt/bin/convergeFracWidthFoam"
[3] #4  __libc_start_main[1]  in "/home/ripuvm/OpenFOAM/ripuvm-2.3.x/platforms/linux64GccDPOpt/bin/convergeFracWidthFoam"
[1] #4  __libc_start_main[2]  in "/home/ripuvm/OpenFOAM/ripuvm-2.3.x/platforms/linux64GccDPOpt/bin/convergeFracWidthFoam"
[2] #4  __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
[3] #5  in "/lib/x86_64-linux-gnu/libc.so.6"
[2] #5  in "/lib/x86_64-linux-gnu/libc.so.6"
[1] #5


[1] [3]  in "/home/ripuvm/OpenFOAM/ripuvm-2.3.x/platforms/linux64GccDPOpt/bin/convergeFracWidthFoam"
[ubuntu:08924] *** Process received signal ***
[ubuntu:08924] Signal: Floating point exception (8)
[ubuntu:08924] Signal code:  (-6)
[ubuntu:08924] Failing at address: 0x3e8000022dc
in "/home/ripuvm/OpenFOAM/ripuvm-2.3.x/platforms/linux64GccDPOpt/bin/convergeFr[ubuntu:08924] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x364a0) [0x7f5ee6fb44a0]
[ubuntu:08924] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35) [0x7f5ee6fb4425]
[ubuntu:08924] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x364a0) [0x7f5ee6fb44a0]
[ubuntu:08924] [ 3] convergeFracWidthFoam() [0x42b989]
[ubuntu:08924] [ 4] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xed) [0x7f5ee6f9f76d]
[ubuntu:08924] [ 5] convergeFracWidthFoam() [0x43523d]
[ubuntu:08924] *** End of error message ***
acWidthFoam"
[ubuntu:08922] *** Process received signal ***
[ubuntu:08922] Signal: Floating point exception (8)
[ubuntu:08922] Signal code:  (-6)
[ubuntu:08922] Failing at address: 0x3e8000022da
[ubuntu:08922] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x364a0) [0x7f7c0749b4a0]
[ubuntu:08922] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35) [0x7f7c0749b425]
[ubuntu:08922] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x364a0) [0x7f7c0749b4a0]
[ubuntu:08922] [ 3] convergeFracWidthFoam() [0x42b989]
[ubuntu:08922] [ 4] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xed) [0x7f7c0748676d]
[ubuntu:08922] [ 5] convergeFracWidthFoam() [0x43523d]
[ubuntu:08922] *** End of error message ***
[2]  in "/home/ripuvm/OpenFOAM/ripuvm-2.3.x/platforms/linux64GccDPOpt/bin/convergeFracWidthFoam"
[ubuntu:08923] *** Process received signal ***
[ubuntu:08923] Signal: Floating point exception (8)
[ubuntu:08923] Signal code:  (-6)
[ubuntu:08923] Failing at address: 0x3e8000022db
[ubuntu:08923] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x364a0) [0x7f04fcf574a0]
[ubuntu:08923] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35) [0x7f04fcf57425]
[ubuntu:08923] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x364a0) [0x7f04fcf574a0]
[ubuntu:08923] [ 3] convergeFracWidthFoam() [0x42b989]
[ubuntu:08923] [ 4] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xed) [0x7f04fcf4276d]
[ubuntu:08923] [ 5] convergeFracWidthFoam() [0x43523d]
[ubuntu:08923] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 8922 on node ubuntu exited on signal 8 (Floating point exception).
--------------------------------------------------------------------------
3 total processes killed (some possibly by mpirun during cleanup)

The code works without the decomposition though.

I understand that this is a problem of creating a solver that can run in parallel. Can you help me out here?

gschaider April 1, 2014 04:29

Quote:

Originally Posted by ripudaman (Post 483120)
The code is able to find my solver. However, it does not run as it should. I tried replacing my modified solver (convergeFracWidthFoam) with solidDisplacementFoam and the code worked using this command:
Code:

run=BasicRunner(argv=["solidDisplacementFoam","-case",work.name],lam=machine)
run.start()

as well as this code:
Code:

run=AnalyzedRunner(CONVERGED,silent=True,argv=["solidDisplacementFoam","-case",work.name],lam=machine)
run.start()

where the object CONVERGED is a custom LogAnalyzer object.

However, when I replace solidDisplacementFoam with convergeFracWidthFoam in either of the above options, the code does not go through the iterations. In fact, for the BasicRunner case it gives me the floating point exception output shown in the post above.

The code works without the decomposition though.

I understand that this is a problem of creating a solver that can run in parallel. Can you help me out here?

I'm pretty sure that PyFoam is not the problem here: it just starts the OpenFOAM solver (which it obviously did) in parallel (which, according to your output, worked too). After that it just waits for the output of the program. If you run your program without PyFoam (mpirun -n 3 convergeFracWidthFoam -parallel) you'll see the same behaviour.

My guess is that it is the common DidASumCalculationAndDividedByItWhichFailsInParallelBecauseOnOneProcessorTheSumIsZeroAndIDidntDoAReduce-bug. But it is hard to tell from your output because it is from a Release version: before you do anything else, compile yourself a Debug version. Stack traces are much clearer (they even include the line numbers of where the problem occurred) and a lot of common errors are uncovered because of the bounds-checking.

codder September 6, 2015 16:38

similar error message
 
Hi, I use PyFoam for parametric evaluation (it's an amazing tool that I'm only just figuring out).

So I have scripted a series of templated cases for my custom solver. The output I get every time includes a "hoping for the best" warning:

Code:

PyFoam WARNING on line 144 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/ParallelExecution.py : which can not find a match for apfSolidFoam . Hoping for the best
^C

 Interrupted by the Keyboard
Killing PID 5965
New case1

But the solver always executes correctly. However (as you can see above), I just returned to the office to find that PyFoam had hung at the end of execution. My log looks like this:

Code:

--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
[namaste:05967] Warning: could not find environment variable "PYTHONPATH"
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
/*---------------------------------------------------------------------------*\
| =========                |                                                |
| \\      /  F ield        | foam-extend: Open Source CFD                    |
|  \\    /  O peration    | Version:    3.1                                |
|  \\  /    A nd          | Web:        http://www.extend-project.de      |
|    \\/    M anipulation  |                                                |
\*---------------------------------------------------------------------------*/
Build    : 3.1-f77b4801a214
Exec    : apfSolidFoam -case /home/eric/sharma/geomechanics/tutorials/apfSolidFoam/paper/caseTens0 -parallel
Date    : Sep 05 2015
Time    : 17:50:43
Host    : namaste
PID      : 5971
CtrlDict : /home/eric/foam/foam-extend-3.1/etc/controlDict
Case    : /home/eric/sharma/geomechanics/tutorials/apfSolidFoam/paper/caseTens0
nProcs  : 12
Slaves :
11
(
namaste.5972
namaste.5973
namaste.5974
namaste.5975
namaste.5976
namaste.5977
namaste.5978
namaste.5979
namaste.5980
namaste.5981
namaste.5982
)

Pstream initialized with:
floatTransfer    : 0
nProcsSimpleSum  : 0
commsType        : blocking
SigFpe  : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

...


Code:


End

Finalising parallel run
[namaste:05971] *** Process received signal ***
[namaste:05971] Signal: Segmentation fault (11)
[namaste:05971] Signal code: Invalid permissions (2)
[namaste:05971] Failing at address: 0x7f9c851eae50

I think that this seg fault causes PyFoam to hang.

So is there an option I can pass to PlotRunner to just continue on to the next case after that seg fault?

The seg fault is at the end of the run, so I don't care about it; being able to carry on would be really useful. (I worry that this is already the default behaviour and it somehow just doesn't happen for me. Story of my life.)

Thanks, Eric

PS I use this config:

Code:

    print("Running solver")
    machine = LAMMachine(nr=procnrs[i])
    PlotRunner(args=["--proc=%d"%procnrs[i],
                    "--progress",
                    "--no-continuity",
                    "--hardcopy",
                    "--non-persist",
                    "apfSolidFoam",
                    "-case",work.name])


