CFD Online Discussion Forums

CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   MPI InterFoam (https://www.cfd-online.com/Forums/openfoam-solving/79567-mpi-interfoam.html)

metro August 27, 2010 03:42

MPI InterFoam
 
Hey All,

I am trying to run interFoam (on 1 node) using MPI. I am running OpenFOAM 1.7.1 on Ubuntu Lucid. My PC is an AMD Phenom X6 (6-core processor) with 8 GB RAM, but I keep getting the following error:


metro@ubuntu:~/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong$ [1] [3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] cannot open file
[3]
[3] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor3/system/controlDict at line 0.
[3]
[3] From function regIOobject::readStream()
[3] in file db/regIOobject/regIOobjectRead.C at line 61.
[3]
FOAM parallel run exiting
[3]
[4]
[4]
[4] --> FOAM FATAL IO ERROR:
[4] cannot open file
[4]
[4] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor4/system/controlDict at line 0.
[4]
[4] From function regIOobject::readStream()
[4] in file db/regIOobject/regIOobjectRead.C at line 61.
[4]
FOAM parallel run exiting
[4]
[5]
[5]
[5] --> FOAM FATAL IO ERROR:
[5] cannot open file
[5]
[5] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor5/system/controlDict at line 0.
[5]
[5] From function regIOobject::readStream()
[5] in file db/regIOobject/regIOobjectRead.C at line 61.
[5]
FOAM parallel run exiting
[5]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] interFoam: cannot open case directory "/home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor0"
[0]
[0]
FOAM parallel run exiting
[0]
[2]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] cannot open file
[2]
[2] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor2/system/controlDict at line 0.
[2]
[2] From function regIOobject::readStream()
[2] in file db/regIOobject/regIOobjectRead.C at line 61.
[2]
FOAM parallel run exiting
[2]

[1]
[1] --> FOAM FATAL IO ERROR:
[1] cannot open file
[1]
[1] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor1/system/controlDict at line 0.
[1]
[1] From function regIOobject::readStream()
[1] in file db/regIOobject/regIOobjectRead.C at line 61.
[1]
FOAM parallel run exiting
[1]
--------------------------------------------------------------------------
mpirun has exited due to process rank 2 with PID 11141 on
node ubuntu exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[ubuntu:11138] 5 more processes have sent help message help-mpi-api.txt / mpi-abort
[ubuntu:11138] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

I have set it up so that it will divide my mesh into 6 pieces along the z axis. Can anyone please help me? Are there any additional settings I am meant to set up before using MPI?

Regards

Metro
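
For reference, a minimal system/decomposeParDict matching metro's description (6 subdomains, split along z with the simple method) would look roughly like the sketch below. The n and delta values are assumptions for illustration, not taken from the thread:

```
numberOfSubdomains 6;

method          simple;

simpleCoeffs
{
    n               (1 1 6);    // 1 x 1 x 6: split only along z
    delta           0.001;
}
```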

nuria87 October 15, 2010 03:59

Hi!

Did you succeed in resolving the problem? I have the same one...

Thank you!

AlmostSurelyRob November 7, 2010 09:49

Hello,

I am not sure if this is relevant, but I have seen similar errors when I forgot to run decomposePar. Have you executed decomposePar before running the interFoam solver?
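
A quick sanity check along these lines (a generic shell sketch, not from the thread) is to count the processor* directories that decomposePar should have written; zero directories means the case has not been decomposed:

```shell
#!/bin/sh
# Count the processor* directories that decomposePar writes into a case
# directory. If the count is zero, the case has not been decomposed and a
# -parallel run will fail with "cannot open case directory .../processorN".
count_subdomains() {
    ls -d "$1"/processor* 2>/dev/null | wc -l
}
```

For example, `count_subdomains "$PWD"` prints 0 in an undecomposed case directory.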

preichl December 14, 2010 22:41

I saw this error when I incorrectly set the directories in the roots section of the system/decomposeParDict file (when running in parallel on a distributed system).
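
For a distributed run, the roots entry in system/decomposeParDict lists the case root on each remote machine. A hedged sketch (the paths are hypothetical, and the exact convention should be checked against the decomposePar documentation for your version):

```
distributed     yes;

roots
(
    "/mnt/node1/OpenFOAM/run"
    "/mnt/node2/OpenFOAM/run"
);
```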

malaboss March 26, 2013 06:15

I ran into a similar error when trying to simulate a case with a bash script.

When I run decomposePar and then mpirun from the script, I receive:

cannot open case directory "/media/windows/OpenFOAM/vent-2.1.1/run/cylindre/turbulence/Spalart/pimple/domaine_elargi/cylindreRE1000000_53/processor0"

However, when I run decomposePar by hand in the case directory and then run mpirun with the script, everything is fine.

I just don't understand :confused:

Here is what I do in the script:
Code:


cd $2
# $2 is a parameter the user gives when calling the script
if [ ! -d $chemin/processor0 ]
then
    decomposePar > logdecompose &
fi
mpirun -np $nbCPU pimpleFoam -parallel > log &


preichl March 26, 2013 15:19

Hi Malaboss,

Have you checked that the script is doing what you think it is doing? I would try using echo statements to write something to the screen when it enters each section.
Does this error message show up in the logdecompose file or in the log file, and, more importantly, does the directory in the error message exist?

My suspicion is that the decompose part of the script is not actually being run, but that is only a guess. I would try a simpler bash script with just the two commands in it and see how that goes.

I assume that the roots section of the system/decomposeParDict has been set correctly.

The other possibility is that the bash script is not inheriting settings but I think the other possibilities are more likely.

Cheers,

Paul.

malaboss March 26, 2013 17:36

Hi Paul, and thanks for answering

Quote:

Originally Posted by preichl (Post 416594)

Have you checked that the script is doing what you think it is doing?. I would try using echo statements to get it to write something to the screen when it enters each section.

Yes, I already checked it by putting a lot of echo statements in the script. I see we have the same reflexes when trying to find where an error comes from :)
My variables are initialized correctly. As proof, my code works only if I run decomposePar before running the script (and, obviously, if I remove the decomposePar part from my code).


Quote:

Originally Posted by preichl (Post 416594)
Does this error message show up in the logdecompose file or in the log file and more importantly does the directory in the error message exist?

The error message does not show up in the logdecompose file, but in the terminal in which I run the script. I think the decomposition is done correctly.
Also, the directory exists.


Quote:

Originally Posted by preichl (Post 416594)
I assume that the roots section of the system/decomposeParDict has been set correctly.

Actually, I did not put anything in the roots part of decomposeParDict. I thought it was only useful when working across different computers. In my case, I work on a single computer with several processors.


I am thinking about something else: I ran decomposePar with "&" at the end, which means the decomposition starts in the background and the following commands run immediately. So I may not be waiting for the decomposition to finish, and may be running the mpirun part too early.
That could explain why OpenFOAM refuses to open the processor directory.

I will check tomorrow morning (in France^^) and tell you if that is it.

Thank you again !

malaboss March 27, 2013 03:24

Hi Paul,
I just checked my code today and dropped the "&" at the end of the decomposePar line.
Now everything works.
The problem was that I was trying to run the case before the decomposition of the case had finished.
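
For reference, a race-free version of the script (a sketch reusing the names from the original post: the case directory argument, nbCPU, logdecompose) simply runs decomposePar in the foreground, so the processor* directories exist before mpirun starts:

```shell
# Race-free version of the run script: decomposePar runs in the foreground
# (no trailing '&'), so the function blocks until the processor* directories
# have been written, and only then launches the solver.
run_case() {
    cd "$1" || return 1
    if [ ! -d processor0 ]; then
        decomposePar > logdecompose 2>&1 || return 1
    fi
    mpirun -np "$2" pimpleFoam -parallel > log 2>&1 &
}
```

An equivalent fix is to keep the "&" after decomposePar but call the shell builtin `wait` before the mpirun line.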

Thanks for everything!

