CFD Online Forums > OpenFOAM Running, Solving & CFD

MPI InterFoam


August 27, 2010, 03:42   #1
metro
New Member
Join Date: May 2010
Posts: 27
Hey All,

I am trying to run interFoam on a single node using MPI. I am running OpenFOAM 1.7.1 on Ubuntu Lucid. My PC has an AMD Phenom X6 (6-core) processor and 8 GB of RAM, but I keep getting the following error:


metro@ubuntu:~/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong$ [1] [3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] cannot open file
[3]
[3] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor3/system/controlDict at line 0.
[3]
[3] From function regIOobject::readStream()
[3] in file db/regIOobject/regIOobjectRead.C at line 61.
[3]
FOAM parallel run exiting
[3]
[4]
[4]
[4] --> FOAM FATAL IO ERROR:
[4] cannot open file
[4]
[4] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor4/system/controlDict at line 0.
[4]
[4] From function regIOobject::readStream()
[4] in file db/regIOobject/regIOobjectRead.C at line 61.
[4]
FOAM parallel run exiting
[4]
[5]
[5]
[5] --> FOAM FATAL IO ERROR:
[5] cannot open file
[5]
[5] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor5/system/controlDict at line 0.
[5]
[5] From function regIOobject::readStream()
[5] in file db/regIOobject/regIOobjectRead.C at line 61.
[5]
FOAM parallel run exiting
[5]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] interFoam: cannot open case directory "/home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor0"
[0]
[0]
FOAM parallel run exiting
[0]
[2]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] cannot open file
[2]
[2] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor2/system/controlDict at line 0.
[2]
[2] From function regIOobject::readStream()
[2] in file db/regIOobject/regIOobjectRead.C at line 61.
[2]
FOAM parallel run exiting
[2]

[1]
[1] --> FOAM FATAL IO ERROR:
[1] cannot open file
[1]
[1] file: /home/metro/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong/processor1/system/controlDict at line 0.
[1]
[1] From function regIOobject::readStream()
[1] in file db/regIOobject/regIOobjectRead.C at line 61.
[1]
FOAM parallel run exiting
[1]
--------------------------------------------------------------------------
mpirun has exited due to process rank 2 with PID 11141 on
node ubuntu exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[ubuntu:11138] 5 more processes have sent help message help-mpi-api.txt / mpi-abort
[ubuntu:11138] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

I have set it up to divide my mesh into 6 pieces along the z axis. Can anyone please help me? Are there any additional settings I need to configure before using MPI?
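For context, a split into 6 subdomains along z is normally configured in system/decomposeParDict along these lines (a generic sketch, not the poster's actual file; the delta value and method are typical defaults):

```
// Sketch of system/decomposeParDict for 6 subdomains along z
numberOfSubdomains 6;
method          simple;
simpleCoeffs
{
    n           (1 1 6);   // pieces in x, y, z
    delta       0.001;
}
```

The "cannot open file .../processorN/system/controlDict" errors mean the solver is looking for processor directories that this decomposition step would have created.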

Regards

Metro

October 15, 2010, 03:59   #2
nuria87 (nuria llamas)
New Member
Join Date: Jul 2010
Posts: 2
Hi!

Did you succeed in resolving the problem? I have the same one...

Thank you!

November 7, 2010, 10:49   #3
AlmostSurelyRob (Robert Sawko)
Senior Member
Join Date: Mar 2009
Posts: 117
Hello,

I am not sure if this is relevant, but I have seen similar errors when I forgot to run decomposePar. Have you executed decomposePar before running the interFoam solver?
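For reference, the usual sequence is roughly the following (a sketch; the case path is taken from the error messages above, and the commands are the standard OpenFOAM tools):

```shell
cd ~/OpenFOAM/metro-1.7.1/run/run/August/twoPhasekelong
decomposePar                       # creates processor0 .. processor5 from decomposeParDict
mpirun -np 6 interFoam -parallel   # run the solver on the decomposed case
```

If decomposePar has not been run, the processorN directories do not exist, which matches the "cannot open case directory" and "cannot open file" errors in the first post.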

December 14, 2010, 23:41   #4
preichl (Paul Reichl)
Member
Join Date: Feb 2010
Location: Melbourne, Victoria, Australia
Posts: 33
I saw this error when I incorrectly set the directories in the roots section of the system/decomposeParDict file (when running in parallel on a distributed system).
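For anyone reading later: on a distributed run those entries look roughly like this (a sketch; the paths are placeholders, and to my recollection roots takes one entry per processor after the first):

```
distributed     yes;
roots
(
    "/local/scratch/run"    // case root on the node running processor1
    "/local/scratch/run"    // case root on the node running processor2
    // ... one entry per remaining processor
);
```

A wrong path here makes the remote ranks fail to find their processorN directories, producing the same "cannot open file" error.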

March 26, 2013, 07:15   #5
malaboss (Malik)
Member
Join Date: Dec 2012
Location: Austin, USA
Posts: 52
I ran into a similar error when I wanted to run a case with a bash script.

When I run decomposePar and then mpirun from the script, I get:

cannot open case directory "/media/windows/OpenFOAM/vent-2.1.1/run/cylindre/turbulence/Spalart/pimple/domaine_elargi/cylindreRE1000000_53/processor0"

However, when I run decomposePar by hand in the case directory and then run mpirun with the script, everything is fine.

I just don't understand it.

Here is what I do in the script :
Code:
cd $2
# $2 is a parameter the user gives when calling the script
if [ ! -d $chemin/processor0 ]
then
    decomposePar > logdecompose &
fi
mpirun -np $nbCPU pimpleFoam -parallel > log &

March 26, 2013, 16:19   #6
preichl (Paul Reichl)
Member
Join Date: Feb 2010
Location: Melbourne, Victoria, Australia
Posts: 33
Hi Malaboss,

Have you checked that the script is doing what you think it is doing? I would try using echo statements to write something to the screen when it enters each section.
Does this error message show up in the logdecompose file or in the log file, and, more importantly, does the directory in the error message exist?

My suspicion is that the decompose part of the script is not actually being run, but that is only a guess. I would try a simpler bash script with just the two commands in it and see how that goes.

I assume that the roots section of the system/decomposeParDict has been set correctly.

The other possibility is that the bash script is not inheriting settings, but I think the other possibilities are more likely.

Cheers,

Paul.

March 26, 2013, 18:36   #7
malaboss (Malik)
Member
Join Date: Dec 2012
Location: Austin, USA
Posts: 52
Hi Paul, and thanks for answering.

Quote:
Originally Posted by preichl

Have you checked that the script is doing what you think it is doing?. I would try using echo statements to get it to write something to the screen when it enters each section.
Yes, I had already checked it by putting a lot of echo statements in the script. I see we have the same reflexes when trying to find out where an error comes from.
My variables are correctly initialized. As proof, my code works if I run decomposePar before running the script (and, obviously, remove the decomposePar part from the script).


Quote:
Originally Posted by preichl
Does this error message show up in the logdecompose file or in the log file and more importantly does the directory in the error message exist?
The error message does not show up in the logdecompose file, but in the terminal in which I run the script. I think the decomposition is done correctly.
Also, the directory exists.


Quote:
Originally Posted by preichl
I assume that the roots section of the system/decomposeParDict has been set correctly.
Actually, I did not put anything in the roots part of decomposeParDict. I thought it was only useful when you want to run on different computers. In my case, I work on a single computer with several processors.


I am thinking about something else: I ran decomposePar with "&" at the end, which means the decomposition starts in the background and the following commands run immediately.
So I may not be waiting for the end of the decomposition, and may be starting the mpirun part too early.
That would explain why OpenFOAM refuses to open the processor directory.

I will check tomorrow morning (in France^^) and let you know if that is it.

Thank you again!
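The race described here can be reproduced without OpenFOAM at all. This sketch (generic commands, not from the thread) mimics a backgrounded decomposePar with a slow job that creates a marker file in place of processor0:

```shell
#!/bin/sh
# Simulate 'decomposePar &': a background job that takes a moment
# to produce its output (a marker file standing in for processor0/).
marker="/tmp/decomp_done_$$"
rm -f "$marker"
( sleep 1; touch "$marker" ) &             # backgrounded, like the script in post #5
before_wait=$([ -e "$marker" ] && echo yes || echo no)
wait                                        # block until the background job finishes
after_wait=$([ -e "$marker" ] && echo yes || echo no)
echo "marker before wait: $before_wait, after wait: $after_wait"
rm -f "$marker"
```

Immediately after launching the background job the marker does not exist yet, which is exactly the window in which mpirun would fail to find the processor directories; after `wait` it does.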

March 27, 2013, 04:24   #8
malaboss (Malik)
Member
Join Date: Dec 2012
Location: Austin, USA
Posts: 52
Hi Paul,
I just checked my code today and dropped the "&" at the end of the decomposePar line.
Now everything works.
The problem was that I was trying to run the case before the decomposition had finished.

Thanks for everything!
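For later readers, the fix amounts to dropping the "&" so the script blocks until decomposePar completes (a sketch using the same placeholder variables $2, $chemin and $nbCPU as the script in post #5):

```shell
cd $2
# $2 is the case directory given by the user
if [ ! -d $chemin/processor0 ]
then
    decomposePar > logdecompose    # no trailing '&': wait for decomposition to finish
fi
mpirun -np $nbCPU pimpleFoam -parallel > log &
```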
