CFD Online (www.cfd-online.com)
Home > Forums > OpenFOAM Running, Solving & CFD

Parallel Problems

August 16, 2012, 06:28   #1
Parallel Problems
James Davies (New Member; Join Date: Aug 2012; Posts: 7)
Hi,

I'm very new to OpenFOAM. I've managed to get simulations running properly, but now I'm trying to move on to bigger and better things.

This is very basic, but I want to run a simulation across the 4 CPUs in my machine. I have been able to run decomposePar, but I'm having trouble actually running the simulation in parallel.

I have been entering:

mpirun -np 4 simpleFoam -case $FOAM_RUN/ConvergingPipe -parallel > log

but it will not run; I'm getting a variety of error messages, none of which I can make sense of!

Any help appreciated.

Cheers

Tommaso
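For reference, the usual decompose-run-reconstruct sequence can be sketched as below. This is a dry run that only echoes the commands (so the sequence can be shown without an OpenFOAM install); drop the DRYRUN guard and run it from inside the case directory to execute for real. The solver and processor count are taken from the post.

```shell
# Sketch of the usual OpenFOAM parallel workflow. Assumes a sourced
# OpenFOAM environment when actually executed; with DRYRUN=1 the
# commands are only printed, not run.
DRYRUN=1
run() { if [ "$DRYRUN" = 1 ]; then echo "+ $*"; else "$@"; fi; }

run decomposePar                        # split mesh/fields into processor0..processor3
run mpirun -np 4 simpleFoam -parallel   # launch from inside the case directory
run reconstructPar                      # merge processor results afterwards
```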

August 16, 2012, 07:11   #2
Niklas Nordin (Super Moderator; Join Date: Mar 2009; Location: Stockholm, Sweden; Posts: 692)
...and you won't get any help if you don't post the error message.

August 16, 2012, 08:11   #3
James Davies (New Member; Join Date: Aug 2012; Posts: 7)
Sorry about that. When I tried to run it this time, this is what I entered and what came up:

lobby@lobby-desktop:~/OpenFOAM/lobby-2.0.1/run/James/Basic/pipe/simpleFoam/ConvergingPipe$ mpirun -np 4 simpleFoam -case $FOAM_RUN/James/Basic/pipe/simpleFoam/ConvergingPipe -parallel > log
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] Cannot read "/home/lobby/OpenFOAM/lobby-2.1.1/run/James/Basic/pipe/simpleFoam/ConvergingPipe/system/decomposeParDict"
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 3327 on
node lobby-desktop exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).

But decomposeParDict is in the system folder under that case??
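One detail worth checking here: the shell prompt shows the case under OpenFOAM/lobby-2.0.1/run, but the error complains about a path under lobby-2.1.1/run, which suggests $FOAM_RUN resolves to a different installation than the directory being worked in. A quick sanity check before launching mpirun (the case path below is the one from the post, purely for illustration):

```shell
# Where does $FOAM_RUN actually point, and does the dictionary the
# solver will try to read exist there? Adjust CASE to your own setup.
CASE="$FOAM_RUN/James/Basic/pipe/simpleFoam/ConvergingPipe"
echo "FOAM_RUN = $FOAM_RUN"
if [ -f "$CASE/system/decomposeParDict" ]; then
    echo "decomposeParDict found"
else
    echo "decomposeParDict NOT found at $CASE/system"
fi
```

If the echoed path is not where decomposePar was run, the solver is reading (or failing to read) a different case entirely.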

August 16, 2012, 08:38   #4
Niklas Nordin (Super Moderator; Join Date: Mar 2009; Location: Stockholm, Sweden; Posts: 692)
How were you able to run decomposePar without the decomposeParDict?

August 16, 2012, 08:40   #5
James Davies (New Member; Join Date: Aug 2012; Posts: 7)
decomposeParDict IS in that folder. I ran decomposePar fine; it's just when I try to start the simulation that it won't read it, I guess?

August 16, 2012, 08:56   #6
Niklas Nordin (Super Moderator; Join Date: Mar 2009; Location: Stockholm, Sweden; Posts: 692)
Hmmm...

So if you go to that case directory and type

mpirun -np 4 simpleFoam -parallel

does that work?

August 16, 2012, 09:07   #7
James Davies (New Member; Join Date: Aug 2012; Posts: 7)
Yesss!! I swapped out the old decomposeParDict for a fresh one from the tutorials, and it seems to be working now. Thanks!
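For anyone hitting the same thing, a minimal decomposeParDict for 4 subdomains, in the same shape as the 2.x tutorial files, looks roughly like the sketch below. The n and delta values are illustrative (they assume splitting a pipe into 4 slabs along x); adjust them to your own geometry.

```
/* system/decomposeParDict -- illustrative sketch */
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains 4;

method          simple;          // simple geometric decomposition

simpleCoeffs
{
    n               (4 1 1);     // 4 pieces in x, 1 in y and z
    delta           0.001;
}
```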

October 30, 2012, 15:10   #8
Simple 2D parallel heat conduction
mahdiiowa (Member; Join Date: Apr 2011; Posts: 92)
I am running a simple 2D heat conduction problem in parallel. The domain is decomposed into 4 equal regions in the horizontal direction. The temperature contour looks like file 4 in the attachment. However, when I run the case on 1 processor, the contour looks like file 1. As you can see, the temperatures do not seem right at the interfaces between the subdomains solved by different processors. Any idea what is wrong?
Attached Images: 1.jpg, 4.jpg

Tags: openfoam, parallel, parallel processing

