CFD Online Forums > Software User Forums > OpenFOAM > OpenFOAM Running, Solving & CFD

OpenFOAM CPU Usage

April 5, 2013, 10:36   #1
musahossein (musaddeque hossein), Senior Member, joined Mar 2009
Dear all:
I am running sloshingTank2D in interDyMFoam. The tank has 40,000 cells. I did this to see how the CPUs are utilized by OpenFOAM. My processor is an Intel i3 M330 @ 2.13 GHz, which shows up as 4 CPUs. I am running Ubuntu 12.10 and OpenFOAM 2.2.0. When I check processor performance, I see that one CPU is running at 99% while the others are below 10%. Is there any way to make OpenFOAM seek out and use all the CPUs? A screenshot of CPU usage while running OpenFOAM on this problem is attached.

Any advice would be greatly appreciated.
Attached Images: Screenshot from 2013-04-05 09:51:31.jpg (43.4 KB)

April 5, 2013, 10:40   #2
akidess (Anton Kidess), Senior Member, Germany, joined May 2009
Have you checked the user manual? http://www.openfoam.org/docs/user/ru...s-parallel.php
__________________
*On twitter @akidTwit
*Spend as much time formulating your questions as you expect people to spend on their answer.

April 5, 2013, 12:14   #3
musahossein (musaddeque hossein), Senior Member, joined Mar 2009
Quote:
Originally Posted by akidess
Have you checked the user manual? http://www.openfoam.org/docs/user/ru...s-parallel.php
Thank you very much for your response. I looked into the decomposeParDict file, and it seems to be set up for parallel processing over several computers. Or does it not care, and simply look for the number of subdomains on the same computer before looking elsewhere?

April 5, 2013, 14:36   #4
erichu (Eric), New Member, joined Mar 2013
Hello,

Are you running foamJob or mpirun? If not, that explains why only one core is used.

April 5, 2013, 14:39   #5
Mojtaba.a (Mojtaba Amiraslanpour), Senior Member, Tampa, US, joined Jun 2011
Quote:
Originally Posted by musahossein
Thank you very much for your response. I looked into the decomposeParDict file, and it seems to be set up for parallel processing over several computers. Or does it not care, and simply look for the number of subdomains on the same computer before looking elsewhere?
You can easily use it on a single computer with multiple processors.
Just set decomposeParDict correctly with respect to the number of processors you have, and you are done.
The rest is in the user manual.
Good luck
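For reference, the relevant part of a system/decomposeParDict for a 4-way run on a single machine might look like this (a sketch only; the coefficient values are illustrative, and the product of n must equal numberOfSubdomains):

```
numberOfSubdomains 4;

method          simple;

simpleCoeffs
{
    n               (2 2 1);    // 2 x 2 x 1 = 4 subdomains
    delta           0.001;
}
```

The real file also carries the usual FoamFile header; only the decomposition entries are shown here.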
__________________
Learn OpenFOAM in Persian
SFO (StarCCM+ FLUENT OpenFOAM) Project Team Member
Complex Heat & Flow Simulation Research Group
If you can't explain it simply, you don't understand it well enough. "Richard Feynman"

April 5, 2013, 17:07   #6
musahossein (musaddeque hossein), Senior Member, joined Mar 2009
I have 2 processors, and each processor has 2 CPUs. So should I set numberOfSubdomains to the number of processors (2) or the number of CPUs (4)? Also, I have to run blockMesh and setFields before running mpirun, correct? Thanks.

April 5, 2013, 17:15   #7
Mojtaba.a (Mojtaba Amiraslanpour), Senior Member, Tampa, US, joined Jun 2011
Quote:
Originally Posted by musahossein
I have 2 processors, and each processor has 2 CPUs. So should I set numberOfSubdomains to the number of processors (2) or the number of CPUs (4)? Also, I have to run blockMesh and setFields before running mpirun, correct? Thanks.
If you are using GNOME, run

Quote:
gnome-system-monitor
Go to the Resources tab and see how many CPUs are listed there under CPU History. That's the number to use in decomposeParDict.

Not sure about setFields, but blockMesh, yes, you must run it before mpirun.
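If you are not on GNOME, the same count is available from the command line (Linux; a quick sketch):

```shell
# Number of logical CPUs visible to the OS (what gnome-system-monitor shows)
nproc
# Roughly equivalent: count the "processor" entries in /proc/cpuinfo
grep -c '^processor' /proc/cpuinfo
```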

April 5, 2013, 18:02   #8
erichu (Eric), New Member, joined Mar 2013
I put a file named 'machines' in my case folder. In this file I write my configuration, e.g.

workstation cpu=2

where 'workstation' is the host name. Subdomains are based on cores, so in your case 4, I think.

My normal procedure is:
0) have the machines file set up
1) blockMesh / ideasUnvToFoam
2) decomposePar
3) foamJob -s -p solverName
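Spelled out for the sloshingTank2D case in this thread, the sequence is roughly the following (a sketch only; these are OpenFOAM commands, so it runs only where OpenFOAM is sourced, and the cp step assumes the damBreak-tutorial convention of keeping a pristine alpha1.org):

```
blockMesh                        # build the mesh (serial)
cp 0/alpha1.org 0/alpha1         # restore the unmodified field, if the case keeps one
setFields                        # initialise the phase fraction (serial)
decomposePar                     # split the case into processor0..processorN-1
foamJob -s -p interDyMFoam       # or: mpirun -np 4 interDyMFoam -parallel > log &
```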

April 7, 2013, 17:56   #9
Running OpenFOAM over multiple CPUs on the same computer
musahossein (musaddeque hossein), Senior Member, joined Mar 2009
Gentlemen:

Here is my attempt to run OpenFOAM over the 4 CPUs that my computer has. I noted that the first error message shows that it cannot find processor0. But I thought OpenFOAM would automatically detect the number of CPUs. Was that assumption incorrect? Thanks for your help and advice.
________________________________________________________________________
musa@ubuntu:~/OpenFOAM/musa-2.2.0/run/tutorials/multiphase/interDyMFoam/ras/sloshingTank2D$ mpirun -np 4 interDyMFoam -parallel >log &
[1] 4814
musa@ubuntu:~/OpenFOAM/musa-2.2.0/run/tutorials/multiphase/interDyMFoam/ras/sloshingTank2D$ [0]
[0]
[0] --> FOAM FATAL ERROR:
[0] interDyMFoam: cannot open case directory "/home/musa/OpenFOAM/musa-2.2.0/run/tutorials/multiphase/interDyMFoam/ras/sloshingTank2D/processor0"
[0]
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 4815 on
node ubuntu exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------

April 7, 2013, 23:03   #10
musahossein (musaddeque hossein), Senior Member, joined Mar 2009
Quote:
Originally Posted by Mojtaba.a
Go to the Resources tab and see how many CPUs are listed there under CPU History. That's the number to use in decomposeParDict.

Not sure about setFields, but blockMesh, yes, you must run it before mpirun.
I am running Ubuntu 12.10. In the Dash there is a way to check system performance, and it shows 4 CPUs. But for some reason decomposePar is still finding errors. Any comments or suggestions will be appreciated. Thank you.

________________________________________________________________________
Decomposing mesh region0

Create mesh

Calculating distribution of cells
Selecting decompositionMethod hierarchical


--> FOAM FATAL ERROR:
Wrong number of processor divisions in geomDecomp:
Number of domains : 4
Wanted decomposition : (2 2 2)

From function geomDecomp::geomDecomp(const dictionary& decompositionDict)
in file geomDecomp/geomDecomp.C at line 50.

FOAM exiting

April 8, 2013, 03:01   #11
erichu (Eric), New Member, joined Mar 2013
Could you upload the decomposeParDict file? It might be easier to find the source of the problem. In any case, (2 2 2) is a decomposition for 8 processors, not for 4.
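The constraint behind the error can be sanity-checked by hand: geomDecomp requires the product of the n coefficients to equal numberOfSubdomains. A quick shell check, using the values from the error message above:

```shell
# hierarchicalCoeffs/simpleCoeffs n = (nx ny nz); decomposePar aborts with
# "Wrong number of processor divisions in geomDecomp" unless
# nx*ny*nz == numberOfSubdomains.
nx=2; ny=2; nz=2
numberOfSubdomains=4
if [ $((nx * ny * nz)) -eq "$numberOfSubdomains" ]; then
    echo "decomposition consistent"
else
    echo "mismatch: ${nx}x${ny}x${nz} = $((nx * ny * nz)), wanted $numberOfSubdomains"
    # prints: mismatch: 2x2x2 = 8, wanted 4
fi
```

Changing n to (2 2 1) or (4 1 1) restores consistency for 4 subdomains.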

April 8, 2013, 03:17   #12
Mojtaba.a (Mojtaba Amiraslanpour), Senior Member, Tampa, US, joined Jun 2011
Quote:
Originally Posted by musahossein
I am running Ubuntu 12.10. In the Dash there is a way to check system performance, and it shows 4 CPUs. But for some reason decomposePar is still finding errors. Any comments or suggestions will be appreciated. Thank you.

________________________________________________________________________
Decomposing mesh region0

Create mesh

Calculating distribution of cells
Selecting decompositionMethod hierarchical


--> FOAM FATAL ERROR:
Wrong number of processor divisions in geomDecomp:
Number of domains : 4
Wanted decomposition : (2 2 2)

From function geomDecomp::geomDecomp(const dictionary& decompositionDict)
in file geomDecomp/geomDecomp.C at line 50.

FOAM exiting
As Eric said, upload your decomposeParDict so we can see how you have configured it.
Also have a look at this:
http://www.cfd-online.com/Forums/ope...tml#post189895

April 9, 2013, 15:19   #13
musahossein (musaddeque hossein), Senior Member, joined Mar 2009
Thanks for your help. decomposePar works now. However, it stalls at the end of the run with the following message:

----- I have deleted the preceding output to keep this to the point -----

Number of processor faces = 812
Max number of cells = 27540 (49.998% above average 18360.2)
Max number of processor patches = 2 (0% above average 2)
Max number of faces between processors = 408 (0.492611% above average 406)

Time = 0

--> FOAM FATAL IO ERROR:
Cannot find patchField entry for lowerWall

file: /home/musa/OpenFOAM/musa-2.2.0/run/tutorials/multiphase/interDyMFoam/ras/sloshingTank2D/0/alpha1.org-old.boundaryField from line 25 to line 33.

From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
in file /home/opencfd/OpenFOAM/OpenFOAM-2.2.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 154.

FOAM exiting
-----------------------------------------------------------------------------------------------------------------------
But my blockMeshDict file has the patches as shown below:
vertices
(
    (-0.05 -0.50 -0.35) // vertex 0: back lower left
    (-0.05  0.50 -0.35) // vertex 1: back lower right
    (-0.05  0.50  0.65) // vertex 2: back upper right
    (-0.05 -0.50  0.65) // vertex 3: back upper left

    ( 0.05 -0.50 -0.35) // vertex 4: front lower left
    ( 0.05  0.50 -0.35) // vertex 5: front lower right
    ( 0.05  0.50  0.65) // vertex 6: front upper right
    ( 0.05 -0.50  0.65) // vertex 7: front upper left
);

blocks
(
    // block0
    hex (0 1 2 3 4 5 6 7)
    (271 271 1)
    simpleGrading (1 1 1)
);

boundary
(
    lowerWall
    {
        type patch;
        faces
        (
            (0 1 5 4)
        );
    }
    rightWall
    {
        type patch;
        faces
        (
            (1 2 6 5)
        );
    }
    atmosphere
    {
        type patch;
        faces
        (
            (2 3 7 6)
        );
    }
    leftWall
    {
        type patch;
        faces
        (
            (0 4 7 3)
        );
    }
    frontAndBack
    {
        type empty;    // note: the type keyword is lowercase "empty"
        faces
        (
            (4 5 6 7)
            (0 3 2 1)
        );
    }
);

Any comments or suggestions would be appreciated. Thank you.
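For context, the "Cannot find patchField entry" error points at the 0/ field files rather than blockMeshDict: every patch defined in the mesh needs a matching entry in the boundaryField of each field file. A sketch of the relevant part of 0/alpha1 (the boundary-condition types here are illustrative, loosely following the sloshingTank2D tutorial, not taken from this case):

```
boundaryField
{
    lowerWall
    {
        type            zeroGradient;
    }
    rightWall
    {
        type            zeroGradient;
    }
    leftWall
    {
        type            zeroGradient;
    }
    atmosphere
    {
        type            inletOutlet;
        inletValue      uniform 0;
        value           uniform 0;
    }
    frontAndBack
    {
        type            empty;
    }
}
```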

April 9, 2013, 16:06   #14
erichu (Eric), New Member, joined Mar 2013
I wonder if you are missing the patch name in one of the U, p, ... files? Upload the boundary files as well, and we can check them if you cannot find the source of the problem.

April 9, 2013, 19:24   #15
musahossein (musaddeque hossein), Senior Member, joined Mar 2009
Quote:
Originally Posted by erichu
I wonder if you are missing the patch name in one of the U, p, ... files? Upload the boundary files as well, and we can check them if you cannot find the source of your problem.
Actually, I figured out what the problem was. OpenFOAM reads all the files in the "0" folder. I had kept the original and revised versions of the alpha1, p, and U files in that folder, and it was trying to read all of them, even though they were named alpha1-old, p-old, and U-old. Once I got rid of them, decomposePar ran without any problems. But when I do the MPI run, I get another error:


musa@ubuntu:~/OpenFOAM/musa-2.2.0/run/tutorials/multiphase/interDyMFoam/ras/sloshingTank2D$ mpirun -np 4 interDyMFoam -parallel >log &
[1] 2663
musa@ubuntu:~/OpenFOAM/musa-2.2.0/run/tutorials/multiphase/interDyMFoam/ras/sloshingTank2D$ [0]
[0]
[0] --> FOAM FATAL IO ERROR:
[0] cannot find file
[0]
[0] file: /home/musa/OpenFOAM/musa-2.2.0/run/tutorials/multiphase/interDyMFoam/ras/sloshingTank2D/processor0/0/alpha1 at line 0.
[0]
[0] From function regIOobject::readStream()
[0] in file db/regIOobject/regIOobjectRead.C at line 73.
[0]
FOAM parallel run exiting
[0]
[2]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] cannot find file
[2]
[2] file: /home/musa/OpenFOAM/musa-2.2.0/run/tutorials/multiphase/interDyMFoam/ras/sloshingTank2D/processor2/0/alpha1 at line 0.
[2]
[3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] cannot find file
[3]
[3] file: /home/musa/OpenFOAM/musa-2.2.0/run/tutorials/multiphase/interDyMFoam/ras/sloshingTank2D/processor3/0/alpha1 at line 0.
[3]
[3] From function regIOobject::readStream()
[3] in file db/regIOobject/regIOobjectRead.C at line 73.
[3]
FOAM parallel run exiting
[3]
[2] From function regIOobject::readStream()
[2] in file db/regIOobject/regIOobjectRead.C at line 73.
[2]
FOAM parallel run exiting
[2]
[1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] cannot find file
[1]
[1] file: /home/musa/OpenFOAM/musa-2.2.0/run/tutorials/multiphase/interDyMFoam/ras/sloshingTank2D/processor1/0/alpha1 at line 0.
[1]
[1] From function regIOobject::readStream()
[1] in file db/regIOobject/regIOobjectRead.C at line 73.
[1]
FOAM parallel run exiting
[1]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 3 with PID 2667 on
node ubuntu exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[ubuntu:02663] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[ubuntu:02663] Set MCA parameter "orte_base_help_aggregate" to 0 to see all helpmusa@ubuntu:~/OpenFOAM/musa-2.2.0/run/tutorials/multiphase/interDyMFoam/ras/sloshingTank2D$

April 9, 2013, 19:51   #16
musahossein (musaddeque hossein), Senior Member, joined Mar 2009
If there are 4 CPUs, then are the MPI error messages requiring that there be an alpha1.org file for each processor?

April 9, 2013, 21:05   #17
i3 multicore performance
JR22 (Jose Rey), Senior Member, joined Oct 2012
Does your computer have:
1. Two separate i3 processors with two cores each, or
2. One dual-core i3 processor with hyperthreading (aka multithreading)?

If you have case #1, you should see 8 logical CPUs; if you have case #2, you should see 4 logical CPUs. Only 50% of the logical CPUs have the power to crunch data at full speed, although for some jobs the other 50% can give you an edge. What I am saying is that your model might finish faster with the decomposition set to two CPUs rather than four. It also might explain your original observation of two CPUs working harder than the other two.

This is a good post that touches on the subject of hyperthreading:
http://www.cfd-online.com/Forums/ope...processor.html

Last edited by JR22; April 9, 2013 at 21:59.
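On Linux you can check whether the extra logical CPUs are hyperthreads rather than physical cores (a sketch; it assumes /proc/cpuinfo is available and reports the "cpu cores" field):

```shell
# Logical CPUs seen by the scheduler
logical=$(grep -c '^processor' /proc/cpuinfo)
# Physical cores per socket, as reported by the first CPU entry
cores=$(awk -F': ' '/^cpu cores/ {print $2; exit}' /proc/cpuinfo)
cores=${cores:-$logical}    # fall back if the field is missing (e.g. some VMs)
echo "logical CPUs: $logical, cores per socket: $cores"
# If logical exceeds cores x sockets, the difference is hyperthreading.
```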

April 10, 2013, 01:06   #18
erichu (Eric), New Member, joined Mar 2013
Quote:
Originally Posted by musahossein
If there are 4 CPUs, then are the MPI error messages requiring that there be an alpha1.org file for each processor?

I have never used the solver you are using, but I cannot imagine that you need .org files. If the setup has changed, you might also need to run decomposePar -force to update the processor folders.

In general, I would say that each decomposed processor directory needs a full set of boundary files.

I ran the interDyMFoam sloshingTank2D (ras) tutorial using ./Allrun, aborted it when the solver started, then ran decomposePar and finally
foamJob -s -p interDyMFoam

April 10, 2013, 03:52   #19
hakonbar (Håkon Bartnes Line), New Member, joined Mar 2013
Have you done the damBreak tutorial described in the user guide? It's a pretty good step-by-step walkthrough of setting up a parallel run, and it's even multiphase, so it uses the alpha variable.

By the way, I think you forgot to rename the file called "alpha1.org" to "alpha1" before decomposing. The ".org" ending doesn't do anything; it just keeps a backup of the alpha1 file that is not modified by the setFields application. This is also described in the damBreak tutorial.
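That renaming step can be sketched as a small shell sequence (here a throwaway directory with a dummy file stands in for the real case, so the paths and contents are illustrative):

```shell
casedir=$(mktemp -d)             # stand-in for the real sloshingTank2D case
mkdir -p "$casedir/0"
echo "pristine field" > "$casedir/0/alpha1.org"

# Restore the unmodified field under the name the solver expects,
# keeping alpha1.org as the untouched backup:
cp "$casedir/0/alpha1.org" "$casedir/0/alpha1"
ls "$casedir/0"                  # alpha1  alpha1.org
```

In the real case this cp runs before setFields and decomposePar, so every processor directory receives a valid 0/alpha1.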

April 10, 2013, 20:54   #20
musahossein (musaddeque hossein), Senior Member, joined Mar 2009
To all the forum members who put their time and effort into helping me out and responding to my posts: a heartfelt thank you. Parallel processing finally worked. The reasons it was not working are as follows:

1. decomposePar reads all the files in the "0" folder. So it not only read the alpha1, p, and U files that I had modified, but also the original files, which I had saved with suffixes such as "old" or "original". The error came from decomposePar reading the original files after reading the files I had modified.

2. The ./Allclean script kept deleting the alpha1 and alpha1.org files with the "rm" command, so I commented out that line.

3. I didn't realize that in decomposeParDict you should keep only the method you want to use and comment out or delete the other options.

After I took care of these items, parallel processing works very well, and the system monitor shows all 4 CPUs running at 99%-100%.
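The first point above, that decomposePar parses every file in the 0/ directory, suggests keeping backup copies somewhere else entirely. A sketch (a throwaway directory with dummy files stands in for the real case):

```shell
casedir=$(mktemp -d)                    # stand-in for the real case directory
mkdir -p "$casedir/0" "$casedir/backup"
touch "$casedir/0/alpha1" "$casedir/0/p" "$casedir/0/U" "$casedir/0/U-old"

# Move stale copies out of 0/ so decomposePar never sees them:
mv "$casedir"/0/*-old "$casedir/backup/"
ls "$casedir/0"                         # U-old is gone; alpha1, p, U remain
```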

