
HPC multi-node parallel error


October 4, 2018, 21:09
shixiangyu
New Member
Join Date: Nov 2017
Posts: 7
Hi guys, I have a problem when I use mpirun --hostfile ~~~. I have attached some screenshots below.

The error always shows "All nodes which are allocated for this job are already filled.", yet with the "top" command, %CPU is 0 on each node!
I tried many variants of the mpirun command:

mpirun --hostfile machines -np 20 interFoam -parallel
foamJob -p interFoam
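
For context, that Open MPI error usually means more ranks were requested than there are slots defined in the hostfile. A minimal sketch of a machines file for a 3-node cluster, assuming the node names from the /etc/hosts excerpt below (the slot counts here are placeholders, not values from this post):

    node0 slots=8
    node1 slots=8
    node2 slots=8

With -np 20, the slots across all listed hosts must add up to at least 20, or Open MPI refuses to map the extra processes. Note also that interFoam -parallel expects the case to have already been decomposed into exactly 20 subdomains, i.e. numberOfSubdomains 20; in system/decomposeParDict, followed by running decomposePar.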

The HPC cluster has 3 nodes. I also looked at the /etc/hosts file, which contains:

    localhost.localdomain localhost
    naoe02-PC anode0 a0 node0 cw0
    anode1 a1 node1
    anode2 a2 node2
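
For reference, each /etc/hosts line normally begins with an IP address followed by the hostname and its aliases; a sketch with placeholder addresses, since the real IPs are not shown in the post:

    192.168.1.10  naoe02-PC anode0 a0 node0 cw0
    192.168.1.11  anode1 a1 node1
    192.168.1.12  anode2 a2 node2

All nodes should resolve one another's names to the same addresses, and passwordless SSH between the nodes is needed for mpirun to launch remote processes.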

Has anyone solved this problem? Please help.
Attached images: 1.PNG, 2.PNG, 3.PNG (screenshots of the error output)

