Cannot read decomposeParDict during mpirun

April 2, 2017, 23:33   #1
MikeC (Mike), New Member, Join Date: Dec 2016, Posts: 14
I'm trying to set up a local compute cluster (currently just 2 nodes), using Ubuntu on master & slave, OpenMPI, and OpenFOAM 4.1. I've shared a folder on the master, '/nfs/TEST/', with the slave and verified that read/write access works. I've set up the environment variables on the slave so they work in non-interactive mode. If I execute
Code:
foamJob -s -p rhoCentralFoam
from the slave, where the hostfile just points to the slave, the job runs fine. Same goes if I execute from the master and have the hostfile only point to the master. However, if I execute from the master and have the hostfile point to the slave (or vice versa), I get the following error:

[0] --> FOAM FATAL ERROR:
[0] Cannot read "/home/michael/system/decomposeParDict"

My working directory is '/nfs/TEST' though (my username on both nodes is 'michael'), and I had already decomposed the case. If I execute
Code:
mpirun -np 16 -hostfile machines rhoCentralFoam -parallel > log
it fails, saying "mpirun was unable to find the specified executable file". What is going on?
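For reference, the 'machines' hostfile mentioned above is a plain OpenMPI hostfile. A minimal sketch of one (the hostnames 'master' and 'slave' and the slot counts are placeholders, not my actual file):

Code:
# OpenMPI hostfile: one node per line with the number of MPI slots it offers
master slots=8
slave  slots=8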

April 3, 2017, 03:10   #2
alexeym (Alexey Matveichev), Senior Member, Nancy, France, Join Date: Aug 2011, Posts: 1,930
Hi,

How do you execute that line (mpirun -np 16 ...)? Somehow your working directory becomes $HOME instead of /nfs/TEST (and so the solver tries to find the decomposition dictionary in /home/michael/system). I have seen this behaviour in batch systems; I do not know if that is your case.
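
A quick way to check (just a diagnostic sketch, reusing the same 'machines' hostfile; /bin/pwd is used because it is an ordinary executable present on every node) is to let mpirun print the working directory each rank actually gets:

Code:
mpirun -np 2 -hostfile machines /bin/pwd
If the rank on the slave prints /home/michael instead of /nfs/TEST, the solver will look for system/decomposeParDict in the wrong place.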

April 3, 2017, 08:18   #3
MikeC (Mike), New Member, Join Date: Dec 2016, Posts: 14
Hi!

I'm just typing it into a standard bash terminal. I haven't set up any special batch system.

April 3, 2017, 15:51   #4
alexeym (Alexey Matveichev), Senior Member, Nancy, France, Join Date: Aug 2011, Posts: 1,930
What if you add the -case flag? For example, like this:

Code:
-case $(pwd)
Or you can put the real case path instead of the pwd invocation.

Does your setup keep the current working directory across nodes? I.e., what is the cwd after you ssh into the slave?
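
For example (a sketch, assuming the slave is reachable by the hostname 'slave'), you can check where a non-interactive ssh session lands and whether the case is visible at the same path there:

Code:
# a non-interactive ssh command starts in $HOME on the remote side
ssh slave pwd
# check that the shared case is mounted at the same path on the slave
ssh slave ls /nfs/TEST/system/decomposeParDict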

April 3, 2017, 23:58   #5
MikeC (Mike), New Member, Join Date: Dec 2016, Posts: 14
When I add the -case flag to the mpirun command, it errors out saying there is bad input to MPI. If I add it to the foamJob command, I get the same error as before. That is a good idea though: when I ssh into a node, it takes me to the home directory, not the common working directory. How do I fix that?

April 4, 2017, 01:56   #6
alexeym (Alexey Matveichev), Senior Member, Nancy, France, Join Date: Aug 2011, Posts: 1,930
It would be easier to diagnose if you posted the error using copy-paste. What is the output of

Code:
mpirun -np 16 -hostfile machines rhoCentralFoam -parallel -case /nfs/TEST > log
?

Concerning your last question: https://www.google.com/search?q=ssh+preserve+cwd.

April 4, 2017, 09:01   #7
MikeC (Mike), New Member, Join Date: Dec 2016, Posts: 14
The full error message is:
Code:
--------------------------------------------------------------------------
mpirun was unable to find the specified executable file, and therefore
did not launch the job.  This error was first reported for process
rank 0; it may have occurred for other processes as well.

NOTE: A common cause for this error is misspelling a mpirun command
      line parameter option (remember that mpirun interprets the first
      unrecognized command line token as the executable).

Node:       sokar
Executable: rhoCentralFoam
--------------------------------------------------------------------------
16 total processes failed to start
Regarding preserving the cwd across ssh, I had tried to do that using SendEnv and AcceptEnv (per https://superuser.com/questions/4685...sh/46888#46888), but it didn't seem to work. I wasn't sure what the proper way of doing it is.

April 4, 2017, 10:19   #8
alexeym (Alexey Matveichev), Senior Member, Nancy, France, Join Date: Aug 2011, Posts: 1,930
Hi,

Code:
mpirun was unable to find the specified executable file
The error is rather self-explanatory, no? Since mpirun cannot find rhoCentralFoam, I guess there is a problem with the environment.

Since AcceptEnv (https://linux.die.net/man/5/sshd_config) and SendEnv (https://linux.die.net/man/5/ssh_config) both accept wildcards, you can use "SendEnv *" and "AcceptEnv *" to transfer the whole environment. You can use "SendEnv FOAM_* PATH PWD LD_LIBRARY_PATH" (and the corresponding AcceptEnv) if you would like to limit the number of variables transferred. You have to modify sshd_config on the slave and ssh_config on the master, then restart sshd on the slave.
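
A sketch of those changes, assuming the stock OpenSSH config locations (the service name for the restart may be ssh or sshd depending on the distribution):

Code:
# On the master, in /etc/ssh/ssh_config (or per-user ~/.ssh/config):
SendEnv FOAM_* PATH PWD LD_LIBRARY_PATH

# On the slave, in /etc/ssh/sshd_config:
AcceptEnv FOAM_* PATH PWD LD_LIBRARY_PATH

# Then restart sshd on the slave, e.g.:
#   sudo systemctl restart ssh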

April 5, 2017, 00:29   #9
MikeC (Mike), New Member, Join Date: Dec 2016, Posts: 14
Yep, that fixed the problem. Thanks for your help!

August 21, 2018, 08:17   #10
mpirun was unable to find the executable
Rishab (Rishab.G.Hombal), New Member, Join Date: Aug 2018, Posts: 20
Hi


I'm new to OpenFOAM. We have a cluster with one master node and a client node, with passwordless ssh enabled, running OpenFOAM v6 on both machines and with the case directories present on both nodes. I get the following error when I try to run the case with foamJob:


Code:
mpirun was unable to find the specified executable file, and therefore
did not launch the job. This error was first reported for process
rank 3; it may have occurred for other processes as well.

NOTE: A common cause for this error is misspelling a mpirun command
      line parameter option (remember that mpirun interprets the first
      unrecognized command line token as the executable).

Node:       client1
Executable: /opt/openfoam6/bin/foamJob


So what exactly did you do to set up the environment properly?

Regards,
Rishab

June 30, 2022, 14:59   #11
MateusLatt (Mateus Grassano Lattari), New Member, Join Date: Apr 2018, Posts: 16
Hi, I did exactly this, but the error remains. I also tried with the full paths:

"mpirun -np 24 -hostfile hosts --use-hwthread-cpus -wdir `pwd` -x LD_LIBRARY_PATH=/usr/lib/openfoam/openfoam2112/platforms/linux64GccDPInt32Opt/bin:/usr/lib/openfoam/openfoam2112/platforms/linux64GccDPInt32Opt/lib:/usr/lib/openfoam/openfoam2112/platforms/linux64GccDPInt32Opt/lib/sys-openmpi -x PATH=/usr/lib/x86_64-linux-gnu/openmpi/bin:/home/matt/OpenFOAM/matt-v2112/platforms/linux64GccDPInt32Opt/bin:/usr/lib/openfoam/openfoam2112/site/2112/platforms/linux64GccDPInt32Opt/bin:/usr/lib/openfoam/openfoam2112/platforms/linux64GccDPInt32Opt/bin:/usr/lib/openfoam/openfoam2112/bin:/usr/lib/openfoam/openfoam2112/wmake:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin -x FOAM_RUN=/home/openfoam/rotatingSphere -x HOME=/home/openfoam -x FOAM_SIGFPE=1 -x FOAM_ETC=/usr/lib/openfoam/openfoam2112/etc pimpleFoam"


But when I try this, the controlDict is not found.
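
For what it is worth, a simpler variant I may try (a sketch, assuming the case sits at the same shared path on every node and the OpenFOAM environment is sourced in each node's ~/.bashrc, e.g. via /usr/lib/openfoam/openfoam2112/etc/bashrc, so the -x exports above are not needed; the 'hosts' file and core count are the same as above):

Code:
# run from the case directory, on a path that exists on all nodes;
# -parallel is required for a decomposed case
mpirun -np 24 -hostfile hosts --use-hwthread-cpus \
    pimpleFoam -parallel -case "$(pwd)" > log 2>&1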

Quote:
Originally Posted by alexeym View Post
Since AcceptEnv and SendEnv both accept wildcards, you can use "SendEnv *" and "AcceptEnv *" to transfer the whole environment, or "SendEnv FOAM_* PATH PWD LD_LIBRARY_PATH" (and the corresponding AcceptEnv) to limit the number of variables transferred. You have to modify sshd_config on the slave and ssh_config on the master, then restart sshd on the slave.