CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   MPI run run fails of quadcore openfoam2.4 (https://www.cfd-online.com/Forums/openfoam-solving/165752-mpi-run-run-fails-quadcore-openfoam2-4-a.html)

Priya Somasundaran January 25, 2016 06:33

MPI run fails on quad-core OpenFOAM 2.4
 
I am trying to run in parallel with mpirun. decomposePar worked fine beforehand:

/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.4.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build : 2.4.0-f0842aea0e77
Exec : decomposePar
Date : Jan 25 2016
Time : 19:28:34
Host : "GEOSCIENCE-PC"
PID : 20725
Case : /home/priya/OpenFOAM/priya-2.4.0/run/simulation
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time



Decomposing mesh region0

Create mesh

Calculating distribution of cells
Selecting decompositionMethod scotch

Finished decomposition in 19.3 s

Calculating original mesh data

Distributing cells to processors

Distributing faces to processors

Distributing points to processors

Constructing processor meshes

Processor 0
Number of cells = 1532953
Number of faces shared with processor 1 = 6519
Number of faces shared with processor 2 = 1733
Number of faces shared with processor 3 = 4880
Number of processor patches = 3
Number of processor faces = 13132
Number of boundary faces = 330832

Processor 1
Number of cells = 1540649
Number of faces shared with processor 0 = 6519
Number of faces shared with processor 2 = 4783
Number of faces shared with processor 3 = 709
Number of processor patches = 3
Number of processor faces = 12011
Number of boundary faces = 333490

Processor 2
Number of cells = 1653908
Number of faces shared with processor 0 = 1733
Number of faces shared with processor 1 = 4783
Number of faces shared with processor 3 = 5770
Number of processor patches = 3
Number of processor faces = 12286
Number of boundary faces = 353823

Processor 3
Number of cells = 1604279
Number of faces shared with processor 0 = 4880
Number of faces shared with processor 1 = 709
Number of faces shared with processor 2 = 5770
Number of processor patches = 3
Number of processor faces = 11359
Number of boundary faces = 348006

Number of processor faces = 24394
Max number of cells = 1653908 (4.48282% above average 1.58295e+06)
Max number of processor patches = 3 (0% above average 3)
Max number of faces between processors = 13132 (7.66582% above average 12197)

Time = 0

Processor 0: field transfer
Processor 1: field transfer
Processor 2: field transfer
Processor 3: field transfer

End.
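
For reference, the decomposition above should leave one processorN directory per subdomain in the case; a quick way to confirm that the count matches what will later be passed to mpirun is sketched below (the path is taken from the log header, the rest is assumed):

# Each subdomain written by decomposePar becomes a processorN directory;
# the number of these directories must equal the -np value given to mpirun.
cd ~/OpenFOAM/priya-2.4.0/run/simulation
ls -d processor*            # expected here: processor0 ... processor3
ls -d processor* | wc -l    # expected here: 4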

priya@GEOSCIENCE-PC:~/OpenFOAM/priya-2.4.0/run/simulation$ mpirun –np <nProcs> buoyantBoussinesqSimpleFoam -parallel >& log &
[1] 20734
bash: nProcs: No such file or directory
priya@GEOSCIENCE-PC:~/OpenFOAM/priya-2.4.0/run/simulation$ mpirun –np 4 buoyantBoussinesqSimpleFoam -parallel >& log &
[2] 20735
[1] Exit 1 mpirun –np -parallel < nProcs > buoyantBoussinesqSimpleFoam &> log


Nothing seems to happen. I am using OpenFOAM 2.4.0.

Any help will be appreciated
Thanks

akidess January 25, 2016 07:58

Your first command is wrong. The second command might have worked, but you need to post the contents of the log file.
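
For what it's worth, the first error is the shell's doing, not mpirun's: <nProcs> is a documentation placeholder, and bash parses "< nProcs >" as I/O redirection. A minimal sketch of the same failure (any command will do):

# bash reads "< nProcs" as input redirection from a file literally named
# nProcs, so the command fails before mpirun is ever launched:
echo test < nProcs      # bash: nProcs: No such file or directory
# The placeholder must be replaced by the actual number of subdomains.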

Priya Somasundaran January 25, 2016 08:39

log file
 
Thanks for the quick reply.

Sorry, I forgot to post the log file.

###### my command:

mpirun –np 4 buoyantBoussinesqSimpleFoam -parallel >& log &


###### the log file

--------------------------------------------------------------------------
mpirun was unable to launch the specified application as it could not find an executable:

Executable: –np
Node: GEOSCIENCE-PC

while attempting to start process rank 0.
--------------------------------------------------------------------------

I also tried to check whether there is more than one MPI installation, using the command:

update-alternatives --config mpirun
There is only one alternative in link group mpirun (providing /usr/bin/mpirun): /usr/bin/mpirun.openmpi
Nothing to configure.

Any clues?
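
For completeness, a couple of quick checks on the MPI installation itself (a sketch; the environment-variable names assume the OpenFOAM-2.4.0 bashrc has been sourced):

# Which mpirun is actually on the PATH, and which MPI flavour it is
which mpirun
mpirun --version             # Open MPI prints "mpirun (Open MPI) x.y.z"

# The MPI that OpenFOAM itself was configured against (assumed variable
# names from a sourced OpenFOAM environment)
echo $WM_MPLIB $FOAM_MPI     # e.g. SYSTEMOPENMPI openmpi-system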

akidess January 25, 2016 08:50

I bet you copied and pasted that command. Type it in properly and all will be good (i.e. I believe the hyphen is not what you think it is).
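
That matches the log, where mpirun reports "Executable: –np", i.e. it never recognised the option at all. A sketch for making the stray character visible and retyping the command with a plain ASCII hyphen-minus (the process count of 4 is taken from the decomposePar output above):

# An en dash (U+2013) encodes as e2 80 93 in UTF-8, so mpirun treats "–np"
# as the name of the application rather than as the -np option:
printf '%s' '–np' | od -An -tx1    # prints: e2 80 93 6e 70

# Retyped with a real hyphen-minus (0x2d) and an explicit process count:
mpirun -np 4 buoyantBoussinesqSimpleFoam -parallel > log 2>&1 &
tail -f log                        # follow the solver output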

