OpenFOAM-5 parallel problem on hpc

August 24, 2019, 04:04   #1
OpenFOAM-5 parallel problem on hpc
Xusong (littlebro), New Member
Hi guys,
I get this problem when I run OpenFOAM-5 in parallel on an HPC cluster.
This is my script:
Quote:
#BSUB -J simpleFoamtest-parallel
#BSUB -n 10
#BSUB -q charge_normal
#BSUB -o %J.out
#BSUB -e %J.err

module load intel/2019.1.144 impi/2019.1.144 mkl/2019.1.144 openfoam/5
source /share/apps/openfoam/OpenFOAM-5/etc/bashrc

cd $FOAM_RUN/mypitzDaily
blockMesh
mpirun simpleFoam -parallel
But something went wrong in my err file:
Quote:
Abort(1094543) on node 5 (rank 5 in comm 0): Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPIR_Init_thread(639)......:
MPID_Init(860).............:
MPIDI_NM_mpi_init_hook(689): OFI addrinfo() failed (ofi_init.h:689:MPIDI_NM_mpi_init_hook:No data available)
[cli_5]: readline failed
Attempting to use an MPI routine before initializing MPICH
However, when I remove -parallel, there is no error in the err file, but multiple identical time steps appear in the out file, like this:
Quote:
smoothSolver: Solving for Ux, Initial residual = 1, Final residual = 0.0538101, No Iterations 1
smoothSolver: Solving for Ux, Initial residual = 1, Final residual = 0.0538101, No Iterations 1
smoothSolver: Solving for Uy, Initial residual = 1, Final residual = 0.030925, No Iterations 2
smoothSolver: Solving for Uy, Initial residual = 1, Final residual = 0.030925, No Iterations 2
smoothSolver: Solving for Ux, Initial residual = 1, Final residual = 0.0538101, No Iterations 1
smoothSolver: Solving for Uy, Initial residual = 1, Final residual = 0.030925, No Iterations 2
smoothSolver: Solving for Ux, Initial residual = 1, Final residual = 0.0538101, No Iterations 1
smoothSolver: Solving for Ux, Initial residual = 1, Final residual = 0.0538101, No Iterations 1
smoothSolver: Solving for Ux, Initial residual = 1, Final residual = 0.0538101, No Iterations 1
smoothSolver: Solving for Ux, Initial residual = 1, Final residual = 0.0538101, No Iterations 1
smoothSolver: Solving for Ux, Initial residual = 1, Final residual = 0.0538101, No Iterations 1
smoothSolver: Solving for Ux, Initial residual = 1, Final residual = 0.0538101, No Iterations 1
smoothSolver: Solving for Uy, Initial residual = 1, Final residual = 0.030925, No Iterations 2
smoothSolver: Solving for Uy, Initial residual = 1, Final residual = 0.030925, No Iterations 2
smoothSolver: Solving for Uy, Initial residual = 1, Final residual = 0.030925, No Iterations 2
smoothSolver: Solving for Uy, Initial residual = 1, Final residual = 0.030925, No Iterations 2
smoothSolver: Solving for Uy, Initial residual = 1, Final residual = 0.030925, No Iterations 2
smoothSolver: Solving for Uy, Initial residual = 1, Final residual = 0.030925, No Iterations 2
smoothSolver: Solving for Ux, Initial residual = 1, Final residual = 0.0538101, No Iterations 1
smoothSolver: Solving for Uy, Initial residual = 1, Final residual = 0.030925, No Iterations 2
GAMG: Solving for p, Initial residual = 1, Final residual = 0.068427, No Iterations 17
time step continuity errors : sum local = 1.19733, global = 0.179883, cumulative = 0.179883
GAMG: Solving for p, Initial residual = 1, Final residual = 0.068427, No Iterations 17
time step continuity errors : sum local = 1.19733, global = 0.179883, cumulative = 0.179883
GAMG: Solving for p, Initial residual = 1, Final residual = 0.068427, No Iterations 17
time step continuity errors : sum local = 1.19733, global = 0.179883, cumulative = 0.179883
GAMG: Solving for p, Initial residual = 1, Final residual = 0.068427, No Iterations 17
GAMG: Solving for p, Initial residual = 1, Final residual = 0.068427, No Iterations 17
time step continuity errors : sum local = 1.19733, global = 0.179883, cumulative = 0.179883
GAMG: Solving for p, Initial residual = 1, Final residual = 0.068427, No Iterations 17
time step continuity errors : sum local = 1.19733, global = 0.179883, cumulative = 0.179883
GAMG: Solving for p, Initial residual = 1, Final residual = 0.068427, No Iterations 17
time step continuity errors : sum local = 1.19733, global = 0.179883, cumulative = 0.179883
time step continuity errors : sum local = 1.19733, global = 0.179883, cumulative = 0.179883
GAMG: Solving for p, Initial residual = 1, Final residual = 0.068427, No Iterations 17
time step continuity errors : sum local = 1.19733, global = 0.179883, cumulative = 0.179883
GAMG: Solving for p, Initial residual = 1, Final residual = 0.068427, No Iterations 17
smoothSolver: Solving for epsilon, Initial residual = 0.23288, Final residual = 0.0114612, No Iterations 3
bounding epsilon, min: -1.98669 max: 1033.28 average: 35.8884
time step continuity errors : sum local = 1.19733, global = 0.179883, cumulative = 0.179883
smoothSolver: Solving for epsilon, Initial residual = 0.23288, Final residual = 0.0114612, No Iterations 3
bounding epsilon, min: -1.98669 max: 1033.28 average: 35.8884
GAMG: Solving for p, Initial residual = 1, Final residual = 0.068427, No Iterations 17
time step continuity errors : sum local = 1.19733, global = 0.179883, cumulative = 0.179883
smoothSolver: Solving for epsilon, Initial residual = 0.23288, Final residual = 0.0114612, No Iterations 3
bounding epsilon, min: -1.98669 max: 1033.28 average: 35.8884
smoothSolver: Solving for epsilon, Initial residual = 0.23288, Final residual = 0.0114612, No Iterations 3
bounding epsilon, min: -1.98669 max: 1033.28 average: 35.8884
smoothSolver: Solving for epsilon, Initial residual = 0.23288, Final residual = 0.0114612, No Iterations 3
bounding epsilon, min: -1.98669 max: 1033.28 average: 35.8884
smoothSolver: Solving for epsilon, Initial residual = 0.23288, Final residual = 0.0114612, No Iterations 3
smoothSolver: Solving for k, Initial residual = 1, Final residual = 0.0454552, No Iterations 3
bounding epsilon, min: -1.98669 max: 1033.28 average: 35.8884
ExecutionTime = 0.3 s ClockTime = 1 s

smoothSolver: Solving for epsilon, Initial residual = 0.23288, Final residual = 0.0114612, No Iterations 3
bounding epsilon, min: -1.98669 max: 1033.28 average: 35.8884
Time = 2

smoothSolver: Solving for k, Initial residual = 1, Final residual = 0.0454552, No Iterations 3
ExecutionTime = 0.29 s ClockTime = 1 s

Time = 2

smoothSolver: Solving for k, Initial residual = 1, Final residual = 0.0454552, No Iterations 3
smoothSolver: Solving for epsilon, Initial residual = 0.23288, Final residual = 0.0114612, No Iterations 3
ExecutionTime = 0.3 s ClockTime = 1 s

bounding epsilon, min: -1.98669 max: 1033.28 average: 35.8884
Time = 2

smoothSolver: Solving for epsilon, Initial residual = 0.23288, Final residual = 0.0114612, No Iterations 3
bounding epsilon, min: -1.98669 max: 1033.28 average: 35.8884
smoothSolver: Solving for epsilon, Initial residual = 0.23288, Final residual = 0.0114612, No Iterations 3
bounding epsilon, min: -1.98669 max: 1033.28 average: 35.8884
smoothSolver: Solving for k, Initial residual = 1, Final residual = 0.0454552, No Iterations 3
ExecutionTime = 0.31 s ClockTime = 1 s

smoothSolver: Solving for k, Initial residual = 1, Final residual = 0.0454552, No Iterations 3
smoothSolver: Solving for k, Initial residual = 1, Final residual = 0.0454552, No Iterations 3
ExecutionTime = 0.33 s ClockTime = 1 s

ExecutionTime = 0.3 s ClockTime = 1 s

Time = 2

Time = 2

Time = 2

smoothSolver: Solving for k, Initial residual = 1, Final residual = 0.0454552, No Iterations 3
ExecutionTime = 0.31 s ClockTime = 1 s

Time = 2

smoothSolver: Solving for k, Initial residual = 1, Final residual = 0.0454552, No Iterations 3
ExecutionTime = 0.31 s ClockTime = 1 s

Time = 2

smoothSolver: Solving for k, Initial residual = 1, Final residual = 0.0454552, No Iterations 3
ExecutionTime = 0.32 s ClockTime = 1 s

smoothSolver: Solving for k, Initial residual = 1, Final residual = 0.0454552, No Iterations 3
Time = 2

ExecutionTime = 0.31 s ClockTime = 1 s

Time = 2
The output looks messy and the case is not actually running in parallel.
Why do I get an error when I add -parallel to the command line?
Quote:
mpirun simpleFoam -parallel
I use the LSF job management system on the HPC cluster.
Best wishes, and thanks.

August 25, 2019, 08:28   #2
Joachim Herb (jherb), Senior Member
This looks like you are not starting the solver in parallel, but rather launching n separate single-process solvers.

So make sure you start it correctly in parallel, using the correct options for mpirun (and of course add the -parallel option to the solver).

First, test your job script by calling something like hostname from mpirun, to check that this part works.
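
A minimal sketch of such a test under LSF (queue and module names are copied from the job script above; assuming Intel MPI's LSF integration lets mpirun pick up the allocated ranks without extra flags, which may differ on your site):

Code:
#BSUB -J mpitest
#BSUB -n 10
#BSUB -q charge_normal
#BSUB -o %J.out
#BSUB -e %J.err

module load intel/2019.1.144 impi/2019.1.144

# Each MPI rank should print its host name; 10 lines in the out file
# mean mpirun really launched 10 ranks under LSF.
mpirun hostname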

August 25, 2019, 23:05   #3
Xusong (littlebro), New Member
Quote:
Originally Posted by jherb
This looks like you are not starting the solver in parallel, but rather launching n separate single-process solvers.

So make sure you start it correctly in parallel, using the correct options for mpirun (and of course add the -parallel option to the solver).

First, test your job script by calling something like hostname from mpirun, to check that this part works.
Thank you, I solved my problem. I did not need to add 'mpirun -parallel'; I can run simpleFoam directly and it calculates in parallel.

August 28, 2019, 00:50   #4
Aashay Tinaikar (ARTisticCFD), New Member, Boston
Hi Xusong,

Welcome to OpenFOAM. It is a great toolbox for CFD.

I remember the first time I started a parallel run; I also missed the -parallel flag when executing the solver. The -parallel flag tells OpenFOAM to run the solver as one distributed job across the decomposed domains rather than as a standalone serial run. If you omit that flag, mpirun just launches a separate serial copy of the solver on each rank, each solving the whole case. That is why you see multiple output lines for the same time step and variable.

Regarding your error on the HPC system, I have no experience with a #BSUB-based (LSF) system. However, I have been using OpenFOAM on a SLURM-based HPC cluster, and it works perfectly fine. From the commands you executed and the output in the error file, it seems that you have not loaded the MPI module on your HPC system (I need to load it before running an MPI command on mine).

Without the -parallel flag, everything just runs in serial, hence no errors. Let me know if this helps.
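
A minimal sketch of the two invocations (with 4 as an example core count):

Code:
# without -parallel: mpirun launches 4 independent serial copies of the
# solver, which is what produced the duplicated log lines above
mpirun -np 4 simpleFoam

# with -parallel: one solver run distributed over 4 decomposed sub-domains
decomposePar
mpirun -np 4 simpleFoam -parallel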

Good luck :-)

August 28, 2019, 02:04   #5
Xusong (littlebro), New Member
Yes, you are right, I was wrong before. But I have loaded the MPI module, like this:
Quote:
module load intel/2019.1.144 impi/2019.1.144 mkl/2019.1.144 openfoam/5
Should impi (Intel MPI) also work for OpenFOAM?
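
One way to check which MPI the installed OpenFOAM expects (a sketch; these environment variables are set by the OpenFOAM bashrc):

Code:
source /share/apps/openfoam/OpenFOAM-5/etc/bashrc
echo $WM_MPLIB    # MPI flavour OpenFOAM was configured for, e.g. SYSTEMOPENMPI or INTELMPI
echo $FOAM_MPI    # name of the MPI the Pstream library was built against
which mpirun      # the mpirun that will actually be picked up at run time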

August 28, 2019, 02:35   #6
Gerry Kan, Senior Member
Hi Xusong:

I am not very familiar with LSF (I have used qsub and SLURM), but I think other members of this forum are already giving you suggestions.

However, when I looked at your submission script, I noticed your run process is not quite complete. I don't know whether you have solved this in the meantime, but here is a complete workflow for a parallel run. I am using the pitzDaily tutorial as an example, since you are using it as your test case:

Mesh and solve:

blockMesh
decomposePar -force [1][2]
mpirun -np X simpleFoam -parallel [1]

Post-processing:

reconstructPar
foamToVTK

Note [1] - The number of processors is defined in system/decomposeParDict. Replace X with the number of cores used (see the job-script sketch below).
Note [2] - The -force switch overwrites existing decomposed domains from previous runs.
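
A sketch of how this workflow might look in an LSF job script, reusing the queue and module names from the earlier posts. The decomposeParDict written here (scotch decomposition into 4 sub-domains) is only an illustration, not the actual case file:

Code:
#BSUB -J pitzDaily-parallel
#BSUB -n 4
#BSUB -q charge_normal
#BSUB -o %J.out
#BSUB -e %J.err

module load intel/2019.1.144 impi/2019.1.144 mkl/2019.1.144 openfoam/5
source /share/apps/openfoam/OpenFOAM-5/etc/bashrc

cd $FOAM_RUN/mypitzDaily

# numberOfSubdomains must match the core count requested with #BSUB -n
cat > system/decomposeParDict <<'EOF'
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains 4;

method          scotch;
EOF

blockMesh
decomposePar -force
mpirun -np 4 simpleFoam -parallel
reconstructPar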

Alternatively, if you look at the pitzDaily case directory, you will also see some additional commands like Allpre, Allrun, and Allrun-parallel. You can look into them and discover some additional scripts provided by OpenFOAM that will make your tool chain a bit less tedious. I personally don't use them, however.

Hope that helps, Gerry.

August 28, 2019, 02:48   #7
Xusong (littlebro), New Member
Thank you, Gerry Kan. I tested your script:
Quote:
#BSUB -J simpleFoamtest-parallel
#BSUB -n 4
#BSUB -q charge_normal
#BSUB -o %J.out
#BSUB -e %J.err

module load intel/2019.1.144 mkl/2019.1.144 openfoam/5
source /share/apps/openfoam/OpenFOAM-5/etc/bashrc

cd $FOAM_RUN/mypitzDaily
blockMesh
decomposePar -force
mpirun -n 4 simpleFoam -parallel
reconstructPar
Also, I used mpich-3.3 instead of impi/2019.1.144, and then this error occurred:
Quote:
--> FOAM FATAL ERROR:
Trying to use the dummy Pstream library.
This dummy library cannot be used in parallel mode

From function static bool Foam::UPstream::init(int &, char **&)
in file UPstream.C at line 37.

FOAM exiting



--> FOAM FATAL ERROR:
Trying to use the dummy Pstream library.
This dummy library cannot be used in parallel mode

From function static bool Foam::UPstream::init(int &, char **&)
in file UPstream.C at line 37.

FOAM exiting


--> FOAM FATAL ERROR:
Trying to use the dummy Pstream library.
This dummy library cannot be used in parallel mode

From function static bool Foam::UPstream::init(int &, char **&)
in file UPstream.C at line 37.

FOAM exiting



--> FOAM FATAL ERROR:
Trying to use the dummy Pstream library.
This dummy library cannot be used in parallel mode

From function static bool Foam::UPstream::init(int &, char **&)
in file UPstream.C at line 37.

FOAM exiting
I don't know how to fix this.
Thanks, everyone.

August 28, 2019, 04:23   #8
Gerry Kan, Senior Member
Hi Xusong:

It looks like you did not build OpenFOAM correctly. Pstream is the part of OpenFOAM that handles parallel communication. Make sure you pick up the MPI libraries (i.e., module load ...) when building and running.

Also be aware of the nuances between different compilers and MPI libraries.
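
A quick way to see whether an MPI-enabled Pstream was actually built for the loaded MPI, and to rebuild it if not (a sketch; the paths follow the standard OpenFOAM layout):

Code:
source /share/apps/openfoam/OpenFOAM-5/etc/bashrc
echo $WM_MPLIB                              # should name the MPI you intend to run with
ls $FOAM_LIBBIN/$FOAM_MPI/libPstream.so     # the real (non-dummy) Pstream build

# if the MPI-specific libPstream.so is missing, rebuild Pstream with the
# right modules loaded
cd $WM_PROJECT_DIR/src/Pstream
./Allwmake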

Gerry.

August 28, 2019, 11:47   #9
Aashay Tinaikar (ARTisticCFD), New Member, Boston
Quote:
Originally Posted by littlebro
Yes, you are right, I was wrong before. But I have loaded the MPI module, like this:

Should impi (Intel MPI) also work for OpenFOAM?

I am not sure whether impi works for OpenFOAM; maybe some senior members can answer that better. I have always used OpenMPI for parallel runs and it has worked perfectly every time.

Also, I think OpenMPI is the default for OpenFOAM (I am not sure about the latest versions). Can you try to load an OpenMPI module in place of impi? You can ask the HPC help desk about that.
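
A sketch of how that swap might look (the module names below are placeholders; the cluster's own names will differ, so check with module avail or the help desk):

Code:
# list what MPI modules the cluster actually provides
module avail 2>&1 | grep -i mpi

# then, as a placeholder example only:
module unload impi/2019.1.144
module load openmpi/3.0.0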

Note: sometimes I used to get warnings for parallel runs on the HPC cluster which I never got on my local PC. I asked my HPC help desk about this and was told something along the following lines: "This error seems strange. It looks like the MPI libraries are not properly compiled. It may be that you installed OpenFOAM from one particular cluster node and the MPI libraries were compiled for that specific node type only; if you then run the job on another node type, those libraries give warnings."



So maybe your help desk could also help you with some of this.

August 28, 2019, 17:39   #10
Gerry Kan, Senior Member
Dear Xusong:

If you haven't read it already, perhaps the following link would help you rebuild OpenFOAM with the Intel compiler and MPI library.

https://linuxcluster.wordpress.com/2...ith-intel-mpi/

I am interested in seeing how it went. I have so far worked with g++ and OpenMPI, and have had very little success with MPICH.

Gerry.

August 30, 2019, 09:23   #11
Xusong (littlebro), New Member
Hi Gerry Kan,
I think the biggest problem is OpenMPI. I will install OpenMPI and test it.
Thanks.

August 30, 2019, 09:32   #12
Xusong (littlebro), New Member
Thanks, Aashay Tinaikar.
Yes, I think you are right, and I will ask my help desk to install OpenMPI.
If it works, I will post the result.

August 31, 2019, 04:12   #13
Xusong (littlebro), New Member
Quote:
Originally Posted by Gerry Kan
Dear Xusong:

If you haven't read it already, perhaps the following link would help you rebuild OpenFOAM with the Intel compiler and MPI library.

https://linuxcluster.wordpress.com/2...ith-intel-mpi/

I am interested in seeing how it went. I have so far worked with g++ and OpenMPI, and have had very little success with MPICH.

Gerry.
Sorry to bother you again.
I used it, and the HPC now has OpenFOAM-5.0 installed.
First I load the environment modules. When I do not load
Quote:
impi/2019.1.144
and I run which mpirun, it displays
Quote:
/usr/mpi/gcc/openmpi-3.0.0rc6/bin/mpirun
then the err file shows
Quote:
blockMesh: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
decomposePar: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
simpleFoam: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
So I have to load impi, and which mpirun then displays
Quote:
/share/intel/2019u1/compilers_and_libraries_2019.1.144/linux/mpi/intel64/bin/mpirun
My script is
Quote:
#BSUB -J simpleFoamtest-parallel
#BSUB -n 10
#BSUB -q charge_normal
#BSUB -o %J.out
#BSUB -e %J.err

cd $FOAM_RUN/pitzDaily

blockMesh
decomposePar
mpirun simpleFoam -parallel
When I submit it, the err file shows
Quote:
Trying to use the dummy Pstream library.
This dummy library cannot be used in parallel mode

From function static bool Foam::UPstream::init(int &, char **&)
in file UPstream.C at line

--> FOAM FATAL ERROR:
Trying to use the dummy Pstream library.
This dummy library cannot be used in parallel mode

From function static bool Foam::UPstream::init(int &, char **&)
in file UPstream.C at line 37.

FOAM exiting
Then I found the same problem described here:
https://www.cfd-online.com/Forums/openfoam-installation/71741-pstream-library-error-parallel-mode.html
I followed the suggested option:
Quote:
Change the $WM_PROJECT_DIR/etc/bashrc file:

Code:
export WM_MPLIB=SYSTEMOPENMPI
then change $WM_PROJECT_DIR/etc/config/settings.sh:

Code:
export FOAM_MPI=openmpi-system
Then sourcing the etc/bashrc file again and recompiling Pstream solved my problem:

Code:
cd $FOAM_SRC
cd Pstream
./Allwmake
I tried again, and it displays
Quote:
blockMesh: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
decomposePar: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
simpleFoam: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
simpleFoam: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
simpleFoam: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
simpleFoam: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
simpleFoam: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
simpleFoam: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

Process name: [[54797,1],3]
Exit code: 127
--------------------------------------------------------------------------
So I changed it back and submitted the same parallel case.
Now it displays
Quote:
Abort(1094543) on node 3 (rank 3 in comm 0): Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPIR_Init_thread(639)......:
MPID_Init(860).............:
MPIDI_NM_mpi_init_hook(689): OFI addrinfo() failed (ofi_init.h:689:MPIDI_NM_mpi_init_hook:No data available)
Abort(1094543) on node 4 (rank 4 in comm 0): Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPIR_Init_thread(639)......:
MPID_Init(860).............:
MPIDI_NM_mpi_init_hook(689): OFI addrinfo() failed (ofi_init.h:689:MPIDI_NM_mpi_init_hook:No data available)
[cli_4]: readline failed
Attempting to use an MPI routine before initializing MPICH
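
A diagnostic sketch for this situation (not a fix): check which MPI and libfabric libraries the OpenFOAM binaries actually resolve at run time, since Intel MPI 2019 normally ships its own libfabric and only puts it on LD_LIBRARY_PATH when the impi module is loaded in the job:

Code:
# which MPI / fabric libraries does the solver link against?
ldd $(which simpleFoam) | grep -i -e mpi -e fabric

# is a libfabric directory on the runtime library path inside the job?
echo $LD_LIBRARY_PATH | tr ':' '\n' | grep -i -e fabric -e mpi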

September 4, 2019, 23:39   #14
Aashay Tinaikar (ARTisticCFD), New Member, Boston
Hi Xusong,

Your problem might be more complicated, but I just noticed that you don't specify the -np flag in your mpirun command.

For a 4-core parallel run, the execution command I usually use is

"mpirun -np 4 simpleFoam -parallel"

I tried running it without the -np flag and it threw an error, so maybe you want to change that.

The number of cores must equal the number of decomposed sub-domains.
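
A small sketch for keeping the rank count consistent with the decomposition (run from the case directory, after decomposePar):

Code:
# the number of MPI ranks has to match the number of processor* directories
# that decomposePar created
NPROCS=$(ls -d processor* | wc -l)
mpirun -np $NPROCS simpleFoam -parallel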

September 5, 2019, 09:30   #15
Xusong (littlebro), New Member
Quote:
Originally Posted by ARTisticCFD
Hi Xusong,

Your problem might be more complicated, but I just noticed that you don't specify the -np flag in your mpirun command.

For a 4-core parallel run, the execution command I usually use is

"mpirun -np 4 simpleFoam -parallel"

I tried running it without the -np flag and it threw an error, so maybe you want to change that.

The number of cores must equal the number of decomposed sub-domains.
Dear Aashay,
I tried adding -np 4 to the command, but it failed. The err file looks like this:
Quote:
--> FOAM FATAL ERROR:
Trying to use the dummy Pstream library.
This dummy library cannot be used in parallel mode

From function static bool Foam::UPstream::init(int &, char **&, bool)
in file UPstream.C at line 37.

FOAM exiting

September 5, 2019, 10:20   #16
Aashay Tinaikar (ARTisticCFD), New Member, Boston
Quote:
Originally Posted by littlebro
Dear Aashay,
I tried adding -np 4 to the command, but it failed. The err file looks like this:
I think your installation was not complete. I have never faced this issue myself, so I have no personal experience with it, but I googled it and found this thread that might have useful information:
Problems running in parallel - Pstream not available

Wyldckat (Bruno) is the best at helping people out; you might want to look into his comments if you haven't already.

Good luck, and try to post the solution here once you figure it out. I am interested to know what causes this error.
