Error when using mpirun for parallel case
I have been working through the depthCharge2D tutorial on Win8 using Cygwin. I have run decomposePar, and then run the following to start the solve process...
Code:
mpirun -np 4 compressibleInterFoam -parallel > log.compress

...which fails with:

Code:
--> FOAM FATAL ERROR:

I also tried compiling the MPI hello-world example:

Code:
mpicc -o hello_c hello_c.c

Thoughts?
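For completeness, the whole smoke test would look something like this (a sketch; it assumes Open MPI's mpicc and mpirun are on the PATH):

Code:
# Write, build, and run the classic MPI hello-world.
cat > hello_c.c <<'EOF'
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);                 /* start MPI */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's id */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total process count */
    printf("Hello from rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
EOF
mpicc -o hello_c hello_c.c
mpirun -np 4 ./hello_c    # should print one line per rank

If all four ranks print, basic MPI launching works and the problem is specific to the OpenFOAM setup. |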
Just to make sure everything went well before the simulation was supposed to start, please check the following:

Did the decomposition succeed, i.e. do you have folders processor0 to processor3 in your case directory afterwards? If not, please check system/decomposeParDict for the correct decomposition settings.

Otherwise: do you have four processors available at all, and does parallel execution of jobs work on your machine in general? Please simply check with one of the parallel tutorials, e.g. the incompressible/simpleFoam/motorBike case, using nothing but the Allrun script. That way we can see whether your machine is set up correctly. It should take no more than ten minutes and will rule out at least one potential reason for the failure.

These might seem like silly questions, but sometimes one misses the simplest things...

Cheers, Bernhard
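A quick way to run those checks from the case directory (a sketch; the tutorial path assumes a stock OpenFOAM 2.3.x tree):

Code:
# 1. Did decomposePar create one directory per subdomain?
ls -d processor*                                  # expect processor0 .. processor3
grep numberOfSubdomains system/decomposeParDict   # should match the -np count
# 2. Does a stock parallel tutorial run at all?
cp -r $FOAM_TUTORIALS/incompressible/simpleFoam/motorBike ~/motorBike
cd ~/motorBike && ./Allrun                        # then inspect the log.* files it writes

If the motorBike case runs in parallel, MPI itself is fine. |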
Thank you for your suggestions... I'll get to those either Monday or Tuesday.

In some of my other troubleshooting, I realized wmake wasn't working. Would this make a difference? I added a line sourcing etc/bashrc to my home .bashrc, which I also realized I needed. When I did this, the wmake command worked, but I lost access to all the solvers!? So, I'll need to add the solvers' path to my .bashrc as well.
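For what it's worth, the two lines in question would look something like this in ~/.bashrc (a sketch; the paths are the ones quoted later in this thread and depend on the install location):

Code:
# Load the OpenFOAM environment (sets FOAM_TUTORIALS, FOAM_APPBIN, etc.)
source /opt/OpenFOAM/OpenFOAM-2.3.x/etc/bashrc
# Make sure the solver binaries are found (normally the bashrc above does this itself)
export PATH=/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/cygwin64mingw-w64DPOpt/bin:$PATH

Whether the explicit PATH line is needed should depend on what the sourced bashrc already sets. |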
Ok, I copied the motorbike tutorial to my home directory then ran Allrun. Here is what was displayed in the terminal window:
Code:
cp: cannot stat ‘/resources/geometry/motorBike.obj.gz’: No such file or directory

Note: when I run "which icoFoam" the path given is /opt/OpenFOAM/OpenFOAM-2.3.x/platforms/cygwin64mingw-w64DPOpt/bin/icoFoam

Can you give me some direction? Thanks! |
Hi,
About the problem with "/resources/geometry/motorBike.obj.gz": do you have the FOAM_TUTORIALS environment variable set? The Allrun file starts with

Code:
cp $FOAM_TUTORIALS/resources/geometry/motorBike.obj.gz constant/triSurface/

so if FOAM_TUTORIALS is empty, the copy source degenerates to exactly the "/resources/geometry/..." path in your error.
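You can check this quickly (a sketch):

Code:
echo $FOAM_TUTORIALS                                      # should print the tutorials directory
ls $FOAM_TUTORIALS/resources/geometry/motorBike.obj.gz   # should exist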
mpirun -np 4 "compressibleInterFoam -parallel" > log.compress |
Alexey,

Yes, once I add...

Code:
source /opt/OpenFOAM/OpenFOAM-2.3.x/etc/bashrc

...to my .bashrc, the Allrun script gets going:

Code:
Running surfaceFeatureExtract on /home/Mike/motorBike

...but, as I mentioned, the solvers are no longer found, e.g.:

Code:
icoFoam -help

I then tried adding...

Code:
PATH=/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/cygwin64mingw-w64DPOpt/bin:${PATH}

...and with that, this works:

Code:
icoFoam.exe -help

I'm happy to change/move things around if necessary :)
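For reference, the OpenFOAM environment can confirm where the binaries are expected (a sketch; FOAM_APPBIN is a standard variable set by OpenFOAM's etc/bashrc):

Code:
echo $FOAM_APPBIN          # the platform bin directory the bashrc puts on PATH
ls $FOAM_APPBIN | head     # solver/utility executables; note the .exe suffix on this build
which icoFoam icoFoam.exe  # compare with the path added by hand

If FOAM_APPBIN already matches the directory above, the manual PATH line should be redundant. |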
Hi,
Well, in fact the Allrun script can fail quietly; look in the log files it creates. For example, here is an Allrun script that tries to execute non-existent utilities:

Code:
#!/bin/sh

Running it appears to succeed:

Code:
alexey at daphne in cavity$ ./Allrun

yet the corresponding log file contains:

Code:
$HOME/OpenFOAM/OpenFOAM-2.3.0/bin/tools/RunFunctions: line 52: blockMeshD: command not found

Also, I mixed the messages up a little. In fact this message:

Code:
--> FOAM FATAL ERROR:
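To find which step failed, grepping the case's log files is usually enough (a sketch; log.<application> is the naming used by the tutorial RunFunctions):

Code:
grep -il "error" log.*                  # list the log files that mention an error
tail -n 20 log.surfaceFeatureExtract    # then inspect a suspect one

|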
It looks like a few of the log files have the following error:
Code:
--> FOAM FATAL ERROR:

For background, I installed OpenFOAM following http://www.cfdsupport.com/install-op...r-windows.html ...and also added the nano, openmpi, and libopenmpi-devel packages. |
Hi,
I think this http://www.cfd-online.com/Forums/ope...pport-com.html is somehow relevant to your case (or maybe you have already read it, as it is the third link in a Google search for 'cfdsupport openfoam openmpi'?). My guess is that the problem with the missing libraries is that you are trying to execute the applications in parallel with Open MPI while the applications were compiled against MS MPI.
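One way to confirm which MPI runtime an executable was linked against (a sketch; cygcheck ships with Cygwin and lists a program's DLL dependencies):

Code:
cygcheck $(which compressibleInterFoam.exe) | grep -i mpi
# msmpi.dll in the output means MS MPI; an Open MPI library means the opposite

If it lists msmpi.dll, launching with Open MPI's mpirun would explain the missing-library errors. |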
Alexey,
That was it! (I may or may not have read that post, but at the time I don't think I would have understood the implications of what was mentioned.) I also emailed CFD Support and received some feedback: for this particular package, mpiexec.exe takes the place of mpirun. So, for the depthCharge tutorial, it would be...

Code:
mpiexec.exe -n 4 compressibleInterFoam.exe -parallel > log.compress

Thanks!
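Once the parallel run completes, the processor directories can be merged back for post-processing (standard OpenFOAM workflow; the .exe suffix is an assumption carried over from this build):

Code:
reconstructPar.exe    # recombine processor0..processor3 into a single case

|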
Thanks, it really helped! |