CFD Online Discussion Forums

CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Installation (https://www.cfd-online.com/Forums/openfoam-installation/)
-   -   [Other] blueCFD-Core-2016 user compiled solvers not running in parallel (https://www.cfd-online.com/Forums/openfoam-installation/200437-bluecfd-core-2016-user-compiled-solvers-not-running-parallel.html)

sbence April 3, 2018 07:17

blueCFD-Core-2016 user compiled solvers not running in parallel
 
Dear All,

I am facing an issue where the solvers I compile in blueCFD-Core-2016 do not run in parallel. E.g. running the pitzDaily example with the original simpleFoam solver works:

Code:

$ mpirun -np 4 simpleFoam.exe -parallel
/*---------------------------------------------------------------------------*\
| =========                |                                                |
| \\      /  F ield        | OpenFOAM: The Open Source CFD Toolbox          |
|  \\    /  O peration    | Version:  4.x                                  |
|  \\  /    A nd          | Web:      www.OpenFOAM.org                      |
|    \\/    M anipulation  |                                                |
\*---------------------------------------------------------------------------*/
/*  Windows 32 and 64 bit porting by blueCAPE: http://www.bluecape.com.pt  *\
|  Based on Windows porting (2.0.x v4) by Symscape: http://www.symscape.com  |
\*---------------------------------------------------------------------------*/
Build  : 4.x-f59a08eaaf41
Exec  : simpleFoam.exe -parallel
Date  : Apr 03 2018
Time  : 12:10:01
Host  : "KTMATRDWS635"
PID    : 9668
Case  : E:/TESTS/OPENFOAM/mpi/pitzDaily
nProcs : 4
Slaves :
3
(
"KTMATRDWS635.10192"
"KTMATRDWS635.6884"
"KTMATRDWS635.12208"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0


SIMPLE: convergence criteria
    field p      tolerance 0.01
    field U      tolerance 0.001
    field "(k|epsilon|omega|f|v2)"      tolerance 0.001

Reading field p

Reading field U

Reading/calculating face flux field phi

Selecting incompressible transport model Newtonian
Selecting turbulence model type RAS
Selecting RAS turbulence model kEpsilon
kEpsilonCoeffs
{
    Cmu            0.09;
    C1              1.44;
    C2              1.92;
    C3              0;
    sigmak          1;
    sigmaEps        1.3;
}

No MRF models present

No finite volume options present


Starting time loop

streamLine streamlines:
    automatic track length specified through number of sub cycles : 5

Time = 1

smoothSolver:  Solving for Ux, Initial residual = 1, Final residual = 0.0522766, No Iterations 1
smoothSolver:  Solving for Uy, Initial residual = 1, Final residual = 0.0297192, No Iterations 2
GAMG:  Solving for p, Initial residual = 1, Final residual = 0.09078, No Iterations 17
time step continuity errors : sum local = 1.58846, global = 0.224934, cumulative = 0.224934
smoothSolver:  Solving for epsilon, Initial residual = 0.198074, Final residual = 0.00900106, No Iterations 3
smoothSolver:  Solving for k, Initial residual = 1, Final residual = 0.044153, No Iterations 3
ExecutionTime = 0.093 s  ClockTime = 0 s

Time = 2

smoothSolver:  Solving for Ux, Initial residual = 0.449772, Final residual = 0.0335978, No Iterations 5
smoothSolver:  Solving for Uy, Initial residual = 0.276751, Final residual = 0.026266, No Iterations 4
GAMG:  Solving for p, Initial residual = 0.0514668, Final residual = 0.00405939, No Iterations 16
time step continuity errors : sum local = 3.67803, global = 0.176707, cumulative = 0.401641
smoothSolver:  Solving for epsilon, Initial residual = 0.142642, Final residual = 0.0092437, No Iterations 3
bounding epsilon, min: -103.176 max: 18221.7 average: 81.1715
smoothSolver:  Solving for k, Initial residual = 0.397026, Final residual = 0.0304835, No Iterations 3
ExecutionTime = 0.109 s  ClockTime = 0 s

But if I compile simpleFoam again as testSimpleFoam, then after 10-20 s of waiting I get absolutely no output from the execution:

Code:

user@machine MINGW64 OpenFOAM-4.x /e/TESTS/OPENFOAM/mpi/pitzDaily
$ mpirun -np 4 testSimpleFoam.exe -parallel

user@machine MINGW64 OpenFOAM-4.x /e/TESTS/OPENFOAM/mpi/pitzDaily
$

I see no errors during compilation, and serial execution of testSimpleFoam runs as expected.

Have any of you experienced this issue?

With Best Regards,
Bence

wyldckat April 4, 2018 06:46

Quick answer:
  1. When running from the command line, you must use foamJob, e.g.:
    Code:

    foamJob -p -s testSimpleFoam
    For more details about the options, run:
    Code:

    foamJob -help
  2. Or use the Allrun scripts that OpenFOAM uses as a basis for your cases; namely, for running in parallel, the Allrun scripts use runParallel, e.g.:
    Code:

    runParallel testSimpleFoam

sbence April 4, 2018 07:15

Dear Bruno,

This does not help.
Code:

foamJob -p -s testSimpleFoam
, or
Code:

runParallel testSimpleFoam
has the same effect as
Code:

mpirun -np 4 testSimpleFoam.exe -parallel
In the Task Manager I see the testSimpleFoam processes come up for a few seconds, then they stop without producing any output in the log or on the command line.

BR,
Bence

wyldckat April 4, 2018 08:15

What happens if you simply run:
Code:

testSimpleFoam
Is there any output?

sbence April 4, 2018 08:40

Yes, it runs as expected:
Code:

user@machine MINGW64 OpenFOAM-4.x /e/TESTS/OPENFOAM/mpi/pitzDaily
$ testSimpleFoam.exe
/*---------------------------------------------------------------------------*\
| =========                |                                                |
| \\      /  F ield        | OpenFOAM: The Open Source CFD Toolbox          |
|  \\    /  O peration    | Version:  4.x                                  |
|  \\  /    A nd          | Web:      www.OpenFOAM.org                      |
|    \\/    M anipulation  |                                                |
\*---------------------------------------------------------------------------*/
/*  Windows 32 and 64 bit porting by blueCAPE: http://www.bluecape.com.pt  *\
|  Based on Windows porting (2.0.x v4) by Symscape: http://www.symscape.com  |
\*---------------------------------------------------------------------------*/
Build  : 4.x-f59a08eaaf41
Exec  : C:/PROGRA~1/BLUECF~1/ofuser-of4/platforms/mingw_w64GccDPInt32Opt/bin/testSimpleFoam.exe
Date  : Apr 04 2018
Time  : 13:39:03
Host  : "KTMATRDWS635"
PID    : 7876
Case  : E:/TESTS/OPENFOAM/mpi/pitzDaily
nProcs : 1
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0


SIMPLE: convergence criteria
    field p      tolerance 0.01
    field U      tolerance 0.001
    field "(k|epsilon|omega|f|v2)"      tolerance 0.001

Reading field p

Reading field U

Reading/calculating face flux field phi

Selecting incompressible transport model Newtonian
Selecting turbulence model type RAS
Selecting RAS turbulence model kEpsilon
kEpsilonCoeffs
{
    Cmu            0.09;
    C1              1.44;
    C2              1.92;
    C3              0;
    sigmak          1;
    sigmaEps        1.3;
}

No MRF models present

No finite volume options present


Starting time loop

streamLine streamlines:
    automatic track length specified through number of sub cycles : 5

Time = 1

smoothSolver:  Solving for Ux, Initial residual = 1, Final residual = 0.0522766, No Iterations 1
smoothSolver:  Solving for Uy, Initial residual = 1, Final residual = 0.0297192, No Iterations 2
GAMG:  Solving for p, Initial residual = 1, Final residual = 0.0953742, No Iterations 22
time step continuity errors : sum local = 1.66885, global = -0.318567, cumulative = -0.318567
smoothSolver:  Solving for epsilon, Initial residual = 0.198029, Final residual = 0.00891435, No Iterations 3
smoothSolver:  Solving for k, Initial residual = 1, Final residual = 0.0434833, No Iterations 3
ExecutionTime = 0.314 s  ClockTime = 0 s

Time = 2


wyldckat April 4, 2018 09:53

Greetings Bence,

OK, now I remember what might be causing this problem. It's possible that you have at least 2 MPI libraries installed, and the one installed in your system conflicts with the one that blueCFD-Core uses.

If you run the following commands:
Code:

where msmpi.dll
where msmpi.dll | sed -e 's=\\=/=g' | xargs -I {} ls -l "{}"

You should get something like this:
Code:

C:\Program Files\blueCFD-Core-2016\ThirdParty-4.x\platforms\mingw_w64Gcc\MS-MPI-7.1\bin\msmpi.dll
C:\Windows\System32\msmpi.dll

-rwxr-xr-x 1 Bruno Santos None 1300688 Jul 27  2016 C:/Program  Files/blueCFD-Core-2016/ThirdParty-4.x/platforms/mingw_w64Gcc/MS-MPI-7.1/bin/msmpi.dll
-rwxr-xr-x 1 Bruno Santos None 1300688 Jun 13  2016 C:/Windows/System32/msmpi.dll

In my case, the two files have exactly the same size (and similar dates), which means they are essentially the same version of MS-MPI.

In your case, I suspect that the second one has a different size, implying that it's another version.

The solution for this is explained on this FAQ: http://bluecfd.github.io/Core/FAQ/mp...y-the-dll-file - see section "Solution 1 - Simply copy the DLL file".
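For reference, the gist of that workaround can be sketched as follows. This is an illustration, not the FAQ's exact instructions: it assumes the conflict is resolved by Windows' DLL search order, which checks the executable's own folder before System32, so placing blueCFD-Core's bundled msmpi.dll next to the user-compiled binaries forces a consistent version. The helper function name and both paths are hypothetical; adjust them to your install.

```shell
# Sketch of the "copy the DLL" workaround (assumed mechanism: Windows
# searches the executable's own directory first when resolving DLLs).
# copy_bundled_msmpi is a hypothetical helper, not a blueCFD command.
copy_bundled_msmpi() {
    src="$1"  # e.g. the bundled ThirdParty .../MS-MPI-7.1/bin/msmpi.dll
    dst="$2"  # e.g. the user binaries folder that holds testSimpleFoam.exe
    cp "$src" "$dst" && echo "copied $(basename "$src") to $dst"
}
```

In a blueCFD terminal you would presumably call it with the usual OpenFOAM environment variables, e.g. `copy_bundled_msmpi "$WM_THIRD_PARTY_DIR/platforms/mingw_w64Gcc/MS-MPI-7.1/bin/msmpi.dll" "$FOAM_USER_APPBIN"` — check that those variables are set in your shell before relying on them.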


If this doesn't solve the issue, then the other possibility is that the Windows Firewall might be getting in the way, although that would be a bit strange, because if that were the case, Windows should have complained about it and asked for permissions...


On the other hand... I've just now gone through the same steps you described, but I got the following error message:
Code:

ERROR: Failed RpcCliCreateContext error 1722

Aborting: mpiexec on MACHINE_NAME is unable to connect to the smpd service on machine_name:8677
Other MPI error, error stack:
connect failed - The RPC server is unavailable.  (errno 1722)

But this is yet another problem...
Edit: OK, the problem was that I had an old "machines" file that was referencing a machine that isn't turned on, hence the problem.


Please let me know if the descriptions above help and/or if you now get this same error message.

Best regards,
Bruno

sbence April 4, 2018 11:02

Hello Bruno,

Thanks, copying the dll solved the issue!

I have seen this FAQ, but it did not look relevant. :(
Sorry about that!

BR,
Bence

wyldckat April 4, 2018 11:30

I'm very glad that solved the problem! I don't remember ever getting the symptom you had, which is why I didn't think of it at first either.

I've created a report so that this situation gets documented sometime in the near future: https://github.com/blueCFD/Core/issues/95

malv83 December 3, 2018 12:32

parallel compilation with BlueCFD-core 2017
 
Hi Bruno, I have a question.

I have used the blueCFD terminal to compile C++ code before (codes independent of OpenFOAM) using:

Code:

c++ myCode.c
Now I was wondering if I can compile parallel code using the blueCFD terminal. I have tried:

Code:

mpic++ myParallelCode.c
mpicxx myParallelCode.c
but I have not had success. Is there a command in MS-MPI (like mpic++, for example) to compile parallel code? Is there a way to compile a standalone parallel code with the blueCFD terminal?

Thanks.

wyldckat December 4, 2018 06:10

Quick answer:
  1. In the folder "applications/test/parallelMin" you will find a minimal source code example for testing whether MPI is working properly, without needing OpenFOAM's libraries.
  2. You can also find it online here: https://github.com/blueCFD/OpenFOAM-...st/parallelMin
  3. If you run wmake inside that folder, it will tell you the complete command used for compiling with MS-MPI.
  4. The key detail is that you need to include the path to the MPI interface libraries and the "-lmpi" link option, instead of using mpicc or similar.
What am I thinking... OK, here are the commands I used on blueCFD-Core's terminal:
Code:

app
cd test/parallelMin
wmake

It gave me the following output:
Code:

Making dependency list for source file Test-parallelMin.C
$(/home/ofuser/blueCFD/OpenFOAM-5.x/wmake/scripts/makeReinterpretExePath x86_64-w64-mingw32-g++) -std=c++11 -Dmingw_w64 -DWM_ARCH_OPTION=64 -DWM_DP -DWM_LABEL_SIZE=32 -DWIN64 -DLITTLE_ENDIAN -DWIN64 -DLITTLE_ENDIAN -Wall -Wextra -Wold-style-cast -Wnon-virtual-dtor -Wno-unused-parameter -Wno-invalid-offsetof -O2 -DNDEBUG -gdwarf  -DNoRepository -ftemplate-depth-100 -D_FILE_OFFSET_BITS=64 -D_MODE_T_  -I/home/ofuser/blueCFD/ThirdParty-5.x/platforms/mingw_w64Gcc/MS-MPI-7.1/include @/home/ofuser/blueCFD/OpenFOAM-5.x/platforms/mingw_w64GccDPInt32Opt/applications/test/parallelMin/includeHeaderPaths -IlnInclude -I. -I/home/ofuser/blueCFD/OpenFOAM-5.x/src/OpenFOAM/lnInclude -I/home/ofuser/blueCFD/OpenFOAM-5.x/src/OSspecific/MSwindows/lnInclude  -c Test-parallelMin.C -o D:/DEVELO~1/Core/BLUECF~2/OpenFOAM-5.x/platforms/mingw_w64GccDPInt32Opt/applications/test/parallelMin/Test-parallelMin.o
In file included from Test-parallelMin.C:37:0:
Test-parallelMin.C: In function 'int main(int, char**)':
D:/DEVELO~1/Core/BLUECF~2/ThirdParty-5.x/platforms/mingw_w64Gcc/MS-MPI-7.1/include/mpi.h:367:35: warning: use of old-style cast [-Wold-style-cast]
 #define MPI_COMM_WORLD ((MPI_Comm)0x44000000)
                                  ^
Test-parallelMin.C:44:17: note: in expansion of macro 'MPI_COMM_WORLD'
  MPI_Comm_size(MPI_COMM_WORLD, &numprocs);
                ^~~~~~~~~~~~~~
D:/DEVELO~1/Core/BLUECF~2/ThirdParty-5.x/platforms/mingw_w64Gcc/MS-MPI-7.1/include/mpi.h:367:35: warning: use of old-style cast [-Wold-style-cast]
 #define MPI_COMM_WORLD ((MPI_Comm)0x44000000)
                                  ^
Test-parallelMin.C:45:17: note: in expansion of macro 'MPI_COMM_WORLD'
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
                ^~~~~~~~~~~~~~
$(/home/ofuser/blueCFD/OpenFOAM-5.x/wmake/scripts/makeReinterpretExePath windres) /home/ofuser/blueCFD/OpenFOAM-5.x/platforms/mingw_w64GccDPInt32Opt/applications/test/parallelMin/version_of_build.rc /home/ofuser/blueCFD/OpenFOAM-5.x/platforms/mingw_w64GccDPInt32Opt/applications/test/parallelMin/version_of_build.o
$(/home/ofuser/blueCFD/OpenFOAM-5.x/wmake/scripts/makeReinterpretExePath x86_64-w64-mingw32-g++) -std=c++11 -Dmingw_w64 -DWM_ARCH_OPTION=64 -DWM_DP -DWM_LABEL_SIZE=32 -DWIN64 -DLITTLE_ENDIAN -DWIN64 -DLITTLE_ENDIAN -Wall -Wextra -Wold-style-cast -Wnon-virtual-dtor -Wno-unused-parameter -Wno-invalid-offsetof -O2 -DNDEBUG -gdwarf  -DNoRepository -ftemplate-depth-100 -D_FILE_OFFSET_BITS=64 -D_MODE_T_  -I/home/ofuser/blueCFD/ThirdParty-5.x/platforms/mingw_w64Gcc/MS-MPI-7.1/include @/home/ofuser/blueCFD/OpenFOAM-5.x/platforms/mingw_w64GccDPInt32Opt/applications/test/parallelMin/includeHeaderPaths -IlnInclude -I. -I/home/ofuser/blueCFD/OpenFOAM-5.x/src/OpenFOAM/lnInclude -I/home/ofuser/blueCFD/OpenFOAM-5.x/src/OSspecific/MSwindows/lnInclude  -Wl,--enable-auto-import,--force-exe-suffix @/home/ofuser/blueCFD/OpenFOAM-5.x/platforms/mingw_w64GccDPInt32Opt/applications/test/parallelMin/objectList -L/home/ofuser/blueCFD/OpenFOAM-5.x/platforms/mingw_w64GccDPInt32Opt/lib \
    -L/home/ofuser/blueCFD/ThirdParty-5.x/platforms/mingw_w64Gcc/MS-MPI-7.1/lib -lmpi  \
    -lm -o /home/ofuser/blueCFD/ofuser-of5/platforms/mingw_w64GccDPInt32Opt/bin/Test-parallelMin_MSMPI71.exe
/home/ofuser/blueCFD/ThirdParty-5.x/cv2pdb/cv2pdb.exe /home/ofuser/blueCFD/ofuser-of5/platforms/mingw_w64GccDPInt32Opt/bin/Test-parallelMin_MSMPI71.exe || \
        (strip --strip-unneeded /home/ofuser/blueCFD/ofuser-of5/platforms/mingw_w64GccDPInt32Opt/bin/Test-parallelMin_MSMPI71.exe; \
        /home/ofuser/blueCFD/OpenFOAM-5.x/wmake/scripts/infoCV2PDBworkaround)


OK, pretty lengthy and hard to sift through, but here is the summary command you need, based on your example:
Code:

g++  -I/home/ofuser/blueCFD/ThirdParty-5.x/platforms/mingw_w64Gcc/MS-MPI-7.1/include -L/home/ofuser/blueCFD/ThirdParty-5.x/platforms/mingw_w64Gcc/MS-MPI-7.1/lib myParallelCode.c -lmpi
Oh, I had forgotten to mention that we also need to include the path to "mpi.h", which is what the "-I" option is for...

And yes, the "-lmpi" has to come last, so that the linker can resolve the dependencies as it encounters them.
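For completeness, a standalone program to try that command on could look like the sketch below. This is a generic MPI "hello" in the spirit of the parallelMin test, not the actual Test-parallelMin.C source; it only assumes the standard MPI C API (MPI_Init, MPI_Comm_size, MPI_Comm_rank, MPI_Finalize) and needs an MPI toolchain to build.

```c
/* Minimal standalone MPI check: each rank reports its number.
 * Generic sketch, not the actual Test-parallelMin.C source.
 * Build with the g++/-I/-L/-lmpi command shown above. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int numprocs, rank;

    MPI_Init(&argc, &argv);                   /* start the MPI runtime */
    MPI_Comm_size(MPI_COMM_WORLD, &numprocs); /* total number of ranks */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);     /* this process's rank   */

    printf("rank %d of %d\n", rank, numprocs);

    MPI_Finalize();                           /* shut down cleanly */
    return 0;
}
```

Running the resulting executable under `mpirun -np 4` should print one line per rank; if it hangs or exits silently instead, you are likely hitting the same msmpi.dll mismatch discussed earlier in this thread.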

malv83 December 5, 2018 09:44

It worked. I was able to compile and run my code in parallel from the blueCFD-Core terminal.

Thanks Bruno.

