CFD Online Discussion Forums

CFD Online Discussion Forums (http://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (http://www.cfd-online.com/Forums/openfoam-solving/)
-   -   OpenFOAM 14 parallel troubles (http://www.cfd-online.com/Forums/openfoam-solving/59589-openfoam-14-parallel-troubles.html)

msrinath80 May 29, 2007 03:35

When I recompile OpenFOAM 1.4 and run decomposePar, I get the following warnings. I have defined probe points in controlDict.

Can I ignore these warnings:

[madhavan@head02 icoFoam]$ decomposePar . cavity_parallel
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  1.4                                   |
|   \\  /    A nd           | Web:      http://www.openfoam.org               |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/

Exec : decomposePar . cavity_parallel
Date : May 29 2007
Time : 01:33:05
Host : head02.cein.ualberta.ca
PID : 17797
Root : /amd/cein/homes/1/d/madhavan/OpenFOAM/madhavan-1.4/run/tutorials/icoFoam
Case : cavity_parallel
Nprocs : 1
Create time

Selecting function probes
Time = 0
Create mesh


Calculating distribution of cells
Selecting decompositionMethod metis


Finished decomposition in 0 s

Calculating original mesh data

Distributing cells to processors

Distributing faces to processors

Calculating processor boundary addressing

Distributing points to processors

Constructing processor meshes
Selecting function probes
--> FOAM Warning :
From function dlLibraryTable::open(const dictionary& dict, const word& libsEntry, const TablePtr tablePtr)
in file lnInclude/dlLibraryTableTemplates.C at line 67
library "libsampling.so" did not introduce any new entries


Processor 0
Number of cells = 199
Number of faces shared with processor 1 = 24
Number of processor patches = 1
Number of processor faces = 24
Number of boundary faces = 438
Selecting function probes
--> FOAM Warning :
From function dlLibraryTable::open(const dictionary& dict, const word& libsEntry, const TablePtr tablePtr)
in file lnInclude/dlLibraryTableTemplates.C at line 67
library "libsampling.so" did not introduce any new entries


Processor 1
Number of cells = 201
Number of faces shared with processor 0 = 24
Number of processor patches = 1
Number of processor faces = 24
Number of boundary faces = 442

Number of processor faces = 24
Max number of processor patches = 1
Max number of faces between processors = 24

Processor 0: field transfer
Selecting function probes
--> FOAM Warning :
From function dlLibraryTable::open(const dictionary& dict, const word& libsEntry, const TablePtr tablePtr)
in file lnInclude/dlLibraryTableTemplates.C at line 67
library "libsampling.so" did not introduce any new entries

Processor 1: field transfer
Selecting function probes
--> FOAM Warning :
From function dlLibraryTable::open(const dictionary& dict, const word& libsEntry, const TablePtr tablePtr)
in file lnInclude/dlLibraryTableTemplates.C at line 67
library "libsampling.so" did not introduce any new entries


End.

[madhavan@head02 icoFoam]$

msrinath80 May 30, 2007 00:52

This appears to be a bug in the OpenFOAM 1.4 build downloaded from SourceForge (64-bit version). When you have probes defined in controlDict and you try to decompose a case, you get these warnings. Also, after the parallel simulation, one is unable to launch paraFoam. Perhaps this is a bug that was overlooked. Can someone confirm please?

nikos May 30, 2007 03:40

I've got the same message before running the case.
In my case the probes directory is created but there is nothing inside it.

Concerning paraFoam I'm not sure, because I'm using VisIt, but you must remove the probes function from the controlDict file.
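For reference, the entry I mean is the probes function object in controlDict. In 1.4 it looks roughly like the sketch below; the probe location and field names are only illustrative, and the exact keyword names may differ slightly between versions:

functions
(
    probes1
    {
        // the sampling library whose loading produces the warning above
        type                 probes;
        functionObjectLibs   ("libsampling.so");

        // illustrative probe point and sampled fields
        probeLocations
        (
            (0.05 0.05 0.005)
        );
        fields
        (
            p
            U
        );
    }
);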

msrinath80 May 30, 2007 03:52

I will post this again on the OpenFOAM-Bugs forum. Maybe Henry can look at it. Thanks for confirming my observation.

knabhishek July 3, 2007 10:00

Hi

When I run the case in serial it does not give any error,
but I get the following error when running in parallel:

[5] #0 Foam::error::printStack(Foam::Ostream&)[6] #0 Foam::error::printStack(Foam::Ostream&)
[5] #1 Foam::sigFpe::sigFpeHandler(int)
[6] #1 Foam::sigFpe::sigFpeHandler(int)
[5] #2
[6] #2 [4] #0 Foam::error::printStack(Foam::Ostream&)__restore_rt__restore_rt
[5] #3 void Foam::processorLduInterface::compressedSend<double >(Foam::UList<double> const&, bool) const
[6] #3 void Foam::processorLduInterface::compressedSend<double >(Foam::UList<double> const&, bool) const
[4] #1 Foam::sigFpe::sigFpeHandler(int)
[6] #4 Foam::processorFvPatchField<foam::vector<double> >::initInterfaceMatrixUpdate(Foam::Field<double> const&, Foam::Field<double>&, Foam::lduMatrix const&, Foam::Field<double> const&, unsigned char, bool) const
[5] #4 Foam::processorFvPatchField<foam::vector<double> >::initInterfaceMatrixUpdate(Foam::Field<double> const&, Foam::Field<double>&, Foam::lduMatrix const&, Foam::Field<double> const&, unsigned char, bool) const
[4] #2 __restore_rt
[4] #3 void Foam::processorLduInterface::compressedSend<double >(Foam::UList<double> const&, bool) const
[5] #5 Foam::lduMatrix::initMatrixInterfaces(Foam::FieldField<foam::field,> const&, Foam::UPtrList<foam::lduinterfacefield> const&, Foam::Field<double> const&, Foam::Field<double>&, unsigned char) const
[6] #5 Foam::lduMatrix::initMatrixInterfaces(Foam::FieldField<foam::field,> const&, Foam::UPtrList<foam::lduinterfacefield> const&, Foam::Field<double> const&, Foam::Field<double>&, unsigned char) const
[6] #6 Foam::lduMatrix::Tmul(Foam::Field<double>&, Foam::tmp<foam::field<double> > const&, Foam::FieldField<foam::field,> const&, Foam::UPtrList<foam::lduinterfacefield> const&, unsigned char) const
[5] #6 Foam::lduMatrix::Tmul(Foam::Field<double>&, Foam::tmp<foam::field<double> > const&, Foam::FieldField<foam::field,> const&, Foam::UPtrList<foam::lduinterfacefield> const&, unsigned char) const
[6] #7 Foam::PBiCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const
[5] #7 Foam::PBiCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const
[4] #4 Foam::processorFvPatchField<foam::vector<double> >::initInterfaceMatrixUpdate(Foam::Field<double> const&, Foam::Field<double>&, Foam::lduMatrix const&, Foam::Field<double> const&, unsigned char, bool) const

[6] #8 [5] #8
[4] #5 Foam::lduMatrix::initMatrixInterfaces(Foam::FieldField<foam::field,> const&, Foam::UPtrList<foam::lduinterfacefield> const&, Foam::Field<double> const&, Foam::Field<double>&, unsigned char) const
[4] #6 Foam::lduMatrix::Tmul(Foam::Field<double>&, Foam::tmp<foam::field<double> > const&, Foam::FieldField<foam::field,> const&, Foam::UPtrList<foam::lduinterfacefield> const&, unsigned char) constFoam::fvMatrix<foam::vector<double> >::solve(Foam::Istream&)Foam::fvMatrix<foam::vector<double> >::solve(Foam::Istream&)
[4] #7 Foam::PBiCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const
[6] #9
[5] #9
[4] #8 Foam::lduMatrix::solverPerformance Foam::solve<foam::vector<double> >(Foam::tmp<foam::fvmatrix<foam::vector<double> > > const&)Foam::lduMatrix::solverPerformance Foam::solve<foam::vector<double> >(Foam::tmp<foam::fvmatrix<foam::vector<double> > > const&)Foam::fvMatrix<foam::vector<double> >::solve(Foam::Istream&)
[4] #9
[6] #10
[5] #10 Foam::lduMatrix::solverPerformance Foam::solve<foam::vector<double> >(Foam::tmp<foam::fvmatrix<foam::vector<double> > > const&)mainmain
[4] #10 main
[6] #11 __libc_start_main
[5] #11 __libc_start_main
[4] #11 __libc_start_main
[6] #12 __gxx_personality_v0
[4] #12 __gxx_personality_v0
[5] #12 __gxx_personality_v0 at ../sysdeps/x86_64/elf/start.S:116
-----------------------------------------------------------------------------
One of the processes started by mpirun has exited with a nonzero exit
code. This typically indicates that the process finished in error.
If your process did not finish in error, be sure to include a "return
0" or "exit(0)" in your C code before exiting the application.

PID 7314 failed on node n2 (172.28.0.125) due to signal 8.
-----------------------------------------------------------------------------

msrinath80 July 3, 2007 11:39

How exactly are you running the parallel simulation? Is this stock OF 1.4 or did you build from source?

knabhishek July 3, 2007 11:44

Hi

Sorry, I could not understand your question: what do you mean by stock OF 1.4 versus building from source?

msrinath80 July 4, 2007 02:08

Stock OpenFOAM 1.4 refers to the precompiled version distributed by OpenCFD.

knabhishek July 4, 2007 04:10

Hi

I used the precompiled version of OpenFOAM 1.4.

msrinath80 July 4, 2007 07:25

Ahem, OK. What about the environment variable WM_64: did you set it to on? Can you also post the exact sequence of commands you used to start the parallel run? Also, is this SMP hardware or distributed nodes with an interconnect?
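For example, with LAM/MPI and OpenFOAM 1.4's root/case command-line arguments, the usual sequence looks roughly like the sketch below; the machines file, processor count and solver name are only illustrative and need to match your own decomposeParDict and cluster:

decomposePar . yourCase
lamboot -v machines
mpirun -np 4 icoFoam . yourCase -parallel
lamhalt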

knabhishek July 4, 2007 08:40

Hi,

Surprisingly, I ran it again and it is working now.
The first time it stopped after 35 seconds, but now
it has reached 108 seconds without any problem.

One more surprise: WM_64 was not 'on', but it still chose the 64-bit version.

I use lamboot and the MPI run type to run in parallel.

It is a distributed-node setup with an interconnect.

msrinath80 July 4, 2007 09:18

I faced a similar problem a while ago. Check this[1] post to see the mpirun command syntax that solved the issue.

[1] http://www.cfd-online.com/OpenFOAM_D...tml?1154110876

knabhishek July 4, 2007 11:24

Hi

Case description

Mesh size: 1.2 million cells
Time step: 0.01
Under-relaxation: 0.1
Standard k-epsilon model
Upwind schemes for the divergence terms
Non-orthogonal correctors: 10

The mesh was imported from STAR-CCM+; checkMesh reported
some highly skewed, non-orthogonal cells.
I still ran the case, with the relevant changes to mpirun made as you suggested.

I still get this error.

What could be the mistake: the mesh, or the way the mpi command is used?

If it has to do with the mesh, is there any cleaning tool available in OF, or should I get the mesh corrected in STAR?

Time = 0.09

DILUPBiCG: Solving for Ux, Initial residual = 0.0629195, Final residual = 4.30642e-11, No Iterations 3
DILUPBiCG: Solving for Uy, Initial residual = 0.0580816, Final residual = 6.80881e-11, No Iterations 3
DILUPBiCG: Solving for Uz, Initial residual = 0.0694269, Final residual = 1.07529e-10, No Iterations 3
DICPCG: Solving for p, Initial residual = 0.597653, Final residual = 9.51117e-09, No Iterations 262
DICPCG: Solving for p, Initial residual = 0.149701, Final residual = 9.49542e-09, No Iterations 250
DICPCG: Solving for p, Initial residual = 0.0316422, Final residual = 9.67963e-09, No Iterations 237
DICPCG: Solving for p, Initial residual = 0.00462459, Final residual = 8.81629e-09, No Iterations 208
DICPCG: Solving for p, Initial residual = 0.000570972, Final residual = 9.57096e-09, No Iterations 184
DICPCG: Solving for p, Initial residual = 8.75491e-05, Final residual = 9.31579e-09, No Iterations 160
DICPCG: Solving for p, Initial residual = 2.12035e-05, Final residual = 9.92397e-09, No Iterations 63
DICPCG: Solving for p, Initial residual = 7.77277e-06, Final residual = 9.97595e-09, No Iterations 29
DICPCG: Solving for p, Initial residual = 4.64155e-06, Final residual = 9.72118e-09, No Iterations 15
DICPCG: Solving for p, Initial residual = 2.84015e-06, Final residual = 9.60676e-09, No Iterations 15
DICPCG: Solving for p, Initial residual = 1.965e-06, Final residual = 9.74522e-09, No Iterations 10
DICPCG: Solving for p, Initial residual = 1.38749e-06, Final residual = 9.83813e-09, No Iterations 9
DICPCG: Solving for p, Initial residual = 1.07198e-06, Final residual = 8.83078e-09, No Iterations 9
DICPCG: Solving for p, Initial residual = 8.83087e-07, Final residual = 9.97312e-09, No Iterations 6
DICPCG: Solving for p, Initial residual = 7.93485e-07, Final residual = 9.5128e-09, No Iterations 8
DICPCG: Solving for p, Initial residual = 7.67152e-07, Final residual = 9.17566e-09, No Iterations 7
time step continuity errors : sum local = 2605.87, global = 172.42, cumulative = 172.42
DILUPBiCG: Solving for epsilon, Initial residual = 0.999997, Final residual = 4.04242e-09, No Iterations 3
bounding epsilon, min: 0.0156719 max: 6.08811e+19 average: 3.20073e+14
DILUPBiCG: Solving for k, Initial residual = 1, Final residual = 1.50261e-09, No Iterations 3
ExecutionTime = 256.63 s ClockTime = 732 s

Time = 0.1

[6] #0 Foam::error::printStack(Foam::Ostream&)
[6] #1 Foam::sigFpe::sigFpeHandler(int)
[6] #2 __restore_rt
[6] #3 void Foam::processorLduInterface::compressedSend<foam::vector<double> >(Foam::UList<foam::vector<double> > const&, bool) const
[6] #4 Foam::processorFvPatchField<foam::vector<double> >::initEvaluate(bool)
[6] #5 Foam::GeometricField<foam::vector<double>, Foam::fvPatchField, Foam::volMesh>::GeometricBoundaryField::evaluate()
[6] #6 Foam::tmp<foam::geometricfield<foam::vector<double >, Foam::fvPatchField, Foam::volMesh> > Foam::fvc::surfaceIntegrate<foam::vector<double> >(Foam::GeometricField<foam::vector<double>, Foam::fvPatchField, Foam::surfaceMesh> const&)
[6] #7 Foam::tmp<foam::geometricfield<foam::vector<double>, Foam::fvPatchField, Foam::volMesh> > Foam::fvc::surfaceIntegrate<foam::vector<double> >(Foam::tmp<foam::geometricfield<foam::vector<double>, Foam::fvPatchField, Foam::surfaceMesh> > const&)
[6] #8 Foam::fv::gaussDivScheme<foam::tensor<double> >::fvcDiv(Foam::GeometricField<foam::tensor<double >, Foam::fvPatchField, Foam::volMesh> const&)
[6] #9 Foam::tmp<foam::geometricfield<foam::innerproduct< foam::vector<double>, Foam::Tensor<double> >::type, Foam::fvPatchField, Foam::volMesh> > Foam::fvc::div<foam::tensor<double> >(Foam::GeometricField<foam::tensor<double>, Foam::fvPatchField, Foam::volMesh> const&, Foam::word const&)
[6] #10 Foam::tmp<foam::geometricfield<foam::innerproduct< foam::vector<double>, Foam::Tensor<double> >::type, Foam::fvPatchField, Foam::volMesh> > Foam::fvc::div<foam::tensor<double> >(Foam::GeometricField<foam::tensor<double>, Foam::fvPatchField, Foam::volMesh> const&)
[6] #11 Foam::tmp<foam::geometricfield<foam::innerproduct< foam::vector<double>, Foam::Tensor<double> >::type, Foam::fvPatchField, Foam::volMesh> > Foam::fvc::div<foam::tensor<double> >(Foam::tmp<foam::geometricfield<foam::tensor<double>, Foam::fvPatchField, Foam::volMesh> > const&)
[6] #12 Foam::turbulenceModels::kEpsilon::divR(Foam::GeometricField<foam::vector<double>, Foam::fvPatchField, Foam::volMesh>&) const
[6] #13 main
[6] #14 __libc_start_main
[6] #15 __gxx_personality_v0 at ../sysdeps/x86_64/elf/start.S:116
-----------------------------------------------------------------------------
One of the processes started by mpirun has exited with a nonzero exit
code. This typically indicates that the process finished in error.
If your process did not finish in error, be sure to include a "return
0" or "exit(0)" in your C code before exiting the application.

PID 24663 failed on node n3 (172.28.0.120) due to signal 8.
-----------------------------------------------------------------------------

msrinath80 July 4, 2007 13:45

Can you post the full checkMesh output here?
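For reference, checkMesh takes the same root and case arguments as the other 1.4 utilities, so something like the line below, run from the directory above the case, should print the full report (the case name is illustrative):

checkMesh . yourCase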

msrinath80 July 5, 2007 04:51

I think that explains it all. Even if you ignored the warnings, you have a serious error to rectify. Despite your observation that there are no problems when running in serial mode, my suggestion is to work on resolving that error before trying again. In any case, with a mesh that bad (i.e. high non-orthogonality and skewness), I doubt the solvers will do a decent job, or that your final CFD solution will be very meaningful. I have read in the forums, and it is also my personal experience, that OpenFOAM does a very nice job in parallel given a decent interconnect and case setup (which includes the mesh definition/discretization et al.). If all you want to do is check your parallel setup, then try one of the example cases (e.g. damBreak); see the sketch below. Hope that helps.
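A quick parallel sanity check with the damBreak tutorial looks roughly like this under 1.4. The tutorial path, processor count and machines file below are assumptions based on a default install, and the numberOfSubdomains entry in the case's system/decomposeParDict must match the -np value:

cd ~/OpenFOAM/<user>-1.4/run/tutorials/interFoam
blockMesh . damBreak
setFields . damBreak
decomposePar . damBreak
lamboot -v machines
mpirun -np 4 interFoam . damBreak -parallel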

