Home > Forums > OpenFOAM Running, Solving & CFD

OpenFOAM 1.4 parallel troubles

Post #1 | May 29, 2007, 03:35
Srinath Madhavan (a.k.a pUl|, msrinath80), Senior Member, Edmonton, AB, Canada

When I recompile OpenFOAM 1.4 and run decomposePar, I get the following warnings. I have defined probe points in controlDict.

Can I ignore these warnings?

[madhavan@head02 icoFoam]$ decomposePar . cavity_parallel
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  1.4                                   |
|   \\  /    A nd           | Web:      http://www.openfoam.org               |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/

Exec : decomposePar . cavity_parallel
Date : May 29 2007
Time : 01:33:05
Host : head02.cein.ualberta.ca
PID : 17797
Root : /amd/cein/homes/1/d/madhavan/OpenFOAM/madhavan-1.4/run/tutorials/icoFoam
Case : cavity_parallel
Nprocs : 1
Create time

Selecting function probes
Time = 0
Create mesh


Calculating distribution of cells
Selecting decompositionMethod metis


Finished decomposition in 0 s

Calculating original mesh data

Distributing cells to processors

Distributing faces to processors

Calculating processor boundary addressing

Distributing points to processors

Constructing processor meshes
Selecting function probes
--> FOAM Warning :
From function dlLibraryTable::open(const dictionary& dict, const word& libsEntry, const TablePtr tablePtr)
in file lnInclude/dlLibraryTableTemplates.C at line 67
library "libsampling.so" did not introduce any new entries


Processor 0
Number of cells = 199
Number of faces shared with processor 1 = 24
Number of processor patches = 1
Number of processor faces = 24
Number of boundary faces = 438
Selecting function probes
--> FOAM Warning :
From function dlLibraryTable::open(const dictionary& dict, const word& libsEntry, const TablePtr tablePtr)
in file lnInclude/dlLibraryTableTemplates.C at line 67
library "libsampling.so" did not introduce any new entries


Processor 1
Number of cells = 201
Number of faces shared with processor 0 = 24
Number of processor patches = 1
Number of processor faces = 24
Number of boundary faces = 442

Number of processor faces = 24
Max number of processor patches = 1
Max number of faces between processors = 24

Processor 0: field transfer
Selecting function probes
--> FOAM Warning :
From function dlLibraryTable::open(const dictionary& dict, const word& libsEntry, const TablePtr tablePtr)
in file lnInclude/dlLibraryTableTemplates.C at line 67
library "libsampling.so" did not introduce any new entries

Processor 1: field transfer
Selecting function probes
--> FOAM Warning :
From function dlLibraryTable::open(const dictionary& dict, const word& libsEntry, const TablePtr tablePtr)
in file lnInclude/dlLibraryTableTemplates.C at line 67
library "libsampling.so" did not introduce any new entries


End.

[madhavan@head02 icoFoam]$
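For reference, the probes setup in my controlDict is along these lines (a sketch of the OF 1.4 function-object syntax; the probe location and field names here are illustrative, not the exact values from my case):

```
functions
(
    probes
    {
        // probes function object from the sampling library
        type                probes;
        functionObjectLibs  ("libsampling.so");

        // probe locations (illustrative values)
        probeLocations
        (
            (0.05 0.05 0.005)
        );

        // fields to sample at each probe point
        fields
        (
            p
            U
        );
    }
);
```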

Post #2 | May 30, 2007, 00:52
Srinath Madhavan (a.k.a pUl|, msrinath80), Senior Member, Edmonton, AB, Canada

This appears to be a bug in the OpenFOAM 1.4 downloaded from SourceForge (64-bit version). When you have probes defined in controlDict and you try to decompose a case, you get these warnings. Also, after the parallel simulation, one is unable to launch paraFoam. Perhaps this is a bug that was overlooked. Can someone please confirm?

Post #3 | May 30, 2007, 03:40
Nicolas Coste (nikos), New Member, Marseilles, France

I've got the same message before running the case. In my case the probes directory is created, but there is nothing inside.

Concerning paraFoam I'm not sure, because I'm using VisIt, but you must remove the probes function from the controlDict file.

Post #4 | May 30, 2007, 03:52
Srinath Madhavan (a.k.a pUl|, msrinath80), Senior Member, Edmonton, AB, Canada

I will post this again on the OpenFOAM-Bugs forum. Maybe Henry can look at it. Thanks for confirming my observation.

Post #5 | July 3, 2007, 10:00
abhishek k n (knabhishek), New Member, Gothenburg, Sweden

Hi

When I run the case in serial it does not give any error, but in parallel I get this error:

[5] #0 Foam::error::printStack(Foam::Ostream&) [6] #0 Foam::error::printStack(Foam::Ostream&)
[5] #1 Foam::sigFpe::sigFpeHandler(int)
[6] #1 Foam::sigFpe::sigFpeHandler(int)
[5] #2
[6] #2 [4] #0 Foam::error::printStack(Foam::Ostream&) __restore_rt __restore_rt
[5] #3 void Foam::processorLduInterface::compressedSend<double>(Foam::UList<double> const&, bool) const
[6] #3 void Foam::processorLduInterface::compressedSend<double>(Foam::UList<double> const&, bool) const
[4] #1 Foam::sigFpe::sigFpeHandler(int)
[6] #4 Foam::processorFvPatchField<Foam::vector<double> >::initInterfaceMatrixUpdate(Foam::Field<double> const&, Foam::Field<double>&, Foam::lduMatrix const&, Foam::Field<double> const&, unsigned char, bool) const
[5] #4 Foam::processorFvPatchField<Foam::vector<double> >::initInterfaceMatrixUpdate(Foam::Field<double> const&, Foam::Field<double>&, Foam::lduMatrix const&, Foam::Field<double> const&, unsigned char, bool) const
[4] #2 __restore_rt
[4] #3 void Foam::processorLduInterface::compressedSend<double>(Foam::UList<double> const&, bool) const
[5] #5 Foam::lduMatrix::initMatrixInterfaces(Foam::FieldField<Foam::Field, double> const&, Foam::UPtrList<Foam::lduInterfaceField> const&, Foam::Field<double> const&, Foam::Field<double>&, unsigned char) const
[6] #5 Foam::lduMatrix::initMatrixInterfaces(Foam::FieldField<Foam::Field, double> const&, Foam::UPtrList<Foam::lduInterfaceField> const&, Foam::Field<double> const&, Foam::Field<double>&, unsigned char) const
[6] #6 Foam::lduMatrix::Tmul(Foam::Field<double>&, Foam::tmp<Foam::Field<double> > const&, Foam::FieldField<Foam::Field, double> const&, Foam::UPtrList<Foam::lduInterfaceField> const&, unsigned char) const
[5] #6 Foam::lduMatrix::Tmul(Foam::Field<double>&, Foam::tmp<Foam::Field<double> > const&, Foam::FieldField<Foam::Field, double> const&, Foam::UPtrList<Foam::lduInterfaceField> const&, unsigned char) const
[6] #7 Foam::PBiCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const
[5] #7 Foam::PBiCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const
[4] #4 Foam::processorFvPatchField<Foam::vector<double> >::initInterfaceMatrixUpdate(Foam::Field<double> const&, Foam::Field<double>&, Foam::lduMatrix const&, Foam::Field<double> const&, unsigned char, bool) const

[6] #8 [5] #8
[4] #5 Foam::lduMatrix::initMatrixInterfaces(Foam::FieldField<Foam::Field, double> const&, Foam::UPtrList<Foam::lduInterfaceField> const&, Foam::Field<double> const&, Foam::Field<double>&, unsigned char) const
[4] #6 Foam::lduMatrix::Tmul(Foam::Field<double>&, Foam::tmp<Foam::Field<double> > const&, Foam::FieldField<Foam::Field, double> const&, Foam::UPtrList<Foam::lduInterfaceField> const&, unsigned char) const Foam::fvMatrix<Foam::vector<double> >::solve(Foam::Istream&) Foam::fvMatrix<Foam::vector<double> >::solve(Foam::Istream&)
[4] #7 Foam::PBiCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const
[6] #9
[5] #9
[4] #8 Foam::lduMatrix::solverPerformance Foam::solve<Foam::vector<double> >(Foam::tmp<Foam::fvMatrix<Foam::vector<double> > > const&) Foam::lduMatrix::solverPerformance Foam::solve<Foam::vector<double> >(Foam::tmp<Foam::fvMatrix<Foam::vector<double> > > const&) Foam::fvMatrix<Foam::vector<double> >::solve(Foam::Istream&)
[4] #9
[6] #10
[5] #10 Foam::lduMatrix::solverPerformance Foam::solve<Foam::vector<double> >(Foam::tmp<Foam::fvMatrix<Foam::vector<double> > > const&) main main
[4] #10 main
[6] #11 __libc_start_main
[5] #11 __libc_start_main
[4] #11 __libc_start_main
[6] #12 __gxx_personality_v0
[4] #12 __gxx_personality_v0
[5] #12 __gxx_personality_v0 at ../sysdeps/x86_64/elf/start.S:116
-----------------------------------------------------------------------------
One of the processes started by mpirun has exited with a nonzero exit
code. This typically indicates that the process finished in error.
If your process did not finish in error, be sure to include a "return
0" or "exit(0)" in your C code before exiting the application.

PID 7314 failed on node n2 (172.28.0.125) due to signal 8.
-----------------------------------------------------------------------------

Post #6 | July 3, 2007, 11:39
Srinath Madhavan (a.k.a pUl|, msrinath80), Senior Member, Edmonton, AB, Canada

How exactly are you running the parallel simulation? Is this stock OF 1.4 or did you build from source?

Post #7 | July 3, 2007, 11:44
abhishek k n (knabhishek), New Member, Gothenburg, Sweden

Hi

Sorry, I did not understand your question: what do you mean by "stock OF 1.4 or built from source"?

Post #8 | July 4, 2007, 02:08
Srinath Madhavan (a.k.a pUl|, msrinath80), Senior Member, Edmonton, AB, Canada

Stock OpenFOAM 1.4 refers to the precompiled version distributed by OpenCFD.

Post #9 | July 4, 2007, 04:10
abhishek k n (knabhishek), New Member, Gothenburg, Sweden

Hi

I used the precompiled version of OpenFOAM 1.4.

Post #10 | July 4, 2007, 07:25
Srinath Madhavan (a.k.a pUl|, msrinath80), Senior Member, Edmonton, AB, Canada

Ahem, OK. What about the environment variable WM_64: did you set it to on? Can you also post the exact sequence of commands you used to start the parallel run? Also, is this SMP hardware or distributed nodes with an interconnect?

Post #11 | July 4, 2007, 08:40
abhishek k n (knabhishek), New Member, Gothenburg, Sweden

Hi

Surprisingly, when I ran it again it worked: the first time it stopped after 35 seconds, but now it has reached 108 seconds without any problem.

One more surprise: WM_64 was not 'on', but it still chose the 64-bit version.

I use lamboot and MPI to run in parallel.

It is a distributed-node setup with an interconnect.
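The command sequence I use is roughly the following (a sketch; the hostfile name, processor count, and solver name are placeholders, not the exact values from my setup):

```shell
# boot the LAM/MPI runtime on the nodes listed in the hostfile
lamboot -v hostfile

# run the decomposed case in parallel; redirecting stdin from
# /dev/null is commonly suggested for OpenFOAM parallel runs
mpirun -np 4 solverName . caseName -parallel < /dev/null

# halt the LAM/MPI runtime when finished
lamhalt
```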

Post #12 | July 4, 2007, 09:18
Srinath Madhavan (a.k.a pUl|, msrinath80), Senior Member, Edmonton, AB, Canada

I faced a similar problem a while ago. Check this[1] post to see the mpirun command syntax that solved the issue.

[1] http://www.cfd-online.com/OpenFOAM_D...tml?1154110876

Post #13 | July 4, 2007, 11:24
abhishek k n (knabhishek), New Member, Gothenburg, Sweden

Hi

Case description:

Mesh size: 1.2 million cells
Time step: 0.01
Under-relaxation: 0.1
Standard k-epsilon model
Upwind schemes for the divergence terms
Non-orthogonal correctors: 10

The mesh was imported from STAR-CCM+, and checkMesh reported some highly skewed, non-orthogonal cells. I still ran the case, with the relevant mpirun changes made as you suggested, but I still get the error below.

Could the mistake be in the mesh, or in the way the mpi command is used?

If it has to do with the mesh, is there any cleaning tool available in OpenFOAM, or should I get the mesh corrected in STAR?

Time = 0.09

DILUPBiCG: Solving for Ux, Initial residual = 0.0629195, Final residual = 4.30642e-11, No Iterations 3
DILUPBiCG: Solving for Uy, Initial residual = 0.0580816, Final residual = 6.80881e-11, No Iterations 3
DILUPBiCG: Solving for Uz, Initial residual = 0.0694269, Final residual = 1.07529e-10, No Iterations 3
DICPCG: Solving for p, Initial residual = 0.597653, Final residual = 9.51117e-09, No Iterations 262
DICPCG: Solving for p, Initial residual = 0.149701, Final residual = 9.49542e-09, No Iterations 250
DICPCG: Solving for p, Initial residual = 0.0316422, Final residual = 9.67963e-09, No Iterations 237
DICPCG: Solving for p, Initial residual = 0.00462459, Final residual = 8.81629e-09, No Iterations 208
DICPCG: Solving for p, Initial residual = 0.000570972, Final residual = 9.57096e-09, No Iterations 184
DICPCG: Solving for p, Initial residual = 8.75491e-05, Final residual = 9.31579e-09, No Iterations 160
DICPCG: Solving for p, Initial residual = 2.12035e-05, Final residual = 9.92397e-09, No Iterations 63
DICPCG: Solving for p, Initial residual = 7.77277e-06, Final residual = 9.97595e-09, No Iterations 29
DICPCG: Solving for p, Initial residual = 4.64155e-06, Final residual = 9.72118e-09, No Iterations 15
DICPCG: Solving for p, Initial residual = 2.84015e-06, Final residual = 9.60676e-09, No Iterations 15
DICPCG: Solving for p, Initial residual = 1.965e-06, Final residual = 9.74522e-09, No Iterations 10
DICPCG: Solving for p, Initial residual = 1.38749e-06, Final residual = 9.83813e-09, No Iterations 9
DICPCG: Solving for p, Initial residual = 1.07198e-06, Final residual = 8.83078e-09, No Iterations 9
DICPCG: Solving for p, Initial residual = 8.83087e-07, Final residual = 9.97312e-09, No Iterations 6
DICPCG: Solving for p, Initial residual = 7.93485e-07, Final residual = 9.5128e-09, No Iterations 8
DICPCG: Solving for p, Initial residual = 7.67152e-07, Final residual = 9.17566e-09, No Iterations 7
time step continuity errors : sum local = 2605.87, global = 172.42, cumulative = 172.42
DILUPBiCG: Solving for epsilon, Initial residual = 0.999997, Final residual = 4.04242e-09, No Iterations 3
bounding epsilon, min: 0.0156719 max: 6.08811e+19 average: 3.20073e+14
DILUPBiCG: Solving for k, Initial residual = 1, Final residual = 1.50261e-09, No Iterations 3
ExecutionTime = 256.63 s ClockTime = 732 s

Time = 0.1

[6] #0 Foam::error::printStack(Foam::Ostream&)
[6] #1 Foam::sigFpe::sigFpeHandler(int)
[6] #2 __restore_rt
[6] #3 void Foam::processorLduInterface::compressedSend<Foam::vector<double> >(Foam::UList<Foam::vector<double> > const&, bool) const
[6] #4 Foam::processorFvPatchField<Foam::vector<double> >::initEvaluate(bool)
[6] #5 Foam::GeometricField<Foam::vector<double>, Foam::fvPatchField, Foam::volMesh>::GeometricBoundaryField::evaluate()
[6] #6 Foam::tmp<Foam::GeometricField<Foam::vector<double>, Foam::fvPatchField, Foam::volMesh> > Foam::fvc::surfaceIntegrate<Foam::vector<double> >(Foam::GeometricField<Foam::vector<double>, Foam::fvPatchField, Foam::surfaceMesh> const&)
[6] #7 Foam::tmp<Foam::GeometricField<Foam::vector<double>, Foam::fvPatchField, Foam::volMesh> > Foam::fvc::surfaceIntegrate<Foam::vector<double> >(Foam::tmp<Foam::GeometricField<Foam::vector<double>, Foam::fvPatchField, Foam::surfaceMesh> > const&)
[6] #8 Foam::fv::gaussDivScheme<Foam::tensor<double> >::fvcDiv(Foam::GeometricField<Foam::tensor<double>, Foam::fvPatchField, Foam::volMesh> const&)
[6] #9 Foam::tmp<Foam::GeometricField<Foam::innerProduct<Foam::vector<double>, Foam::tensor<double> >::type, Foam::fvPatchField, Foam::volMesh> > Foam::fvc::div<Foam::tensor<double> >(Foam::GeometricField<Foam::tensor<double>, Foam::fvPatchField, Foam::volMesh> const&, Foam::word const&)
[6] #10 Foam::tmp<Foam::GeometricField<Foam::innerProduct<Foam::vector<double>, Foam::tensor<double> >::type, Foam::fvPatchField, Foam::volMesh> > Foam::fvc::div<Foam::tensor<double> >(Foam::GeometricField<Foam::tensor<double>, Foam::fvPatchField, Foam::volMesh> const&)
[6] #11 Foam::tmp<Foam::GeometricField<Foam::innerProduct<Foam::vector<double>, Foam::tensor<double> >::type, Foam::fvPatchField, Foam::volMesh> > Foam::fvc::div<Foam::tensor<double> >(Foam::tmp<Foam::GeometricField<Foam::tensor<double>, Foam::fvPatchField, Foam::volMesh> > const&)
[6] #12 Foam::turbulenceModels::kEpsilon::divR(Foam::GeometricField<Foam::vector<double>, Foam::fvPatchField, Foam::volMesh>&) const
[6] #13 main
[6] #14 __libc_start_main
[6] #15 __gxx_personality_v0 at ../sysdeps/x86_64/elf/start.S:116
-----------------------------------------------------------------------------
One of the processes started by mpirun has exited with a nonzero exit
code. This typically indicates that the process finished in error.
If your process did not finish in error, be sure to include a "return
0" or "exit(0)" in your C code before exiting the application.

PID 24663 failed on node n3 (172.28.0.120) due to signal 8.
-----------------------------------------------------------------------------

Post #14 | July 4, 2007, 13:45
Srinath Madhavan (a.k.a pUl|, msrinath80), Senior Member, Edmonton, AB, Canada

Can you post the full checkMesh output here?
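For OF 1.4 the invocation takes the root and case directory, along these lines (caseName here is a placeholder for your case directory):

```shell
# report mesh statistics, including max skewness and non-orthogonality
checkMesh . caseName
```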

Post #15 | July 5, 2007, 04:51
Srinath Madhavan (a.k.a pUl|, msrinath80), Senior Member, Edmonton, AB, Canada

I think that explains it all. Even if you ignored the warnings, you have a serious error to rectify. Despite your observation that there are no problems when running in serial, my suggestion is to work on fixing that error and then try again. In any case, with a mesh that bad (i.e. high non-orthogonality and skewness), I doubt the solvers will do a decent job, or that your final CFD solution will be meaningful. From what I have read in the forums, and from my own experience, OpenFOAM does a very nice job in parallel given a decent interconnect and case setup (which includes the mesh definition/discretization et al.). If all you want to do is check your parallel setup, try one of the example cases (e.g. damBreak). Hope that helps.
