Running OpenFOAM in parallel

November 7, 2012, 16:33   #21
Bruno Santos (wyldckat), Super Moderator
Join Date: Mar 2009 | Location: Lisbon, Portugal | Posts: 8,301
Greetings Arun,

Quote:
Originally Posted by arunsmec
My workstation is equipped with 2 quadcore processors with multithreading (2 threads/core). Does decomposition into more than 8 subdomains make any difference?
I've had a test case with the cavity tutorial, expanded from 2D to 3D, which did not scale well on an AMD 1055T processor with 6 cores, but with 16 sub-domains it was still faster than with 6. Nonetheless, it was possibly a fluke, in the sense that the over-scheduling led to a slightly better usage of memory and cache.
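For anyone wanting to repeat that kind of single-machine scaling test, here is a minimal sketch of the commands involved (the case name cavity3D and the solver icoFoam are assumptions based on the cavity tutorial; -np must match numberOfSubdomains in system/decomposeParDict):

cd cavity3D                      # hypothetical 3D copy of the cavity tutorial
blockMesh                        # build the mesh
decomposePar                     # split it according to system/decomposeParDict
mpirun -np 16 icoFoam -parallel > log.np16 2>&1
reconstructPar                   # merge the processor* results when the run finishes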

I've been keeping a blog post with notes on this subject: Notes about running OpenFOAM in parallel
In it you should find a note pointing to post #20; from that post you should be able to learn a bit more about this subject of using a single machine with multiple parallel processes!

Best regards,
Bruno

July 29, 2015, 02:07   #22
Strange behaviour
cStef, New Member
Join Date: Mar 2015 | Posts: 4
Hi all,

I set up a case in OpenFOAM and tried to run it in parallel. The simulation works fine with two cores. Then we tried to increase the number of cores and the following error occurred:
run/VAWT$ [6] #0 Foam::error::printStack(Foam::Ostream&) at ??:?
[6] #4 Foam::PBiCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const at ??:?
[6] #5 Foam::fvMatrix<double>::solveSegregated(Foam::dictionary const&) at ??:?
[6] #6 Foam::fvMatrix<double>::solve(Foam::dictionary const&) at ??:?
[6] #7 Foam::SolverPerformance<double> Foam::solve<double>(Foam::tmp<Foam::fvMatrix<double> > const&) at ??:?
[6] #8 Foam::incompressible::RASModels::kOmegaSST::correct() at ??:?
[6] #9 ? at ??:?
[6] #10 __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
[6] #11 ? at ??:?
[openfoamtest:58577] *** Process received signal ***
[openfoamtest:58577] Signal: Floating point exception (8)
[openfoamtest:58577] Signal code: (-6)
[openfoamtest:58577] Failing at address: 0x3e80000e4d1


There were no errors while decomposing the mesh (neither for 2 nor for 16 cores), and the error also occurs randomly (sometimes at time step 1.82, sometimes later). It is also possible to reconstruct the case after the error occurs and restart it (see the command sketch just after the dictionary). Here is my decomposeParDict:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

numberOfSubdomains 16;

method scotch;
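
For completeness, the reconstruct-and-restart sequence mentioned above looks roughly like this (a sketch only: pimpleFoam stands in for whichever solver the case actually uses, and -np must match numberOfSubdomains):

reconstructPar -latestTime       # rebuild the last written time step from the processor* directories
decomposePar -force              # re-decompose, overwriting the old processor* directories
mpirun -np 16 pimpleFoam -parallel > log.restart 2>&1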

Can someone please tell me why this case works on two cores but not on 16? And why is it possible to restart the case?
