Slow running of pimpleDyMFoam with respect to pimpleFoam.

June 14, 2015, 07:12   #1
Gareth Jones (gj11)
New Member | Join Date: Jun 2015 | Location: London/Singapore | Posts: 2
Dear All,

I have been having some problems running a simulation on a moving grid. I want to see the effect of a moving aerofoil, so I first run a simulation without motion using pimpleFoam and then stop and restart it with pimpleDyMFoam.
Both simulations are identical in terms of mesh and numerical schemes (except, of course, that the static case has no dynamicMeshDict or pointDisplacement etc.), and the velocity (DILUPBiCG) and pressure (GAMG) solvers converge in a similar number of iterations per time step in both cases. Despite these similarities, pimpleDyMFoam takes 82 seconds per time step while pimpleFoam takes 1 second. Furthermore, the cell displacement solver (GAMG) takes only 1-2 iterations to converge, so I don't understand where the huge difference in time is coming from.
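For reference, the mesh motion is solved with a displacement-based motion solver; the sketch below shows the kind of dictionaries involved (the solver choice, diffusivity model, and patch name here are illustrative rather than my exact settings):

Code:
// constant/dynamicMeshDict -- illustrative sketch, not the exact case file
dynamicFvMesh      dynamicMotionSolverFvMesh;
motionSolverLibs   ("libfvMotionSolvers.so");
solver             displacementLaplacian;          // assumed motion solver
displacementLaplacianCoeffs
{
    diffusivity    inverseDistance (aerofoil);     // hypothetical patch name
}

// system/fvSolution -- the cellDisplacement solver mentioned above
cellDisplacement
{
    solver                  GAMG;
    smoother                GaussSeidel;
    tolerance               1e-08;
    relTol                  0;
    cacheAgglomeration      true;
    nCellsInCoarsestLevel   10;
    agglomerator            faceAreaPair;
    mergeLevels             1;
}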

I was wondering if anybody else has experienced something similar, or has any insight into the solvers that could account for this poor performance?

Many Thanks,
Gareth

p.s. The simulation is run in parallel on 300 CPUs with roughly 30,000 cells per CPU, and the difference between execution time and clock time is not significant, which I think suggests inter-processor communication is not the problem.
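What I mean is that the two timers in the solver log stay close together, i.e. lines of the form below (the numbers are only illustrative):

Code:
ExecutionTime = 82.1 s  ClockTime = 84 s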

June 15, 2015, 14:58   #2
Gareth Jones (gj11)
Furthermore, results from the static-grid simulation compare well with experiments I have conducted, so I am confident the simulation is set up well in general; I must have missed a small numerical detail somewhere when adding the mesh motion...

June 18, 2015, 11:59   #3
Bruno Blais (blais.bruno)
Member | Join Date: Sep 2013 | Location: Canada | Posts: 64
A few questions which might help narrow down the problem:

1 - Do you have frequent I/O? Frequent writes are dramatic for AMI cases.
2 - Have you checked whether the slowdown is the same with a lower number of processors (say 0.1-0.2 of what you are using now)?
3 - Is your AMI interface spread out over many processors? Did you try alternative decomposition strategies?

In my experience the slowdown is usually something like 20%, but that was with 12 cores on a single node of a cluster. From what I have seen, AMI-style moving meshes exhibit very poor parallel scaling, though others may have had a different experience.
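For points 1 and 3, a rough sketch of the kind of settings I mean is below; the patch names and values are placeholders, not taken from your case:

Code:
// system/controlDict -- write less often so I/O does not dominate
writeControl        timeStep;
writeInterval       200;

// system/decomposeParDict -- try to keep the AMI patch pair on as few processors as possible
numberOfSubdomains  64;
method              scotch;
preservePatches     (AMI1 AMI2);   // placeholder patch names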

Cheers,
Bruno

