CFD Online Discussion Forums


einandr January 22, 2016 04:08

DPM strange performance decrease with many injections
 
Hello everyone!

I am simulating a ramjet combustion chamber. I use the DPM model with many plain-orifice-atomizer injections, a breakup model, and an explicit time scheme. The problem is "heavy", and I run it on a Linux cluster.

The problem is that the longer the simulation runs, the slower it gets. The first 1000 iterations take 2 hours, the next 1000 maybe 5 hours, the next 1000 a full day, and the next 1000 two days. A lot of droplet parcels appear, but surprisingly they are not the reason! When I cancel the simulation and re-run it from the last data save, I again get 2 "normal" hours for the first 1000 iterations, then 5 hours for the second 1000 iterations, and so on. When this lag gets bigger, I start to get messages like

RPC CX_Flush failed.
: RPC: Timed out

I suppose this is something related to the architecture or the parallel settings. When I run with fewer injections, everything goes well.

Please share any suggestions or ideas! :)

unnikrishnan January 27, 2016 11:25

Parallel processing needs to be balanced in all DPM problems. When you use DPM, the particle population is divided into groups and distributed to the different cores, so the load carried by each core needs to stay balanced throughout the computation. I work in Fluent, but I remember that Star-CCM+ has an option to balance the load in this way. The problem may be due to this.
Whenever you start from the saved case and data, the load is distributed equally across all the processors, but as time progresses some imbalance can develop. I am not so sure, but I have heard of problems arising from this.

Jeeloong January 27, 2016 19:34

Hi,

I faced a similar problem.

Basically, as particle breakup occurs, droplets keep being generated until, I believe, a stable flow field is achieved, i.e. until the particles in the cells that see high aerodynamic forces become stable.

Unless the number of droplets leaving the domain balances the number being created (I'm not sure, I'm still learning), the simulation will get heavier and heavier. The same happened to me.

Try checking the particle count in the cluster run. If the count keeps increasing, the simulation gets slower.
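
One way I check this (the exact command path may differ between Fluent versions, so take it as a rough pointer) is the DPM summary report in the TUI, which prints the current parcel counts and fates:

report dpm-summary

Watching how that count changes between saves should show whether the slowdown follows the parcel population.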

I am trying to keep the particle count low, but that gave me a very different solution in terms of the film thickness profile over time (monitored on one surface using a sum).

Jee

unnikrishnan January 27, 2016 23:42

From the breakup point of view, one processor may become unbalanced. That is, the particles assigned to a particular processor may undergo severe breakup, releasing a large number of new child droplets.

CFDYourself January 28, 2016 10:48

Quote:

Originally Posted by unnikrishnan (Post 582657)
...the load carried by each core needs to stay balanced throughout the computation...

I think the option unnikrishnan may be describing is called "use DPM domain" in Fluent?

einandr January 29, 2016 05:43

Quote:

Originally Posted by Jeeloong (Post 582725)
Hi,

Try checking the particle count in the cluster run. If the count keeps increasing, the simulation gets slower.

Jee

The particle number from each injection stays approximately the same, with only a small increase. My colleagues face the same problem: RAM gradually fills up while the solution progresses, but when the solution is restarted from the last saved data, RAM is empty again and then gradually fills up until it is full and the solution speed becomes very slow.

Quote:

Originally Posted by unnikrishnan (Post 582657)
Parallel processing needs to be balanced in all DPM problems.

Thanks for your reply!
Could you please clarify how I can balance the parallel processing?

Also, I found that when I uncheck the "Coupled Heat-Mass Solution" option, the problem disappears! But sometimes the solution diverges when I don't use this option.

I had also been told that the collision and coalescence computation slows the solution down a lot. I tried without that option, but the temperature and pressure fields diverge near the injection point, in the low-Weber-number regions near the flashback zones.

Maybe this will be helpful for someone else =))

unnikrishnan January 29, 2016 10:12

I am really sorry, I actually don't know the solution for this. I remember Star-CCM+ offering a direct solution to this dilemma.
If I come to know anything about this, I will post a reply here for sure.

wilsonrcf August 28, 2023 19:27

Maybe there's a way to do parallel load balancing in Fluent
 
Hi guys!

I'm doing some DPM calculations and I've been facing similar issues. I found some commands in the Fluent manual that might help us:

define models dpm parallel hybrid-2domain yes
parallel partition set dpm-load-balancing yes 50 100
define models dpm parallel expert partition-method-hybrid-2domain yes

I cannot guarantee that it will work for you, but I think it's worth a try!
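
In case it helps, these are TUI commands, so you can either paste them into the console or collect them in a small journal file (the file name below is just an example) and read it before starting the calculation:

file read-journal dpm_balance.jou

or pass the journal at start-up, e.g. fluent 3ddp -t16 -i dpm_balance.jou (adjust the solver mode and the -t core count to your setup).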

