Coupled solver, computational cost
Hi everybody,
I'm simulating a wind turbine, starting with the steady laminar case and a pressure-based coupled algorithm, and I have a few questions about the setup. I've looked at the theory guide but I don't understand how the Courant number (the solver one, not the regular one) affects convergence. The only thing the theory guide says is 1/CFL = (1-alpha)/alpha. I increased CFL from the default 200 to 200,000, but then reduced it back to 200 as it seemed to increase computational cost. Residuals went up, so I had to halve the under-relaxation factors (URF) for momentum and pressure (0.75 to 0.375 each). Now that residuals are starting to go down, I'd like to speed things up, as it's far too slow. I know increasing the URF will help, but what about CFL? For a 1.7 GB mesh on 8 cores in parallel with 12 GB RAM, each iteration takes 45 min. Also, I will have to run a transient laminar case and steady/transient k-omega SST cases; which solver setup should I use? SIMPLEC? PISO? Thanks in advance! |
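On the theory-guide relation quoted above, a quick sketch (plain Python, not Fluent input) shows the implicit relaxation factor alpha implied by a given solver Courant number; `effective_urf` is a hypothetical helper name, and 1/CFL = (1-alpha)/alpha rearranges to alpha = CFL/(1+CFL):

```python
def effective_urf(cfl: float) -> float:
    """Implicit relaxation factor implied by a solver Courant number,
    from the theory-guide relation 1/CFL = (1 - alpha) / alpha."""
    return cfl / (1.0 + cfl)

# Even the default CFL of 200 already implies alpha very close to 1,
# which is why huge CFL values buy little extra and can destabilise residuals.
for cfl in (10, 200, 200_000):
    print(f"CFL = {cfl:>7} -> alpha = {effective_urf(cfl):.6f}")
```

This is only the algebra from the theory guide; it says nothing about stability of any particular case.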
I've increased the URF back to 0.75 and each iteration obviously takes less time to compute, but it's still slow (30 min/iteration). Can I increase the URF above the default values? And what about the Courant number: how does it affect the simulation?
There's also reverse flow at the pressure outlet, and it's increasing. Any suggestions? Thanks! |
30 minutes for one iteration sounds like the simulation is running out of core memory (swapping to disk).
Check the RAM usage while running the case. |
Quote:
For the transient case, PISO and coupled are good options as they can accommodate larger time steps... and I would prefer coupled. I still haven't checked the results, accuracy and speed of these two schemes... |
Quote:
I have 16 GB RAM. My mesh size is: Total elements: 8,250,475; Total nodes: 8,118,741 |
Quote:
As suggested by Far, I'm going to coarsen my mesh a bit more. 8 million cells seems too much. |
Double precision is usually only necessary if there are cells with a really high aspect ratio in the boundary layer, in order to achieve y+ < 1.
In most cases, the other errors in the simulation are several orders of magnitude higher than the round-off error of single precision. The catch with double precision is that it doubles memory usage. |
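To put a rough number on the memory-doubling point, here is a back-of-envelope sketch; the 100-values-per-cell figure is purely an assumption for illustration, since the real storage per cell depends on the solver and the models enabled:

```python
def mesh_memory_gb(n_cells: int, values_per_cell: int = 100,
                   double: bool = True) -> float:
    """Very rough memory estimate: n_cells floating-point values at
    4 bytes (single) or 8 bytes (double) each, converted to GiB."""
    bytes_per_value = 8 if double else 4
    return n_cells * values_per_cell * bytes_per_value / 1024**3

n = 8_250_475  # element count quoted earlier in the thread
print(f"single precision: {mesh_memory_gb(n, double=False):.1f} GB")
print(f"double precision: {mesh_memory_gb(n, double=True):.1f} GB")
```

Whatever the true per-cell count, switching to double precision doubles the floating-point storage, which is the point being made above.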
The Courant number is proportional to the time step and velocity and inversely proportional to grid size at local cell conditions. How sensitive the solution is to the Courant number depends on whether the solution method is explicit or implicit.
With an explicit method, there are obvious limitations on the time step, and hence on the Courant number. With an implicit method, increasing the Courant number typically helps accelerate convergence. However, you should be careful here, since the Courant number should be ramped gradually over, say, 50-100 iterations. This interval should be smaller, and the increments in Courant number smaller, in the initial phase of the solution, where transient instabilities have a strong influence. A sudden increase in Courant number may not achieve the steady convergence goal. OJ |
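The gradual ramping suggested above could be sketched as a simple geometric schedule; `cfl_ramp` is a hypothetical helper for planning the values to dial in by hand, not a Fluent command:

```python
def cfl_ramp(start: float, target: float, steps: int) -> list[float]:
    """Geometric ramp of the Courant number from start to target,
    so each increase is the same multiplicative factor."""
    ratio = (target / start) ** (1.0 / (steps - 1))
    return [start * ratio**i for i in range(steps)]

# Example: ramp from a cautious CFL of 10 up to the default 200 in 6 stages.
schedule = cfl_ramp(10.0, 200.0, steps=6)
print([round(c, 1) for c in schedule])
```

A geometric (rather than linear) ramp keeps the early increments small, which matches the advice above about being gentler during the initial transient phase.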
Quote:
OJ: this Courant number is different. It is for the pressure-based coupled solver only; the recommended value is 200,000. Try SIMPLE, it will cost you less, along with single precision. |
I don't want to sound pessimistic, but for 8 million nodes I'm not surprised. You need to take that case to a cluster; no matter what you change, it will always take too much time...
Is it the mesh around the blade? |
2 Attachment(s)
Quote:
I'm working on reducing the mesh density, but what maximum cell size can I use in the far field? Thanks! |
Have you tried the block refinement tab? See if you could bring the blocks in the far field to 1/2 the size of the others...
|
I have the same machine as you, 16 GB of RAM, and I don't go beyond 1,500,000 nodes; I know my machine can't handle more. How many minutes per iteration could you save after setting an adequate CFL?
|
Quote:
I don't know what an adequate CFL is; I've just started with the default value of 200. I guess the higher, the faster, so the better for me. Thanks a lot! |
5 Attachment(s)
I've reduced the mesh from 8 to 2 million elements without changing the y+. Quality > 0.2, minimum angle > 18º and volume change < 50 (for most elements, volume change < 30).
Then I ran the steady laminar case with single precision, SIMPLE, and 2nd-order pressure and momentum. Inlet velocity is 5 m/s, rotational velocity is 1.034 rad/s. It reached residuals of 1e-4 after 2,700 iterations; computation time was 1h30 on 8 cores, which seems okay to me. Reverse flow at the outlet started in 15,000 cells, then dropped to 0 and rose again to 6,000 by the time convergence was reached.
The issue is that the force and moment coefficients for the blade surface are very low: CD = 0.04 (x-axis), CL = 0.48 (z-axis) and CM = 0.0032 (around the z-axis). I've checked the flow pattern around the airfoil and it looks good; I think there's some recirculation on the leeward face, which is strange for an angle of attack of 12º (see pictures). What can be wrong? A bad mesh near the airfoil? A bad mesh in the far field? The wrong solver setup? Is it possible for the airfoil to stall that early?
Another question: could anybody tell me how to measure lift or drag force over a line (the airfoil perimeter) of the blade surface? Any suggestion is welcome! Thank you guys! |
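One sanity check on whether such coefficients are plausible is the usual normalisation C = F / (0.5 rho V^2 A_ref). A minimal sketch follows, assuming sea-level air density and a purely hypothetical reference area of 1 m^2; for a rotating blade, the appropriate reference velocity and area depend on the convention chosen, and the solver's reported coefficients depend on the reference values set in the case:

```python
rho = 1.225    # air density [kg/m^3], assumption (sea level)
v_inf = 5.0    # inlet velocity [m/s], from the post above
a_ref = 1.0    # reference area [m^2], hypothetical placeholder

def force_coefficient(force_n: float) -> float:
    """Non-dimensional force coefficient C = F / (0.5 * rho * V^2 * A_ref)."""
    q = 0.5 * rho * v_inf**2   # freestream dynamic pressure [Pa]
    return force_n / (q * a_ref)

# Example: the dimensional force implied by the reported CD of 0.04.
q = 0.5 * rho * v_inf**2
print(f"drag force for CD=0.04: {0.04 * q * a_ref:.3f} N")
```

If the reported coefficients look suspiciously low, a common culprit is a mismatch between the reference area/velocity/density used for normalisation and the actual case, rather than the flow solution itself.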
Quote:
Have a closer look at the region where backflow occurs and consider moving the outlet further downstream. |