3D OpenFOAM simulations running too slowly (more than 1,000,000 elements): suggestions to speed up?

March 31, 2017, 08:39 | #1
silvai (New Member; joined May 2013; 21 posts)
Hi OpenFOAM people!

I am simulating blood flow in 3D arteries with OpenFOAM.

My previous simulations converged much faster using the multigrid solver GAMG for pressure, and I also used a bounded scheme for div(phi,U) to avoid divergence. With a simple model for the blood rheology and the standard conservation equations implemented in OpenFOAM, the 3D blood flow simulations run fast (only 1 day!) for more than 1,000,000 elements.
The results are also accurate, since they coincide with those obtained with the commercial code ANSYS.

However, I have also implemented a more complex blood rheology to describe the physical properties of blood more accurately. These properties are defined through a constitutive equation, on which the conservation equations depend. Since OpenFOAM is open source, I modified the conservation equations and implemented this constitutive equation (defining the blood rheology more correctly). The code was already validated on a 2D artery (with far fewer elements!).

So my question is:

I want to simulate the previous case in 3D, with more than 1,000,000 elements, faster. The simulations are far too slow (perhaps more than 1 week to converge!), since this code is more complex than the first one.

Do you have any suggestions (a method or parameter to modify) so that these simulations run much faster for this case, without the risk of divergence?

*************************************************
"boundary" file:

(
    wall-interior
    {
        type        wall;
        nFaces      59205;
        startFace   1248069;
    }

    inlet_lca
    {
        type        patch;
        nFaces      2625;
        startFace   1307274;
    }

    outlet_lda
    {
        type        patch;
        nFaces      2130;
        startFace   1309899;
    }

    outlet_lcx
    {
        type        patch;
        nFaces      2214;
        startFace   1312029;
    }
)

**************************************************
The "controlDict" file than I am using is:

application         elasticshearthinFluidFoam;
startFrom           latestTime;
startTime           0;
stopAt              endTime;
endTime             3.0;
deltaT              0.0005;
writeControl        adjustableRunTime;
writeInterval       0.01;
purgeWrite          0;
writeFormat         ascii;
writePrecision      6;
writeCompression    uncompressed;
timeFormat          general;
timePrecision       6;
graphFormat         raw;
runTimeModifiable   yes;
adjustTimeStep      on;
maxCo               0.8;
maxDeltaT           0.001;
**************************************************
The "fvSchemes" file is:

ddtSchemes
{
    default         Euler;
}

gradSchemes
{
    default         Gauss linear;
    grad(p)         Gauss linear;
    grad(U)         Gauss linear;
}

divSchemes
{
    default              none;
    div(phi,U)           Gauss SFCD;
    div(phi,detAfirst)   Gauss Minmod;
    div(phi,detAsecond)  Gauss Minmod;
    div(phi,detAthird)   Gauss Minmod;
    div(phi,detAfourth)  Gauss Minmod;
    div(phi,taufirst)    Gauss Minmod;
    div(phi,tausecond)   Gauss Minmod;
    div(phi,tauthird)    Gauss Minmod;
    div(phi,taufourth)   Gauss Minmod;
    div(tau)             Gauss linear;
}

laplacianSchemes
{
    default                    none;
    laplacian(etaPEff,U)       Gauss linear corrected;
    laplacian(etaPEff+etaS,U)  Gauss linear corrected;
    laplacian((1|A(U)),p)      Gauss linear corrected;
}

interpolationSchemes
{
    default            linear;
    interpolate(HbyA)  linear;
}

snGradSchemes
{
    default         corrected;
}

fluxRequired
{
    default         no;
    p;
}
*************************************************
The "fvSolution" file is:

solvers
{
    p
    {
        solver          GAMG;
        preconditioner
        {
            // preconditioner  Cholesky;
            preconditioner  FDIC;
            cycle           W-cycle;
            policy          AAMG;
            nPreSweeps      0;
            nPostSweeps     2;
            groupSize       4;
            minCoarseEqns   20;
            nMaxLevels      100;
            scale           off;
            smoother        ILU;
        }
        tolerance       1e-05;
        relTol          0;
        minIter         0;
        maxIter         800;
        agglomerator    faceAreaPair;
        nCellsInCoarsestLevel 100;
        mergeLevels     1;
        smoother        GaussSeidel;
    }

    U
    {
        solver          BICCG;
        preconditioner
        {
            preconditioner  DILU;
        }
        tolerance       1e-6;
        relTol          0;
        minIter         0;
        maxIter         1000;
        agglomerator    faceAreaPair;
        nCellsInCoarsestLevel 4;
        mergeLevels     1;
        smoother        DILUGaussSeidel;
    }

    // The remaining transported fields (detA*, tau*) all use settings
    // identical to U; macro expansion avoids repeating the block eight times.
    detAfirst   { $U; }
    detAsecond  { $U; }
    detAthird   { $U; }
    detAfourth  { $U; }
    taufirst    { $U; }
    tausecond   { $U; }
    tauthird    { $U; }
    taufourth   { $U; }
}

PISO
{
    nCorrectors     1;
    nNonOrthogonalCorrectors 2;
    pRefCell        200;
    pRefValue       10000;

    // nCorrectors              2;
    // nNonOrthogonalCorrectors 0;
    // pRefCell                 0;
    // pRefValue                0;
}

relaxationFactors
{
    p           0.3;
    U           0.5;
    detAfirst   0.3;
    detAsecond  0.3;
    detAthird   0.3;
    detAfourth  0.3;
    taufirst    0.3;
    tausecond   0.3;
    tauthird    0.3;
    taufourth   0.3;

    // p    0.3;
    // U    0.5;
    // tau  0.3;
}
**************************************************

Apparently the simulations are converging with this setup, but I think I will be waiting a week for the results... Can anyone suggest a parameter to modify so that it runs faster while still converging?

I will be grateful! Thank you very much!

March 31, 2017, 18:12 | #2
Taataa (Taher Chegini; Senior Member; Houston, Texas; joined Nov 2014; 125 posts)
Here are my suggestions.

In controlDict:

writeFormat     binary;
writePrecision  8;
maxDeltaT       1;
In fvSchemes you can try MUSCL; sometimes it gives me better results, faster.
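For reference, a minimal sketch of where this goes in fvSchemes, swapping the existing SFCD entry (both are bounded schemes; which one converges faster is case-dependent):

divSchemes
{
    div(phi,U)      Gauss MUSCL;    // was Gauss SFCD
}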

In fvSolution, for all of the pressure-related variables:

p
{
    solver          PBiCGStab;
    preconditioner  DIC;
    tolerance       1e-8;
    relTol          1e-2;
}

pFinal
{
    $p;
    relTol          0;
}

and for everything else:

else
{
    solver          PBiCGStab;
    preconditioner  DILU;
    tolerance       1e-8;
    relTol          1e-2;
}

elseFinal
{
    $else;
    relTol          0;
}
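Note that "else" above is shorthand rather than literal OpenFOAM syntax. Since fvSolution accepts regular-expression keywords, one way to cover "everything else" for the fields in this case (assuming a solver version that provides PBiCGStab; older foam-extend builds would keep PBiCG/BICCG instead) is:

solvers
{
    // matches U and all the detA*/tau* fields from the original file
    "(U|detA.*|tau.*)"
    {
        solver          PBiCGStab;
        preconditioner  DILU;
        tolerance       1e-8;
        relTol          1e-2;
    }

    // tight tolerance on the final corrector of each time step
    "(U|detA.*|tau.*)Final"
    {
        solver          PBiCGStab;
        preconditioner  DILU;
        tolerance       1e-8;
        relTol          0;
    }
}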

For relaxation:

relaxationFactors
{
    equations
    {
        ".*"    1;
    }
}

April 3, 2017, 09:26 | #3
silvai (New Member; joined May 2013; 21 posts)
Hi!

Thank you very much, Taataa!

For div(phi,U), instead of Gauss SFCD I am trying Gauss MUSCL. However, it does not speed up the process; it is still very slow!

I know that there are many equations and a very refined 3D mesh.

I also read that relaxing the pressure (p) tolerance to 1e-4 with relTol 1e-2 can speed things up.
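As a sketch, that change touches only two entries of the existing p block in fvSolution, keeping the rest of the GAMG settings as they are:

p
{
    // ... existing GAMG settings unchanged ...
    tolerance       1e-4;   // loosened from 1e-05
    relTol          1e-2;   // was 0
}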

But it is still too slow!

Does anyone know how I can SPEED UP while keeping CONVERGENCE?
(The original files are above.)

I am thinking of changing the controlDict file. I would like to fix deltaT (e.g. at 0.0005) instead of letting the adjustable time step control it. With the adjustable time step, deltaT ends up around 3e-6, which is why I think it becomes so slow!
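For reference, fixing the step is a small controlDict change (a sketch). The caveat: if the Courant-based controller is choosing about 3e-6 s, then forcing 0.0005 s implies a maximum Courant number of order 100, far above maxCo 0.8, which is very likely to diverge with the Euler scheme in use:

adjustTimeStep  off;                // disable Courant-based adjustment
deltaT          0.0005;             // fixed time step
writeControl    runTime;            // adjustableRunTime is only needed
writeInterval   0.01;               // when deltaT varies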

Does anyone know, for this type of simulation (original files described above), whether I am right about this and how to do it?

I will be very grateful!

April 6, 2017, 03:52 | #4
chriss85 (Senior Member; joined Oct 2013; 397 posts)
Have you checked the mesh quality? Any problems?
If you have some very small cells, they can slow down the whole simulation, because they can produce a large Courant number and therefore force a very small time step.
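For intuition, with hypothetical numbers (not values from this case; checkMesh reports the actual smallest cells):

// Co = |U| * deltaT / dx, evaluated cell by cell; adjustTimeStep shrinks
// deltaT until the largest cell Courant number stays below maxCo.
// Hypothetical example: peak velocity |U| = 0.5 m/s and a smallest cell
// size dx = 2e-6 m give
//     deltaT = maxCo * dx / |U| = 0.8 * 2e-6 / 0.5 = 3.2e-6 s
// so a single tiny cell can pin the time step for the whole mesh,
// consistent with the ~3e-6 s step reported earlier in this thread.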

April 6, 2017, 12:44 | #5
silvai (New Member; joined May 2013; 21 posts)
Thanks for your answer, chriss85!

The mesh is very refined (around 1,000,000 elements). The elements are very small, but the mesh is accurate; I have checked it. I am trying to reach convergence for this extreme case (a lot of elements). When I decrease the maximum Courant number (maxCo), deltaT becomes very small and the simulation is far too slow! :/
For a higher maxCo the simulations are faster, but at some point during the transient pulse flow the simulation suddenly diverges!

However, I know that the adjustable time step must be used in these simulations, not a fixed deltaT.

I think the way to solve the problem is to reduce the number of elements in the mesh (while checking the mesh quality and making sure it still gives appropriate results). Perhaps halving the number of elements will work and converge.

I am trying this.

Thank you very much for your response.
