
Simulations are not converging in OpenFOAM for many elements (more than 1 000 000) 

March 27, 2017, 07:15 

#1 
New Member
Join Date: May 2013
Posts: 21
Rep Power: 6 
Hi OpenFOAM people!
I am simulating blood flow in a 3D artery, in OpenFOAM code, and unfortunatly, my simulations are not converging for many elements (more than 1 000 000 like you can see below in the "Boundary" file). And many elements are necessary for mt work. I know this is a problem element, since for a number of elements much lower (5 or 10 times) the simulation converges. In the "boundary" file, the elements are: Code:
(
    wallinterior { type wall;  nFaces 59205; startFace 1248069; }
    inlet_lca    { type patch; nFaces 2625;  startFace 1307274; }
    outlet_lda   { type patch; nFaces 2130;  startFace 1309899; }
    outlet_lcx   { type patch; nFaces 2214;  startFace 1312029; }
)
The "controlDict" file is: Code:
startFrom           latestTime;
startTime           0;
stopAt              endTime;
endTime             3.0;
deltaT              0.001;
writeControl        timeStep;
writeInterval       10;
purgeWrite          0;
writeFormat         ascii;
writePrecision      6;
writeCompression    off;
timeFormat          general;
timePrecision       6;
runTimeModifiable   true;
The "fvSchemes" file is: Code:
ddtSchemes
{
    default         Euler;
}
gradSchemes
{
    default         Gauss linear;
    grad(p)         leastSquares;
}
divSchemes
{
    default         none;
    div(phi,U)      Gauss linear;
}
laplacianSchemes
{
    default                 none;
    laplacian(nu,U)         Gauss linear corrected;
    laplacian((1|A(U)),p)   Gauss linear corrected;
}
interpolationSchemes
{
    default         linear;
    interpolate(U)  linear;
}
snGradSchemes
{
    default         corrected;
}
fluxRequired
{
    default         no;
    p;
}
The "fvSolution" file is: Code:
solvers
{
    p
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance       1e-06;
        relTol          0;
    }
    U
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance       1e-05;
        relTol          0;
    }
}
PISO
{
    nCorrectors                 4;
    nNonOrthogonalCorrectors    2;
}
I would be very happy and grateful for your answers and suggestions.

Last edited by wyldckat; April 30, 2017 at 12:17. Reason: Added [CODE][/CODE] markers

March 27, 2017, 08:31 

#2 
Member
Timofey Mukha
Join Date: Mar 2012
Location: Uppsala, Sweden
Posts: 82
Rep Power: 7 
What type of simulation is this? Laminar, RANS, LES?
What sticks out immediately is that you use linear interpolation for the convective fluxes: Code:
div(phi,U) Gauss linear;
This is not a bounded scheme, so it may very well lead to problems. Using a multigrid solver for the pressure might also speed up your computations.
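As a sketch only (exact keywords vary between OpenFOAM versions, and the values here are illustrative, not tuned for this case), a bounded convection scheme and a GAMG pressure solver could be set up like this: Code:
// fvSchemes: replace the unbounded linear convection scheme
divSchemes
{
    default         none;
    div(phi,U)      Gauss limitedLinear 1;   // bounded variant of Gauss linear
}

// fvSolution: multigrid for the pressure equation instead of PCG
p
{
    solver                  GAMG;
    smoother                GaussSeidel;
    agglomerator            faceAreaPair;
    nCellsInCoarsestLevel   100;
    mergeLevels             1;
    tolerance               1e-06;
    relTol                  0.01;
}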

March 28, 2017, 09:08 

#3 
New Member
Join Date: May 2013
Posts: 21
Rep Power: 6 
Thank you very much for your answer.
The simulations are laminar and transient. I reduced the time step deltaT to 0.0005 and now the simulations seem to converge, but they take so much time! I think your suggestion of using the multigrid solver GAMG instead of PCG is a good one. I will try it in the following simulations.

March 28, 2017, 09:30 

#4 
New Member
Join Date: May 2013
Posts: 21
Rep Power: 6 
I will also try a bounded scheme. Thank you very much.


March 31, 2017, 07:19 

#5 
New Member
Join Date: May 2013
Posts: 21
Rep Power: 6 
Hi OpenFOAM people!
The simulations are now converging much faster using the multigrid solver GAMG for pressure, and I also used a bounded scheme for div(phi,U). The blood flow simulations, using a simple model for the blood rheology and the conservation equations implemented in OpenFOAM, now run in only one day for more than 1 000 000 elements. The results are also accurate, since they coincide with those obtained with the commercial code ANSYS. Thank you very much, TIAM from Sweden!

However... I also implemented a more complex blood rheology to define the blood's physical properties more accurately. These properties are defined through a constitutive equation, on which the conservation equations depend. So, since OpenFOAM is an open source code, I modified the conservation equations and implemented this constitutive equation (defining the blood rheology more correctly). This code was already validated on a 2D model of the artery (with far fewer elements!).

My question is: I want to simulate the previous case, in 3D with more than 1 000 000 elements, faster. The simulations are too slow, since this code is more complex than the first one. Do you have any suggestions so that the simulations run much faster without the risk of diverging?
The "boundary" file with many elements is: Code:
(
    wallinterior { type wall;  nFaces 59205; startFace 1248069; }
    inlet_lca    { type patch; nFaces 2625;  startFace 1307274; }
    outlet_lda   { type patch; nFaces 2130;  startFace 1309899; }
    outlet_lcx   { type patch; nFaces 2214;  startFace 1312029; }
)
The "controlDict" file that I am using is: Code:
application         elasticshearthinFluidFoam;
startFrom           latestTime;
startTime           0;
stopAt              endTime;
endTime             3.0;
deltaT              0.0005;
writeControl        adjustableRunTime;
writeInterval       0.01;
purgeWrite          0;
writeFormat         ascii;
writePrecision      6;
writeCompression    uncompressed;
timeFormat          general;
timePrecision       6;
graphFormat         raw;
runTimeModifiable   yes;
adjustTimeStep      on;
maxCo               0.8;
maxDeltaT           0.001;
The "fvSchemes" file is: Code:
ddtSchemes
{
    default         Euler;
}
gradSchemes
{
    default         Gauss linear;
    grad(p)         Gauss linear;
    grad(U)         Gauss linear;
}
divSchemes
{
    default             none;
    div(phi,U)          Gauss SFCD;
    div(phi,detAfirst)  Gauss Minmod;
    div(phi,detAsecond) Gauss Minmod;
    div(phi,detAthird)  Gauss Minmod;
    div(phi,detAfourth) Gauss Minmod;
    div(phi,taufirst)   Gauss Minmod;
    div(phi,tausecond)  Gauss Minmod;
    div(phi,tauthird)   Gauss Minmod;
    div(phi,taufourth)  Gauss Minmod;
    div(tau)            Gauss linear;
}
laplacianSchemes
{
    default                     none;
    laplacian(etaPEff,U)        Gauss linear corrected;
    laplacian(etaPEff+etaS,U)   Gauss linear corrected;
    laplacian((1|A(U)),p)       Gauss linear corrected;
}
interpolationSchemes
{
    default             linear;
    interpolate(HbyA)   linear;
}
snGradSchemes
{
    default         corrected;
}
fluxRequired
{
    default         no;
    p;
}
The "fvSolution" file is: Code:
solvers
{
    p
    {
        solver          GAMG;
        preconditioner
        {
            // preconditioner Cholesky;
            preconditioner  FDIC;
            cycle           W-cycle;
            policy          AAMG;
            nPreSweeps      0;
            nPostSweeps     2;
            groupSize       4;
            minCoarseEqns   20;
            nMaxLevels      100;
            scale           off;
            smoother        ILU;
        }
        tolerance       1e-05;
        relTol          0;
        minIter         0;
        maxIter         800;
        agglomerator    faceAreaPair;
        nCellsInCoarsestLevel 100;
        mergeLevels     1;
        smoother        GaussSeidel;
    }
    // U and all the detA*/tau* fields use identical settings:
    U          { solver BICCG; preconditioner { preconditioner DILU; } tolerance 1e-06; relTol 0; minIter 0; maxIter 1000; agglomerator faceAreaPair; nCellsInCoarsestLevel 4; mergeLevels 1; smoother DILUGaussSeidel; }
    detAfirst  { solver BICCG; preconditioner { preconditioner DILU; } tolerance 1e-06; relTol 0; minIter 0; maxIter 1000; agglomerator faceAreaPair; nCellsInCoarsestLevel 4; mergeLevels 1; smoother DILUGaussSeidel; }
    detAsecond { solver BICCG; preconditioner { preconditioner DILU; } tolerance 1e-06; relTol 0; minIter 0; maxIter 1000; agglomerator faceAreaPair; nCellsInCoarsestLevel 4; mergeLevels 1; smoother DILUGaussSeidel; }
    detAthird  { solver BICCG; preconditioner { preconditioner DILU; } tolerance 1e-06; relTol 0; minIter 0; maxIter 1000; agglomerator faceAreaPair; nCellsInCoarsestLevel 4; mergeLevels 1; smoother DILUGaussSeidel; }
    detAfourth { solver BICCG; preconditioner { preconditioner DILU; } tolerance 1e-06; relTol 0; minIter 0; maxIter 1000; agglomerator faceAreaPair; nCellsInCoarsestLevel 4; mergeLevels 1; smoother DILUGaussSeidel; }
    taufirst   { solver BICCG; preconditioner { preconditioner DILU; } tolerance 1e-06; relTol 0; minIter 0; maxIter 1000; agglomerator faceAreaPair; nCellsInCoarsestLevel 4; mergeLevels 1; smoother DILUGaussSeidel; }
    tausecond  { solver BICCG; preconditioner { preconditioner DILU; } tolerance 1e-06; relTol 0; minIter 0; maxIter 1000; agglomerator faceAreaPair; nCellsInCoarsestLevel 4; mergeLevels 1; smoother DILUGaussSeidel; }
    tauthird   { solver BICCG; preconditioner { preconditioner DILU; } tolerance 1e-06; relTol 0; minIter 0; maxIter 1000; agglomerator faceAreaPair; nCellsInCoarsestLevel 4; mergeLevels 1; smoother DILUGaussSeidel; }
    taufourth  { solver BICCG; preconditioner { preconditioner DILU; } tolerance 1e-06; relTol 0; minIter 0; maxIter 1000; agglomerator faceAreaPair; nCellsInCoarsestLevel 4; mergeLevels 1; smoother DILUGaussSeidel; }
}
PISO
{
    nCorrectors                 1;
    nNonOrthogonalCorrectors    2;
    pRefCell                    200;
    pRefValue                   10000;
    // nCorrectors 2; nNonOrthogonalCorrectors 0; pRefCell 0; pRefValue 0;
}
relaxationFactors
{
    p           0.3;
    U           0.5;
    detAfirst   0.3;
    detAsecond  0.3;
    detAthird   0.3;
    detAfourth  0.3;
    taufirst    0.3;
    tausecond   0.3;
    tauthird    0.3;
    taufourth   0.3;
    // p 0.3; U 0.5; tau 0.3;
}
Apparently, the simulations are converging with this setup, but I think I will have to wait a week for the results... Does anyone have a suggestion for modifying any parameter so that it runs faster and still converges? I would be grateful! Thank you very much!

Last edited by silvai; April 3, 2017 at 08:43.

April 1, 2017, 05:16 

#6 
Senior Member
Derek Mitchell
Join Date: Mar 2014
Location: UK, Reading
Posts: 144
Rep Power: 6 
Have you looked at this?
http://www.arek.pajak.info.pl/wpcon...ndtricksOF.pdf (things like changing tolerances, relaxation factors, and non-orthogonal correctors)
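As an illustration of those tips (the values here are only examples, not tuned for this case), the relevant fvSolution entries could look like: Code:
// looser relative tolerance lets the pressure solver exit early each corrector
p
{
    solver          GAMG;
    tolerance       1e-06;
    relTol          0.01;
}

// under-relaxation can stabilise strongly coupled equations
relaxationFactors
{
    p               0.3;
    U               0.5;
}

// in the PISO dict: fewer non-orthogonal correctors on a good-quality mesh
// nNonOrthogonalCorrectors 1;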
__________________
A CHEERING BAND OF FRIENDLY ELVES CARRY THE CONQUERING ADVENTURER OFF INTO THE SUNSET 

April 3, 2017, 08:33 

#7 
New Member
Join Date: May 2013
Posts: 21
Rep Power: 6 
Hi!
I looked at the PDF file you suggested. For div(phi,U), instead of Gauss SFCD I am trying Gauss MUSCL. However, it does not speed up the process; it is still very slow!!! I know there are many equations and a very refined 3D mesh. I also read that loosening the pressure (p) tolerance to 1e-4 with relTol 1e-2 can speed things up, but it is still too slow!!

Does anyone know how I can SPEED UP while keeping CONVERGENCE? (The original files are above.)

I am thinking of changing the "controlDict" file. I would like to fix deltaT (e.g. equal to 0.0005) instead of using adjustableRunTime. With adjustableRunTime, deltaT becomes around 3e-6, which is why I think it is so slow!! Does anyone know whether I am right about this, and how to switch to a fixed deltaT in this case? I will be very grateful!
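If a fixed time step is wanted, one possible controlDict sketch (check the keywords against your solver's controlDict; the writeInterval value is only an example) is: Code:
deltaT          0.0005;
adjustTimeStep  off;        // maxCo / maxDeltaT are then ignored
writeControl    timeStep;   // write every writeInterval time steps
writeInterval   20;
Note, though, that adjustableRunTime was presumably shrinking deltaT to around 3e-6 in order to respect maxCo 0.8; forcing deltaT = 0.0005 may push the Courant number well above that limit, and the run may then diverge rather than just run slowly.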

Tags 
many elements, not convergence, openfoam 