Linear solver choice for inner iterations of implicit CFD solvers

December 22, 2023, 06:46   #1
aerosayan (Sayan Bhattacharjee), Senior Member, Join Date: Mar 2020, Posts: 495
Hello everyone,

I'm trying to implement this implicit solver: https://doi.org/10.1016/j.jcp.2018.04.005

They use an approximate Jacobian to linearize the system, then relax it using 10 inner iterations of a Gauss-Seidel solver.

Essentially, it is a Newton's-method-based solver: in the inner iterations, the solver takes small steps following the direction prescribed by the Jacobian.

Since the linear system does not need to be solved exactly in one step, a simple Gauss-Seidel solver seems to be fast enough for the inner iterations.

In the future I will use PETSc, but for now I'm trying to use Eigen 3.

However, Eigen 3 does not have a sparse Gauss-Seidel solver, so I'm not sure which solver I should use for the inner iterations.

Eigen has Sparse LU, Sparse QR, Conjugate Gradient, Least Squares Conjugate Gradient, BiCGSTAB, and a few other sparse solvers: https://eigen.tuxfamily.org/dox/grou...seSystems.html

Can we use any of these solvers instead of Gauss-Seidel for the inner iterations?

Thanks.

December 22, 2023, 09:14   #2
aerosayan (Sayan Bhattacharjee), Senior Member, Join Date: Mar 2020, Posts: 495
Currently I'm thinking of just using BiCGSTAB or GMRES to "partially" solve it.

One of Venkatakrishnan and Mavriplis's papers from 1993 recommends that, since the linear system is not accurate to begin with (when away from the linearization point), we can solve it to a lower degree of accuracy and still get correct nonlinear convergence.

Eigen 3 seems to provide a method to specify the required tolerance of the linear solvers. So, using a tolerance of something like 1E-3 to 1E-6, I think I may be able to get a converged solution.
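
For reference, a rough sketch of what I mean in Eigen (the 1e-3 tolerance, the 10-iteration cap, and the function name are placeholders I picked, not anything from the paper):

Code:
#include <Eigen/Sparse>

// Inexact ("partial") solve of A*dx = rhs for one outer/nonlinear step.
// setTolerance() and setMaxIterations() are standard Eigen calls; the loose
// 1e-3 tolerance and the 10-iteration cap are only example values.
Eigen::VectorXd inexactSolve(const Eigen::SparseMatrix<double>& A,
                             const Eigen::VectorXd& rhs)
{
    Eigen::BiCGSTAB<Eigen::SparseMatrix<double>> solver;  // diagonal preconditioner by default
    solver.setTolerance(1e-3);      // relative residual target, not machine precision
    solver.setMaxIterations(10);    // stop early even if the tolerance is not reached
    solver.compute(A);
    return solver.solve(rhs);       // check solver.info() / solver.error() afterwards
}

After the solve, solver.error() reports the relative residual that was actually reached, which is handy for checking how "partial" the solve really was.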

Does that seem correct?

Thanks
Attached Images: res.png

December 22, 2023, 14:34   #3
andy_ (andy), Senior Member, Join Date: May 2009, Posts: 270
Gauss-Seidel is only a few lines long if implemented by hand. It might even be a single line in some languages(?) that can write vector expressions, though what an optimizer does with that may need some thought/checking. This may be why it is not in your solver library.
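
For what it's worth, a minimal hand-written forward sweep over an Eigen row-major sparse matrix could look roughly like this (the function name and storage choice are just for illustration, and it assumes every diagonal entry is present and nonzero):

Code:
#include <Eigen/Sparse>

// One forward Gauss-Seidel sweep: updates x in place, row by row.
// Assumes A is stored row-major so InnerIterator walks a single matrix row.
void gaussSeidelSweep(const Eigen::SparseMatrix<double, Eigen::RowMajor>& A,
                      const Eigen::VectorXd& b,
                      Eigen::VectorXd& x)
{
    for (int i = 0; i < A.outerSize(); ++i)
    {
        double diag = 0.0;
        double sum  = b(i);
        for (Eigen::SparseMatrix<double, Eigen::RowMajor>::InnerIterator it(A, i); it; ++it)
        {
            if (it.col() == i)
                diag = it.value();                 // diagonal entry of row i
            else
                sum -= it.value() * x(it.col());   // uses already-updated entries of x
        }
        x(i) = sum / diag;
    }
}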

I'm curious: what guided you to Eigen 3 for a CFD code? Not knocking the choice, just wondering what you saw as the pros, given the significant con of not running in parallel.

You can use any stable solver for the inner iterations. Reducing the residuals by a modest amount rather than fully converging works well in some circumstances but badly, sometimes not at all, in others. A bit of experimentation with typical problems should produce a reasonable starting set of parameters, which may need tightening for trickier problems.

December 23, 2023, 01:34   #4
aerosayan (Sayan Bhattacharjee), Senior Member, Join Date: Mar 2020, Posts: 495
Thanks Andy,

Eigen3 is good enough for experimental codes, and it's the one I know, so I don't want to set up more complicated solvers like PETSc right now.

I know how to implement Gauss-Seidel, but I don't want to implement it for Eigen if I don't have to.

Eigen3 can run some solvers like BiCGSTAB in parallel using OpenMP, so it's good enough for testing.
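
Roughly what I have in mind for the OpenMP side, as a sketch (the thread count and the identity stand-in matrix are arbitrary; the real Jacobian would be assembled in its place):

Code:
// Build with OpenMP enabled, e.g.:  g++ -O3 -fopenmp solver_test.cpp
#include <Eigen/Sparse>
#include <iostream>

int main()
{
    Eigen::setNbThreads(8);   // number of threads Eigen is allowed to use

    // Row-major storage matters: Eigen multithreads the row-major
    // sparse-matrix * vector product, which dominates BiCGSTAB's cost.
    using SpMat = Eigen::SparseMatrix<double, Eigen::RowMajor>;

    const int n = 100000;
    SpMat A(n, n);
    A.setIdentity();          // placeholder system; assemble the real Jacobian here
    Eigen::VectorXd b = Eigen::VectorXd::Ones(n);

    Eigen::BiCGSTAB<SpMat> solver;
    solver.compute(A);
    Eigen::VectorXd x = solver.solve(b);
    std::cout << "iterations: " << solver.iterations()
              << ", error: "    << solver.error() << "\n";
    return 0;
}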

And I can provide the OpenMP-parallelized solver to some alpha/beta users for a trial period without much risk of it being pirated (at least for a while).

The final MPI-based version of the code will be kept closed, and will only be provided to academic and industrial users to whom we want to show the full capabilities/scaling of the solver.

For example: a student who only has an 8-core laptop has no need for an MPI-based solver that can scale to 1000 CPU cores.

Also, since the OpenMP-based solver is not all that powerful, I can share it for educational use under a much more relaxed license, so students can benefit from the solver without having to worry too much about licensing complexity.

The MPI-based solver would obviously be only for academic and industrial use, and would be controlled by commercial licensing.

January 5, 2024, 12:49   #5
aerosayan (Sayan Bhattacharjee), Senior Member, Join Date: Mar 2020, Posts: 495
Answering my own question, since Eigen is getting more popular and maybe this will help someone in the future ...

Use DIRECT solvers *as much as possible*, for development, debugging, and testing.

Use ITERATIVE solvers with PRE-CONDITIONERS for *large* simulations, AFTER your code has been developed, debugged, and validated.

Best Direct Solver in Eigen: SparseLU ( https://eigen.tuxfamily.org/dox/clas...1SparseLU.html)
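
A minimal usage sketch of the factor-once / solve-many pattern I use while the Jacobian is frozen (error handling kept to a minimum; assumes the matrix is in compressed form, e.g. built with setFromTriplets):

Code:
#include <Eigen/Sparse>
#include <stdexcept>

// Direct solve with SparseLU: factor once, then reuse the factorization for
// repeated right-hand sides while the Jacobian is frozen.
Eigen::VectorXd directSolve(const Eigen::SparseMatrix<double>& A,
                            const Eigen::VectorXd& b)
{
    Eigen::SparseLU<Eigen::SparseMatrix<double>> lu;
    lu.analyzePattern(A);   // symbolic step: reusable if the sparsity pattern is fixed
    lu.factorize(A);        // numerical factorization
    if (lu.info() != Eigen::Success)
        throw std::runtime_error("SparseLU factorization failed");
    return lu.solve(b);
}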

Somewhat okay iterative solver in Eigen: GMRES, with an Incomplete LUT pre-conditioner ( https://eigen.tuxfamily.org/dox/unsu..._1_1GMRES.html and https://eigen.tuxfamily.org/dox/clas...mpleteLUT.html )
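
And roughly how I set that up (the drop tolerance, fill factor, and iteration limit below are just my starting points to tune, not values recommended by Eigen):

Code:
#include <Eigen/Sparse>
#include <unsupported/Eigen/IterativeSolvers>   // GMRES lives in the unsupported module

// GMRES preconditioned with IncompleteLUT.
Eigen::VectorXd gmresSolve(const Eigen::SparseMatrix<double>& A,
                           const Eigen::VectorXd& b)
{
    Eigen::GMRES<Eigen::SparseMatrix<double>, Eigen::IncompleteLUT<double>> solver;
    solver.preconditioner().setDroptol(1e-4);    // drop small entries in the ILU factors
    solver.preconditioner().setFillfactor(20);   // extra fill allowed per row
    solver.setTolerance(1e-6);
    solver.setMaxIterations(200);
    // solver.set_restart(30);                   // restart length can also be tuned
    solver.compute(A);
    return solver.solve(b);
}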

In books and papers, we're often told that direct solvers are bad, slow, and require more memory, while iterative solvers are extremely good.

That is correct, but we often forget that direct solvers are extremely robust, while iterative solvers are sometimes unstable and require careful pre-conditioning to be both stable and accurate.

In Eigen, BiCGSTAB is the standard iterative solver, but for the simulation I'm trying to run, the solver doesn't converge.

However, the GMRES solver with an Incomplete LUT pre-conditioner has proven more stable and accurate for my simulation cases.

So, if iterative solvers are so good, why do we need direct solvers?

For developing and debugging our code.

When I tried to use only iterative solvers in my code, I wasn't sure what was causing the errors. As seen above, the choice of iterative solver and pre-conditioner is crucial. So, when I was using only BiCGSTAB, the code was diverging, and I didn't know what to debug.

When I used a direct solver like SparseLU, the inner iterations were more stable, and I could focus on debugging the other sections of my code: the matrix assembly procedure, the Jacobian calculation procedure, the diagonal dominance procedure, etc.

However, it's important to know that direct solvers require more memory, so for the initial development of your code it's best to test on small and medium-sized meshes.

Thanks.
