CFD Online Discussion Forums


Zonexo May 31, 2008 05:11

Help to solve 2 eqns simultaneously in MPI

I obtain 2 linear eqns (u and v velocity) from the momentum eqn in my CFD code. Instead of solving eqn 1 in parallel and then eqn 2 in parallel, I am thinking of solving the 2 eqns at the same time, using half the number of processors for each eqn. In other words, with 4 processors, I would use 2 processors for eqn 1 and 2 processors for eqn 2. Someone told me to use the MPI_Group features, but after looking on the net, I still can't really understand how to do it.

Any experts in MPI around here? Or if someone can point me to some examples, that'll be great. BTW, I'm using the parallel solver PETSc.

Thanks a lot!

Jed May 31, 2008 09:01

Re: Help to solve 2 eqns simultaneously in MPI
You can create separate PETSc objects on separate communicators, but I doubt this is something you really want to do. In particular, it only possibly makes sense if you are absolutely sure that there will never be coupling between the components, even indirectly. My recommendation would be to solve them simultaneously as a coupled system (i.e. in the same Krylov iteration), but use a split preconditioner (PCFIELDSPLIT will make this easy, but you can have more manual control with PCSHELL). A normal preconditioner may work very well too. This is far more flexible and should scale well. Also, if you move to nonlinear problems, you will generally get much faster convergence with Newton-Krylov iterations on the whole coupled system. In general, anything "special" can and should go in the preconditioner.
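The split-preconditioner route Jed describes is often driven entirely by runtime options rather than code changes. A minimal sketch (a config fragment, assuming the u and v unknowns have been registered as fields 0 and 1 with the fieldsplit preconditioner, e.g. via an index set per field) might look like:

```
# Solve the coupled u/v system with one Krylov iteration,
# preconditioned block-by-block (additive = block Jacobi over fields):
-ksp_type gmres
-pc_type fieldsplit
-pc_fieldsplit_type additive
# Each block gets its own inner preconditioner:
-fieldsplit_0_pc_type ilu
-fieldsplit_1_pc_type ilu
```

The names of the inner options (`-fieldsplit_0_...`) depend on how the fields are named in your code; the point is that the "special" structure lives in the preconditioner while the Krylov method sees the full coupled system.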

Zonexo June 1, 2008 20:21

Re: Help to solve 2 eqns simultaneously in MPI
Thanks Jed.

They're obtained from a semi-implicit scheme, so u and v are definitely not coupled together.

Anyway, I'm still not able to find an MPI sample code which suits my problem. Can anyone help?

