June 29, 2011, 15:46
Multi-region MPI parallel issue
Join Date: Mar 2009
Location: Trieste, Italy
Posts: 105
I have a fairly specific question:
has anyone run into problems when running multi-region parallel simulations using MPI?
I am working on a heavily extended version of a conjugate heat transfer solver.
I added some additional scalar fields, and for each field I created a new boundary condition.
Initial state: a box of fluid with a simple rectangular solid inside, everything at the same initial temperature.
Process: the walls of the box are cooled.
I decompose both the fluid and the solid regions across 8 processors. Unfortunately I end up in a situation where the "coupling" boundaries on the fluid and solid sides sit on different processors. (The same problem appears when decomposing to 2 processors.)
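For reference, the decomposition setup looks roughly like the sketch below; the region and patch names ("fluid", "fluid_to_solid") are placeholders for whatever the actual case uses. One cheap way to test whether cross-processor coupling is the culprit is decomposePar's `preservePatches` constraint, which tries to keep all faces of a patch on one processor (keyword support depends on the OpenFOAM version and decomposition method):

```
// system/fluid/decomposeParDict -- a minimal sketch; an equivalent dict
// exists for the solid region. Region and patch names are assumptions.
numberOfSubdomains  8;
method              scotch;

// Debugging aid: force the coupled patch onto a single processor to see
// whether the MPI failure goes away. Availability of this keyword varies
// by OpenFOAM version and method.
// preservePatches     (fluid_to_solid);
```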
I run the case. For several hundred iterations there is no problem; then suddenly MPI reports that an MPI_Waitall call failed.
If I restart the simulation from the nearest saved time step, the error occurs at exactly the same iteration.
The error occurs during the construction of an fvScalarMatrix: the boundaries have been updated, but the matrix is never initialized and the solve never runs.
What could the problem be in my case?
Is it because the coupled patches are not on the same processor? (I thought OpenFOAM handled that well.)
Or could it be something else?
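One pattern worth ruling out: an MPI_Waitall failure at a reproducible iteration often points to a communication schedule that diverges between ranks, e.g. a custom boundary condition that enters a collective operation only under a locally evaluated condition. A sketch of the anti-pattern (not compilable stand-alone; the class name `myCoupledFvPatchScalarField` and the `threshold_` member are hypothetical, while `gMax` and `mixedFvPatchScalarField::updateCoeffs` are real OpenFOAM calls):

```cpp
// Sketch only: illustrates rank-dependent parallel communication
// inside a custom boundary condition's updateCoeffs().
void myCoupledFvPatchScalarField::updateCoeffs()
{
    if (updated()) return;

    // BAD: gMax() is a collective reduction over all processors.
    // Guarding it with a test that can differ between ranks (here a
    // purely local patch maximum) makes some ranks skip the reduction,
    // leaving the others blocked in MPI_Waitall.
    if (max(*this) > threshold_)     // local, rank-dependent test
    {
        scalar m = gMax(*this);      // collective -> potential deadlock
        // ... use m ...
    }

    // SAFE: every rank participates in the collective unconditionally,
    // then all ranks branch on the identical result.
    scalar m = gMax(*this);
    if (m > threshold_)
    {
        // ... use m ...
    }

    mixedFvPatchScalarField::updateCoeffs();
}
```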
Thank you for any suggestions.
With best regards,