About empty patch in parallel run
For chtMultiRegionFoam parallel runs, I wrote a small modification to decomposePar that decomposes each region's mesh after splitMeshRegions.
After decomposition, it can happen that all of region i's cells end up on processor0 and none on processor1.
When I use a symmetryPlane for a 2D case this works fine, but an empty patch does not.
I found that the empty patch's updateCoeffs() does not allow a zero cell count:
<< "This mesh contains patches of type empty but is not 1D or 2D\n"
" by virtue of the fact that the number of faces of this\n"
" empty patch is not divisible by the number of cells."
When I modify updateCoeffs() so that it does nothing, a solid region works, but a fluid region still has a problem:
[ff02:29082] *** An error occurred in MPI_Recv
[ff02:29082] *** on communicator MPI_COMM_WORLD
[ff02:29082] *** MPI_ERR_TRUNCATE: message truncated
[ff02:29082] *** MPI_ERRORS_ARE_FATAL (goodbye)
It occurs at the momentum solve in UEqn.H of chtMultiRegionFoam:

fEqnResidual = solve
(
    UEqn() == -fvc::grad(pf[i])
);
So I don't understand why the empty patch fails when region i has zero cells on processor1, while a symmetryPlane works fine.
Can anybody help me with this problem? Thanks.
1) I've pushed a fix for the division by zero (cells) to 1.6.x.
2) If you are running chtMultiRegionFoam, the 1.6 version allows independent decomposition of all regions, so you're highly unlikely to get zero cells. Look at the Allrun script in the tutorial.
I've run the multiRegionHeater tutorial of OpenFOAM version 1.6, and chtMultiRegionFoam does run well in parallel. That's great, thank you very much Mattijs.