|
April 1, 2022, 18:46 |
Problem with splitMeshRegions in parallel
|
#1 |
Senior Member
Alan w
Join Date: Feb 2021
Posts: 260
Rep Power: 6 |
Hello there,
Having a serial case for chtMultiRegion that seems to run okay, I now would like to re-write it as a parallel case. Actually, I have already done it in another version that runs snappyHexMesh twice to create 2 regions, but I would rather use the serial case logic using topoSet, as it's more similar to what worked before, and therefore trustworthy. So far, what I have created results in the attached image of a paraView screen. It shows a fluid cellZone under fluid, and while I would expect to see a solid cellZone under solid, once again it shows one for fluid. I think the problem lies in the way I have implemented splitMeshRegions in the attached script for a parallel run. This script is a work in progress, still unfinished. For reference, also attached is a successful script for my serial case. Hoping for some sharp eye to see the problem in my parallel script. Thanks in advance! |
|
April 1, 2022, 19:04 |
run output errors with parallel splitMeshRegions
|
#2 |
Senior Member
Alan w
Join Date: Feb 2021
Posts: 260
Rep Power: 6 |
I forgot to include the errors shown upon running; here they are:
Code:
Create time
Create mesh solid for time = 0

Creating single patch per inter-region interface.
Trying to match regions to existing cell zones.
Number of regions:1

Writing region per cell file (for manual decomposition) to "constant/solid/cellToRegion"
Writing region per cell as volScalarField to "0/solid/cellToRegion"

Region  Cells
------  -----
0       4452

Region  Zone  Name
------  ----  ----
0       0     fluid

Sizes of interfaces between regions:

Interface  Region  Region  Faces
---------  ------  ------  -----
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 3 SPLIT FROM 0
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0]
[0]
[0] --> FOAM FATAL IO ERROR:
[0] size 13558 is not equal to the given value of 2227
[0]
[0] file: /home/boffin5/cfdaero/radiator-parallel-alt/processor0/0/solid/cellLevel from line 18 to line 13589.
[0]
[0]     From function Foam::Field<Type>::Field(const Foam::word&, const Foam::dictionary&, Foam::label) [with Type = double; Foam::label = int]
[0]     in file /home/ubuntu/OpenFOAM/OpenFOAM-8/src/OpenFOAM/lnInclude/Field.C at line 210.
[0]
FOAM parallel run exiting
[0]
Reading geometric fields

Reading volScalarField htcConst
Reading volScalarField AoV
Reading volScalarField p_rgh
Reading volScalarField cellToRegion
Reading volScalarField p
Reading volScalarField T
Reading volScalarField cellLevel

[boffin5-VirtualBox:10773] PMIX ERROR: UNREACHABLE in file ../../../src/server/pmix_server.c at line 2193
[1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] size 13558 is not equal to the given value of 2225
[1]
[1] file: /home/boffin5/cfdaero/radiator-parallel-alt/processor1/0/solid/cellLevel from line 18 to line 13589.
[1]
[1]     From function Foam::Field<Type>::Field(const Foam::word&, const Foam::dictionary&, Foam::label) [with Type = double; Foam::label = int]
[1]     in file /home/ubuntu/OpenFOAM/OpenFOAM-8/src/OpenFOAM/lnInclude/Field.C at line 210.
[1]
FOAM parallel run exiting |
|
April 2, 2022, 03:07 |
|
#3 |
Senior Member
|
Hello,
I don't know whether my answer will be helpful here. I have used multi-region models in both serial and parallel, with two solid regions. My serial and parallel scripts are almost the same for topoSet and splitMeshRegions; the only difference is the mpirun command for the parallel run (see the sketch below). The major difference between my case and yours is the mesh: I use a uniform mesh on a rectangular domain, so the problem might lie with your snappyHexMesh setup.

You mentioned that you already got a parallel case running in another OpenFOAM version, so why is it not working in the current one? You didn't state the versions either; that would help others understand the problem. Another idea: try a simple mesh and execute the same commands. If that works, then the mesh is the key here.

Regarding the error: the solid and fluid domains are not in their respective cell zones, and that is why the size error appears (13558 is not equal to 2227). So placing the cell zones correctly is what matters.
Thank you |
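To illustrate what I mean by "almost the same", here is a rough sketch (the 2-processor count and exact options are assumptions; adapt them to your case):
Code:
# serial
topoSet
splitMeshRegions -cellZones -overwrite

# parallel: the same tools, just run on the decomposed case through mpirun
decomposePar
mpirun -np 2 topoSet -parallel
mpirun -np 2 splitMeshRegions -cellZones -overwrite -parallel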
|
April 4, 2022, 09:19 |
|
#4 |
Senior Member
Yann
Join Date: Apr 2012
Location: France
Posts: 1,066
Rep Power: 26 |
Hi Alan,
Here is what is in your serialscript file:
Code:
blockMesh
surfaceFeatures
snappyHexMesh -overwrite
splitMeshRegions -cellZones -overwrite   # results in domain with radiator carved out
If that is indeed the workflow you want to run in parallel, then this would be the parallel version:
Code:
blockMesh
surfaceFeatures
decomposePar
mpirun -np 2 snappyHexMesh -overwrite -parallel
mpirun -np 2 splitMeshRegions -cellZones -overwrite -parallel   # results in domain with radiator carved out
Yann |
|
November 9, 2022, 13:54 |
|
#5 |
Member
Giles Richardson
Join Date: Jun 2012
Location: Cambs UK
Posts: 98
Rep Power: 13 |
do you not have to run reconstructParMesh after the parallel splitMeshRegions command?
|
|
November 10, 2022, 03:25 |
|
#6 |
Senior Member
Yann
Join Date: Apr 2012
Location: France
Posts: 1,066
Rep Power: 26 |
You don't need to reconstruct the mesh to run the case as long as you run it in parallel.
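For example, once the mesh has been split in parallel, the solver can be run directly on the decomposed case; a sketch (assuming 2 processors and chtMultiRegionFoam as the solver):
Code:
mpirun -np 2 chtMultiRegionFoam -parallel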
If you want to reconstruct the case, make sure to use the -allRegions option with reconstructParMesh in order to reconstruct the regions created by splitMeshRegions (rough command sketch below).
I hope this helps,
Yann |
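The reconstruction step would then look roughly like this (only a sketch, assuming the split meshes were written to constant with -overwrite; check which options your OpenFOAM version provides):
Code:
reconstructParMesh -allRegions -constant   # rebuild the per-region meshes
reconstructPar -allRegions                 # rebuild the solution fields for every region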
|
November 10, 2022, 16:51 |
|
#7 |
Member
Giles Richardson
Join Date: Jun 2012
Location: Cambs UK
Posts: 98
Rep Power: 13 |
Hi Yann, thanks for your reply. In this case there should only be one region, since it's using -largestOnly, but I also used -withZero because otherwise I think it gets written into the constant directory (not sure). Thanks. |
|