
Problem with splitMeshRegions in parallel

#1 | April 1, 2022, 18:46 | Alan w (boffin5), Senior Member
Hello there,

I have a serial case for chtMultiRegion that seems to run okay, and I would now like to rewrite it as a parallel case. I have already done this in another version that runs snappyHexMesh twice to create the two regions, but I would rather keep the serial case's logic using topoSet, since it is closer to what already worked and therefore more trustworthy.
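
For context, a topoSet-based serial sequence of the kind described usually looks something like this (a generic sketch only, not taken from the attached scripts; the topoSetDict contents and zone names are assumptions):

Code:
blockMesh
surfaceFeatures
snappyHexMesh -overwrite
topoSet                                 # build a cellSet from system/topoSetDict, convert it to a cellZone
splitMeshRegions -cellZones -overwrite  # one mesh region per cellZone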

So far, the result of what I have created is shown in the attached ParaView screenshot. It shows a fluid cellZone under fluid, but under solid, where I would expect to see a solid cellZone, it again shows a fluid one.

I think the problem lies in the way I have implemented splitMeshRegions in the attached script for the parallel run. That script is a work in progress, still unfinished. For reference, I have also attached the successful script from my serial case.

I am hoping a sharp eye can spot the problem in my parallel script. Thanks in advance!
Attached Images
File Type: gif paraview.gif (20.8 KB, 20 views)
Attached Files
File Type: txt parallelrunscript.txt (816 Bytes, 13 views)
File Type: txt serialscript.txt (566 Bytes, 8 views)

#2 | April 1, 2022, 19:04 | Alan w (boffin5), Senior Member
run output errors with parallel splitMeshRegions
I forgot to include the errors shown when running; here they are:
Code:
Create time

Create mesh solid for time = 0

Creating single patch per inter-region interface.

Trying to match regions to existing cell zones.


Number of regions:1

Writing region per cell file (for manual decomposition) to "constant/solid/cellToRegion"

Writing region per cell as volScalarField to "0/solid/cellToRegion"

Region    Cells
------    -----
0    4452

Region    Zone    Name
------    ----    ----
0    0    fluid

Sizes of interfaces between regions:

Interface    Region    Region    Faces
---------    ------    ------    -----

--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 3 SPLIT FROM 0
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0] 
[0] 
[0] --> FOAM FATAL IO ERROR: 
[0] size 13558 is not equal to the given value of 2227
[0] 
[0] file: /home/boffin5/cfdaero/radiator-parallel-alt/processor0/0/solid/cellLevel from line 18 to line 13589.
[0] 
[0]     From function Foam::Field<Type>::Field(const Foam::word&, const Foam::dictionary&, Foam::label) [with Type = double; Foam::label = int]
[0]     in file /home/ubuntu/OpenFOAM/OpenFOAM-8/src/OpenFOAM/lnInclude/Field.C at line 210.
[0] 
FOAM parallel run exiting
[0] 
Reading geometric fields

Reading volScalarField htcConst
Reading volScalarField AoV
Reading volScalarField p_rgh
Reading volScalarField cellToRegion
Reading volScalarField p
Reading volScalarField T
Reading volScalarField cellLevel
[boffin5-VirtualBox:10773] PMIX ERROR: UNREACHABLE in file ../../../src/server/pmix_server.c at line 2193
[1] 
[1] 
[1] --> FOAM FATAL IO ERROR: 
[1] size 13558 is not equal to the given value of 2225
[1] 
[1] file: /home/boffin5/cfdaero/radiator-parallel-alt/processor1/0/solid/cellLevel from line 18 to line 13589.
[1] 
[1]     From function Foam::Field<Type>::Field(const Foam::word&, const Foam::dictionary&, Foam::label) [with Type = double; Foam::label = int]
[1]     in file /home/ubuntu/OpenFOAM/OpenFOAM-8/src/OpenFOAM/lnInclude/Field.C at line 210.
[1] 
FOAM parallel run exiting
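
For what it's worth, the numbers in the error can be read from the log itself: the cellLevel field in processor0/0/solid has 13558 entries (the size of the full snappyHexMesh mesh), while the solid region on that processor has only 2227 cells (2227 + 2225 = 4452, the region size reported above). cellLevel and pointLevel are leftover refinement-level fields written by snappyHexMesh, and a commonly suggested workaround (an assumption on my part, not something confirmed in this thread) is to delete them before running splitMeshRegions so it does not try to map them onto the region meshes:

Code:
# remove snappyHexMesh's leftover refinement fields before splitting
rm -f 0/cellLevel 0/pointLevel
rm -f processor*/0/cellLevel processor*/0/pointLevel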

#3 | April 2, 2022, 03:07 | Kumaresh (Kummi), Senior Member
Hello,
I don't know whether my answer will be helpful here.
I have used multi-region models in both serial and parallel, with 2 solid regions. My serial and parallel scripts are almost the same for topoSet and splitMeshRegions, except for the mpirun command in the parallel one. The major difference between my case and yours is the mesh: I use a uniform mesh on a rectangular domain.

So the problem might be with your snappyHexMesh setup? As you mentioned, you ran the parallel case in another version, so why is it not working in the current one? You didn't mention the versions either; that might help others understand the problem.

Another idea: try a simple mesh and execute the same commands you mentioned. If that works, then the mesh is the key here.

Regarding the ERROR:
The solid and fluid domains are not in their respective cellZones, and that is why the cell-size error appears (13558 not equal to 2227). So placing the cellZones appropriately is what matters.
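
One quick way to check where the cells actually ended up (a sketch, assuming the standard case layout; the cellZones file is plain text and lists each zone with its cell labels):

Code:
# zones in the serial mesh
cat constant/polyMesh/cellZones
# zones seen by processor 0 after decomposition
cat processor0/constant/polyMesh/cellZones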
Thank you

#4 | April 4, 2022, 09:19 | Yann, Senior Member
Hi Alan,

Here is what is in your serialscript file:

Code:
blockMesh
surfaceFeatures
snappyHexMesh -overwrite
splitMeshRegions -cellZones -overwrite  # results in domain with radiator carved out
Does this work and give you the expected result?
If the answer is yes, then this would be the parallel version:

Code:
blockMesh
surfaceFeatures
decomposePar
mpirun -np 2 snappyHexMesh -overwrite -parallel
mpirun -np 2 splitMeshRegions -cellZones -overwrite -parallel  # results in domain with radiator carved out
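For completeness, the decomposePar step assumes a system/decomposeParDict along these lines (a minimal sketch for the two-rank run above; the decomposition method is just an example):

Code:
FoamFile
{
    version 2.0;
    format  ascii;
    class   dictionary;
    object  decomposeParDict;
}

numberOfSubdomains 2;   // matches mpirun -np 2

method scotch;          // needs no extra coefficients; simple or hierarchical also work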
I hope this helps,
Yann

#5 | November 9, 2022, 13:54 | Giles Richardson (ufocfd), Member
Do you not have to run reconstructParMesh after the parallel splitMeshRegions command?

#6 | November 10, 2022, 03:25 | Yann, Senior Member
You don't need to reconstruct the mesh to run the case as long as you run it in parallel.

If you want to reconstruct the case, make sure to use the -allRegions option with reconstructParMesh in order to reconstruct the regions created by splitMeshRegions.
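
In practice that would look roughly like this (a sketch; the flags are from OpenFOAM 8, which the log earlier in the thread points to, and may differ in other versions):

Code:
# rebuild the per-region meshes created by splitMeshRegions
reconstructParMesh -allRegions -constant
# rebuild the solution fields for every region
reconstructPar -allRegions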

I hope this helps,
Yann

#7 | November 10, 2022, 16:51 | Giles Richardson (ufocfd), Member
Hi Yann, thanks for your reply. In this case there should only be one region, since it's using -largestOnly, but I also used -withZero, because otherwise I think it gets written into the constant directory (not sure). Thanks.
