
directMapped + regionCoupling + parallel problems



Old   January 16, 2013, 07:34
directMapped + regionCoupling + parallel problems
  #1
Member
 
Elisabet Mas de les Valls
Join Date: Mar 2009
Location: Barcelona, Spain
Posts: 64
Hi Foamers!

I'm working with an MHD conjugate solver (similar to conjugateHeatFoam in OF-1.6-ext) which has been validated. My present case of interest needs periodic boundary conditions, and I'm trying to use the directMapped b.c. for that.

Everything runs fine in serial mode, but when I decompose the case and run it in parallel I get MPI_Recv errors.

The directMapped b.c. works perfectly in parallel on its own, and so does the conjugate solver (with the regionCoupling b.c.), but the two together do not... Is that asking too much? Do you have any experience with this combination?

Any help is appreciated.

elisabet

Old   January 16, 2013, 08:13
  #2
ngj
Senior Member
 
Niels Gjoel Jacobsen
Join Date: Mar 2009
Location: Copenhagen, Denmark
Posts: 1,900
Hi Elisabet,

Have you tried using the
Code:
preservePatches ( <listOfPatchNames> );
in the decomposeParDict? It keeps both of the cyclic patches on the same processor. It works in 1.6-ext.
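
For reference, a minimal decomposeParDict along those lines could look like the sketch below (two subdomains, simple method; the patch names and numbers are only placeholders, not taken from your case):
Code:
// system/decomposeParDict -- sketch only; adjust counts and patch names to your case
numberOfSubdomains  2;

method              simple;

simpleCoeffs
{
    n       (2 1 1);
    delta   0.001;
}

// keep each listed patch on a single processor
preservePatches     (inlet outlet);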

On the other hand, I do not quite understand how you are using directMapped on cyclic boundaries. Could you elaborate? Are the boundary patches not actually of type "cyclic", but something else?

Kind regards,

Niels

Old   January 16, 2013, 08:27
  #3
Member
 
Elisabet Mas de les Valls
Join Date: Mar 2009
Location: Barcelona, Spain
Posts: 64
Hi Niels,

Thanks for your quick reply.

I already tried the preservePatches option, but with no improvement... And it doesn't matter in which direction I split my mesh.

Regarding the cyclic b.c.: I wanted to avoid that option and instead use the directMapped b.c. for the velocity field (with an average value) at the inlet. This lets me fix a mean value for the pressure and the electric potential at the outlet and inlet b.c., respectively.
However, I do not rule out using cyclic b.c.

Old   January 16, 2013, 08:49
  #4
ngj
Senior Member
 
Niels Gjoel Jacobsen
Join Date: Mar 2009
Location: Copenhagen, Denmark
Posts: 1,900
Okay. What happens if you make sure that the cutting plane for the directMapped BC is on the same processor as the boundary? Would it still crash?

/ Niels

Old   January 16, 2013, 09:20
  #5
Member
 
Elisabet Mas de les Valls
Join Date: Mar 2009
Location: Barcelona, Spain
Posts: 64
Ok, I'm a little bit confused right now.

I re-enabled the line
Code:
preservePatches (inlet outlet);
in the decomposeParDict for region0, and the corresponding line for the solid region. Then I decomposed the domain as usual (decomposePar and decomposePar -region solid, using OF-1.7.1 just for these two commands). I thought this would result in a two-subdomain decomposition where both the inlet and outlet b.c. stay in one of the subdomains, BUT THIS IS NOT THE CASE!! I've checked it with paraFoam.

My channel is 0.6 m long in the x direction, the main flow direction. The offset for the directMappedPatch is (0.5995 0 0). I'm using the simple decomposition method with a distribution of (2 1 1).
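
For context, a sketch of what such a directMapped setup can look like, following the OF-1.7.x tutorial layout (the face counts and velocity values are placeholders, and keyword names may differ slightly in 1.6-ext):
Code:
// constant/polyMesh/boundary -- sketch; nFaces/startFace are placeholders
inlet
{
    type            directMappedPatch;
    nFaces          100;
    startFace       12345;
    sampleMode      nearestCell;
    sampleRegion    region0;
    samplePatch     none;
    offset          (0.5995 0 0);   // sample from just upstream of the outlet
}

// 0/U -- sketch of the mapped inlet with a prescribed mean value
inlet
{
    type            directMapped;
    setAverage      true;
    average         (0.1 0 0);      // placeholder mean velocity
    value           uniform (0.1 0 0);
}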

How can I force the inlet and outlet b.c. onto the same processor?

Thanks!

elisabet

Old   January 16, 2013, 09:27
  #6
ngj
Senior Member
 
Niels Gjoel Jacobsen
Join Date: Mar 2009
Location: Copenhagen, Denmark
Posts: 1,900
Aha, I begin to understand your problem. As I understand preservePatches, it keeps each individual patch on one of the subdomains, but it does not force the entire list of patches onto a common subdomain.

You could, for instance, do a manual decomposition of the domains; the process is hinted at in this thread:

http://www.cfd-online.com/Forums/ope...computing.html

/ Niels

Old   January 16, 2013, 10:30
  #7
Member
 
Elisabet Mas de les Valls
Join Date: Mar 2009
Location: Barcelona, Spain
Posts: 64
Thanks for the hint!

I succeeded in splitting the domain into 2 subdomains while keeping the inlet and outlet patches on the same subdomain. For those who would like some help with it, the steps are (for a channel along the x direction; a sketch of the decomposeParDict for step 4 follows the list):
1. Split the domain into 4 subdomains along x using the simple method in decomposeParDict and run 'decomposePar -cellDist'. This creates a 'cellDecomposition' file in the constant directory.
2. Change the header (object entry) and the name of this file, e.g. to decompDict.
3. Replace all '2' by '1' and all '3' by '0' in the decompDict file (make sure you do not alter the label giving the list's length). You can easily do this in vim with, for instance, :%s/2/1/g
4. Change decomposeParDict to use the manual method and specify the name of your file (decompDict in this example).
5. Run 'decomposePar' again and check the result with paraFoam.
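
For step 4, the relevant part of decomposeParDict could look roughly like this (a sketch; 'decompDict' is the edited file from steps 1-3):
Code:
// system/decomposeParDict -- sketch for step 4 above
numberOfSubdomains  2;

method              manual;

manualCoeffs
{
    dataFile        "decompDict";   // the cell-to-processor list edited in step 3
}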
The bad news is that, once the system had been successfully decomposed, I ran the case in parallel and got the MPI_Recv error again:
Code:
*** An error occurred in MPI_Recv
*** on communicator MPI_COMM_WORLD
*** MPI_ERR_TRUNCATE: message truncated
Any ideas?

elisabet

EDIT: P.S. setFields is also a fantastic tool for preparing the 'decompDict' file for the manual decomposition!

Last edited by elisabet; February 4, 2013 at 10:24.

Old   January 17, 2013, 04:52
  #8
ngj
Senior Member
 
Niels Gjoel Jacobsen
Join Date: Mar 2009
Location: Copenhagen, Denmark
Posts: 1,900
Hi Elisabet,

Really nice description of how to decompose "manually".

The MPI errors are really nasty, but sometimes it helps to put

Code:
export FOAM_ABORT=1
on the command line before running the simulation. But only sometimes.

Secondly, if you change your directMapped into, say, fixedValue, does the simulation run?

All the best,

Niels

Old   January 17, 2013, 13:42
  #9
Member
 
Elisabet Mas de les Valls
Join Date: Mar 2009
Location: Barcelona, Spain
Posts: 64
Hi Niels,

I've tried what you suggested about FOAM_ABORT, but the error persists and no extra info is obtained.

Just to sum up, with the conjugate MHD solver:

- The case works without directMapped b.c. (i.e. fixed value b.c.) both in serial and parallel modes.
- The case works with directMapped b.c. in serial mode.
- The case does NOT work with directMapped b.c. in PARALLEL mode (despite the cutting plane being on the same processor as the directMapped b.c.).

I'm going to build a simple case with the original conjugateHeatFoam and directMapped b.c.; let's see what happens.

Just in case: has anyone done this before? Any suggestions?

Regards,

elisabet

Old   January 18, 2013, 03:47
  #10
ngj
Senior Member
 
Niels Gjoel Jacobsen
Join Date: Mar 2009
Location: Copenhagen, Denmark
Posts: 1,900
Hi Elisabet,

I am sorry, but I cannot be of more help to you; I am out of ideas.

Good luck,

Niels

Old   January 22, 2013, 19:51
  #11
Senior Member
 
Daniel P. Combest
Join Date: Mar 2009
Location: St. Louis, USA
Posts: 621
Elisabet,

I had a similar issue a while back, and my workaround was similar to your manual-decomposition approach, with some differences (a rough shell sketch is given at the end of this post):

1. In your fields and boundary file, change your directMapped patches into cyclic patches.
2. Decompose with your favourite method (I think scotch will work), outputting the cell distribution and making sure to preserve the cyclic patches.
3. Remove your processor* folders, since you are going to decompose again anyway.
4. In your boundary file and fields, change the BCs back to directMapped.
5. Decompose manually with the cell distribution you previously obtained using the cyclic BCs instead of directMapped.

OpenFOAM is good at decomposing cyclic BCs for parallel computation...
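
A rough shell sketch of that workflow (the patch-type edits in steps 1 and 4 are done by hand in the 0/* fields and constant/polyMesh/boundary):
Code:
# step 1: patches temporarily switched to cyclic (edit by hand)
decomposePar -cellDist      # step 2: also writes the cell-to-processor distribution
rm -rf processor*           # step 3: discard this decomposition, keep the distribution
# step 4: switch the patches back to directMapped (edit by hand)
# step 5: set "method manual;" in decomposeParDict, pointing at the saved distribution
decomposePar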

Old   February 4, 2013, 11:02
  #12
Member
 
Elisabet Mas de les Valls
Join Date: Mar 2009
Location: Barcelona, Spain
Posts: 64
Dear all,

I could finally return to this problem.

I've attached a very simple example set up for conjugateHeatFoam, where the case fails to run in parallel but not in serial mode.

How to: (1) run blockMesh for the main and solid regions (afterwards, copy boundary_original over the boundary file in the constant directory), (2) decompose (already set to manual), (3) check the run in serial mode, (4) run in parallel.

If anyone could give me some advice...

elisabet
Attached Files
File Type: zip example.zip (18.2 KB, 41 views)

Old   February 4, 2013, 13:02
  #13
ngj
Senior Member
 
Niels Gjoel Jacobsen
Join Date: Mar 2009
Location: Copenhagen, Denmark
Posts: 1,900
Hi Elisabet,

When I decompose it, I get a complaint about a missing region for the field T. It happens with both commands, either

Code:
decomposePar
or

Code:
decomposePar -region solid
Apparently the T field wants the neighbouring region, so how do I give it that and still get a correct decomposition of the boundary files?

Kind regards,

Niels

Old   February 4, 2013, 17:04
  #14
Member
 
Elisabet Mas de les Valls
Join Date: Mar 2009
Location: Barcelona, Spain
Posts: 64
Hi Niels,

Due to size limitations I have not been able to attach the meshes. Hence, calling the folder I sent you 'mainFolder', you should (a shell sketch follows the list):

1. Create a folder for the solid region (solidFolder, hereafter) outside mainFolder.
2. Copy the solid sub-folders ('mainFolder/0/solid', 'mainFolder/constant/solid' and 'mainFolder/system/solid') into this new one. Note that mainFolder/system/controlDict should also be copied into solidFolder/system.
3. Run blockMesh for mainFolder, and copy boundary_original over the boundary file.
4. Run blockMesh for solidFolder, and copy boundary_original over the boundary file.
5. Copy solidFolder/constant/polyMesh into mainFolder/constant/solid.
6. Run 'decomposePar' and 'decomposePar -region solid': watch out, for this utility you need OF version 1.7 or newer.
7. Run in parallel mode (mpirun -np 2 conjugateHeatFoam -parallel > log &).
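
A shell sketch of those steps, assuming the copied solid sub-folders become solidFolder/0, solidFolder/constant and solidFolder/system (the boundary_original copies are left as comments because their exact paths depend on the attached case):
Code:
mkdir solidFolder                                          # step 1
cp -r mainFolder/0/solid        solidFolder/0              # step 2
cp -r mainFolder/constant/solid solidFolder/constant
cp -r mainFolder/system/solid   solidFolder/system
cp mainFolder/system/controlDict solidFolder/system/

(cd mainFolder  && blockMesh)    # step 3: then copy boundary_original over the boundary file
(cd solidFolder && blockMesh)    # step 4: same for the solid mesh

cp -r solidFolder/constant/polyMesh mainFolder/constant/solid/   # step 5

cd mainFolder                    # steps 6 and 7 (OF-1.7 or newer for -region)
decomposePar
decomposePar -region solid
mpirun -np 2 conjugateHeatFoam -parallel > log &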

Perhaps you are not using the right OF version?

elisabet

Old   February 4, 2013, 17:44
  #15
ngj
Senior Member
 
Niels Gjoel Jacobsen
Join Date: Mar 2009
Location: Copenhagen, Denmark
Posts: 1,900
Oh, I see. I am still running all my simulations in 1.6-ext, as I need a lot of its special features, so unfortunately I will not be able to help you out.

Good luck,

Niels

Old   October 3, 2018, 11:04
  #16
New Member
 
Artem
Join Date: Apr 2014
Posts: 29
Quote:
Originally Posted by elisabet View Post
Dear Elisabet,

Are you still working on the conjugate MHD solver?

Best regards,
Artem
