
[OF 5.0] patches externalCoupled in parallel

June 19, 2019, 10:41   #1
[OF 5.0] patches externalCoupled in parallel
Gerry Kan, Senior Member (Join Date: May 2016, Posts: 347)
Howdy people:

I have been seeing something pretty strange with the externalCoupled boundary condition lately, and I am not sure why.

In this case, the externalCoupled BC code is slightly modified so that it does not check the lock file, as the data are already present and the solver need not wait for them. Returning the result from OpenFOAM back to the external solver is unimportant at the moment. Here is the sequence of commands I used to generate the mesh and link the boundaries to the external data:

Code:
 blockMesh
 createExternalCoupledPatchGeometry T
 decomposePar
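For reference, the serial and parallel runs were then launched roughly like this (just a sketch; mySolver is a placeholder for whatever application is set in controlDict):

Code:
 # serial run (mySolver is a placeholder solver name)
 mySolver

 # parallel run, with 16 subdomains set in system/decomposeParDict
 mpirun -np 16 mySolver -parallel
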
I tested this with both a single-core run and a parallel run with 16 cores. In both cases the external data files (as well as the patchFaces and patchPoints files) were read in correctly. However, I am a bit perplexed as to how they were mapped in the end. While the mapping was performed as expected when I ran this in serial, the values seem to be all over the place in parallel. I have attached the figures here for reference:

1: Single core (normal)


2: Parallel with 16 cores (abnormal)


I am not sure what I have done incorrectly; perhaps someone has some ideas on how I could solve this problem.

Thank you very much in advance,

Gerry.

P.S. - I realized after the fact that this should have been posted in the "user" section. My apologies.

Last edited by Gerry Kan; June 26, 2019 at 10:09.

June 20, 2019, 11:57   #2
Minimally reproducible example
Gerry Kan, Senior Member (Join Date: May 2016, Posts: 347)
Folks:

I tinkered with this issue today and managed to demonstrate it with a small, reproducible example (at least on my cluster) based on the externalCoupledCavity tutorial.

I replaced the external solver (externalSolver) to introduce spatial and temporal variations on the hot and cold surfaces. Everything else remains unchanged. Again, for the serial run (1 core) the temperature was mapped correctly, as expected. The problem appears again when the case is run in parallel, where the patches are all mixed up.

The curious can pick up the test case from this link: 20190620-externalCoupledCavity.tar.gz?dl=0

To reproduce: the Allrun, Allrun-parallel, and Allclean scripts automate the whole process. The corresponding solver output can be viewed in ParaView after running reconstructPar (for the parallel cases) and foamToVTK. (For those who already know how to do this, sorry for the reiteration.)
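For the parallel cases, the post-processing step is roughly the following (a sketch; both utilities are invoked here without options, adjust as needed):

Code:
 # rebuild the decomposed fields into the top-level time directories
 reconstructPar

 # export the reconstructed fields to VTK for inspection in ParaView
 foamToVTK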

The following three figures show the spatial temperature distribution on the "hot" surface after iteration 1 (also note the difference in the T distribution between the two parallel runs):

1) Serial run (normal)


2) Parallel run with 4 cores


3) Parallel run with 16 cores


The range of the temperature suggests that the mix-up is localized within the hot boundary. This is known because I configured the boundary temperatures such that the maximum T of the cold wall is lower than the minimum T of the hot wall.

At this point I believe this is a bug in OpenFOAM, unless, of course, the externalCoupled boundary condition is meant to be run only in serial, though I doubt that was the intent.

However, while browsing through the externalCoupled-related issues, I did not see any pertaining to parallelization for either OF 5 or OF 6, so I assume this has not been resolved or addressed.

Thanks again, Gerry.

Last edited by Gerry Kan; June 21, 2019 at 04:46.

June 27, 2019, 02:58   #3
Problem solved
Gerry Kan, Senior Member (Join Date: May 2016, Posts: 347)
Folks:

Problem solved!


The externalCoupledCavity tutorial and documentation seem to imply that you always run createExternalCoupledPatchGeometry first, and that decomposePar then picks up the coupled patches.

However, you need to run createExternalCoupledPatchGeometry in parallel after decomposePar, on the same number of processors as the solver. The external boundary data are then mapped correctly.
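In Allrun-parallel terms, the working order looks roughly like this (a sketch using the standard tutorial RunFunctions; depending on your OpenFOAM version, runParallel may also expect an explicit processor count, and the external data provider still has to be started as in the original tutorial):

Code:
 #!/bin/sh
 cd ${0%/*} || exit 1                        # run from this directory
 . $WM_PROJECT_DIR/bin/tools/RunFunctions    # source the tutorial run functions

 runApplication blockMesh
 runApplication decomposePar

 # generate the coupled patch geometry *after* decomposePar,
 # in parallel, on the same number of processors as the solver
 runParallel createExternalCoupledPatchGeometry T

 runParallel $(getApplication)               # run the case solver in parallel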

Here is a modified version of the externalCoupledCavity tutorial to reflect these changes. Note that I have rewritten externalSolver in Python 3 to allow for both temporal and spatial boundary-temperature variations.

Hope that helps, Gerry.


Tags
externalcoupled, parallel, patches

