[ImmersedBoundary] Immersed Boundary Method: Error Occurs in parallelization of icoIbFoam

#1 | Di Wu (wudi) | October 29, 2016, 08:52
Hi Foamers:

As you know, the immersed boundary method has been implemented in OpenFOAM-extend; I am currently using foam-extend-4.0.

I tried to run the cylinderInChannelFineIcoIbFoam case from the tutorials folder, and the simulation runs fine in serial. I then tested it in parallel, following the description in the Allrun file with these commands:

1. blockMesh
2. cp save/boundary constant/polyMesh/
3. mkdir 0
4. cp 0_org/* 0/
5. decomposePar (I modified the decomposeParDict file in the system folder to specify 4 processors; see the sketch after this list)
6. mpirun -np 4 potentialIbFoam -parallel
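
For completeness, this is roughly what my decomposeParDict looked like. The decomposition method and its coefficients are my own choices for this case, not something the tutorial prescribes:

numberOfSubdomains 4;

method          simple;

simpleCoeffs
{
    n           (2 2 1);    // 2 x 2 x 1 split into 4 subdomains
    delta       0.001;
}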
Step 6 then failed with the following errors:

********************************************************************************
Create time

Create mesh for time = 0


SIMPLE: no convergence criteria found. Calculations will run for 50 steps.

Create immersed boundary cell mask
Create immersed boundary face mask
Found immersed boundary patch 0 named ibCylinder
[3] Number of IB cells: 0
External flow
[0] Number of IB cells: 72
[1] Number of IB cells: 0
[2] Number of IB cells: 72
Reading field p

Reading field U


Calculating potential flow
[wudi-HOME:19883] *** An error occurred in MPI_Recv
[wudi-HOME:19883] *** reported by process [3012755457,0]
[wudi-HOME:19883] *** on communicator MPI_COMM_WORLD
[wudi-HOME:19883] *** MPI_ERR_TRUNCATE: message truncated
[wudi-HOME:19883] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[wudi-HOME:19883] *** and potentially your MPI job)
********************************************************************************

I searched online; someone else reported the same error in 2014. Here is the link:
https://sourceforge.net/p/openfoam-e...ndrelease/260/

I have no idea whether this bug has been fixed. Please advise! Thank you in advance.

#2 | Di Wu (wudi) | October 29, 2016, 12:09
Update
Hi,
It's me again. After a few hours of investigation, I am posting some updates on this issue before going to sleep.

This error message is caused by the MPI communication type. Specifically, one should set commsType to nonBlocking in $WM_PROJECT_DIR/etc/controlDict; the default value is blocking. However, this method is only applicable to versions before foam-extend-3.2.

From foam-extend-3.2 on, $WM_PROJECT_DIR/etc/controlDict has been removed and $WM_PROJECT_DIR/etc/controlDict-SAMPLE added in its place. Renaming controlDict-SAMPLE to controlDict does not work.

Now the problem is clearer: the key is to change commsType in OptimisationSwitches to nonBlocking (see the excerpt below). I will post the full solution later. If all methods fail, I may downgrade to foam-extend-3.1.
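
For reference, the entry in question looks like this. This is a minimal excerpt; the other switches carried by the global controlDict are omitted:

OptimisationSwitches
{
    commsType       nonBlocking;   // default is blocking
}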

#3 | Di Wu (wudi) | October 30, 2016, 23:48
Still confused
Still confused. I have recompiled the whole library with commsType set to nonBlocking, but the problem persists.

My MPI version is OpenMPI 1.10.2, and parallel computation works fine with other standard solvers such as simpleFoam and icoFoam; it only fails with the solvers in the ImmersedBoundary folder.
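
For what it's worth, this is how I confirmed that the parallel toolchain itself works; the icoFoam cavity tutorial is just an example case here, and you need to add a decomposeParDict to its system folder first:

cd $FOAM_TUTORIALS/incompressible/icoFoam/cavity
blockMesh
decomposePar
mpirun -np 4 icoFoam -parallel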

I hope someone can give me a hint. Thanks in advance.

Best regards

Below is the error message:
[wudi-HOME:19883] *** An error occurred in MPI_Recv
[wudi-HOME:19883] *** reported by process [3012755457,0]
[wudi-HOME:19883] *** on communicator MPI_COMM_WORLD
[wudi-HOME:19883] *** MPI_ERR_TRUNCATE: message truncated
[wudi-HOME:19883] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[wudi-HOME:19883] *** and potentially your MPI job)

#4 | Di Wu (wudi) | October 31, 2016, 02:20
Got it working in parallel eventually!

Will update later. Thanks for watching.

#5 | GFarello | January 24, 2017, 08:55
Hi wudi,

I have the same problem... how did you fix that?

#6 | Di Wu (wudi) | January 25, 2017, 03:57
Quote (originally posted by GFarello):
Hi wudi,

I have the same problem... how did you fix that?
Hi GFarello,

It has been a few months since I posted this thread. As far as I can remember, potentialIbFoam does NOT support parallel computing; you have to run it serially. Once it has finished, modify the initial conditions in the 0 folder accordingly, and then you can run icoIbFoam in parallel.
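
Here is the sequence as I remember it; the step of feeding the potential-flow result back into the 0 folder is my reading of "modify the initial conditions", so adapt it to your case:

blockMesh
cp save/boundary constant/polyMesh/
mkdir 0
cp 0_org/* 0/
potentialIbFoam                    # serial only; do not run under mpirun
# copy/merge the resulting fields into 0/ as initial conditions
decomposePar
mpirun -np 4 icoIbFoam -parallel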

Regards

Tags: ibm, immersed boundary method, openfoam extend
