
A fatal problem of immersed boundary layer method!!!

November 12, 2018, 04:29   #1
New Member
kinakamichibun
Join Date: Nov 2018
Posts: 5
When I tried to run the refiningMovingCylinderInChannelIco tutorial of foam-extend 4.1 in parallel, it first aborted because of two undefined keywords, numberOfSubdomains and method. After adding these two keywords, it still aborts after several iterations with the following message:


Fatal error in MPI_Recv: Message truncated, error stack:
MPI_Recv(200).....................: MPI_Recv(buf=0x114f910, count=49, MPI_BYTE, src=0, tag=1, MPI_COMM_WORLD, status=0x7ffd2df53490) failed
MPIDI_CH3U_Receive_data_found(131): Message from rank 0 and tag 1 truncated; 51208 bytes received but buffer size is 49

The tutorial runs fine in serial. Now I don't know what to do with it. Can anybody give some advice? Thank you!

November 12, 2018, 05:38   #2
Senior Member
AliE
Join Date: Dec 2017
Posts: 153
Quote:
Originally Posted by kinakamichibun View Post (see post #1)
Hi, I am not familiar with OpenFOAM-extend; however, this error looks nasty, since it comes from MPI_Recv and is therefore located inside the information exchange between processors.

If you are not familiar with MPI, the reported error means that proc0 is sending a message of a certain size, but for some reason the prescribed receiver is not able to get it correctly. This can be caused by an incorrect tag, an incorrect receiver id in the MPI_Recv call, or an incorrect memory allocation of the buffer array.
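To make the failure mode concrete, here is a minimal standalone MPI sketch in C (not foam-extend source, just an assumed illustration): rank 0 sends more bytes than rank 1 has allocated for its receive buffer, and MPI aborts with the same "Message truncated" error. The sizes are chosen to mirror the 51208-vs-49 mismatch in the log above. Compile with mpicc and run with mpirun -np 2.

Code:
#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0)
    {
        /* sender packs 51208 bytes, as in the log above */
        static char big[51208] = {0};
        MPI_Send(big, sizeof(big), MPI_BYTE, 1, 1, MPI_COMM_WORLD);
    }
    else if (rank == 1)
    {
        /* receiver only allocated room for 49 bytes */
        char small[49];
        MPI_Recv(small, sizeof(small), MPI_BYTE, 0, 1, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);   /* aborts: "Message truncated" */
    }

    MPI_Finalize();
    return 0;
}

In a solver the send and receive sizes are computed by the library itself, so a mismatch like this usually points to inconsistent data being exchanged between processors rather than to something in the user's case files.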

Since the code comes from the extended version and the serial run works well, there is a good probability that you have hit a bug in the sources... Sorry, but it is difficult to be more helpful than this.

November 12, 2018, 09:57   #3
Senior Member
LuckyTran (Lucky)
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,682
This isn't a problem of the immersed boundary method but a how-to-run-OpenFOAM-in-parallel question. Posting the MPI error message is not at all helpful; what you need is the actual OpenFOAM log from the host node (the one that looks like the same output you'd see in a serial run).

Are you able to run any cases of OF in parallel? I think the answer is no.

In decomposeParDict you have to specify numberOfSubdomains and method. Then you need to run decomposePar. After you do this, you will find directories like processor0, processor1, and so on, corresponding to the number you specified in numberOfSubdomains. Are you here yet?
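For reference, a minimal sketch of what system/decomposeParDict might contain (the FoamFile header is omitted, and the subdomain count and decomposition method are just example values; adjust them to your case):

Code:
// system/decomposeParDict (sketch; FoamFile header omitted)

numberOfSubdomains  4;          // must match the number of MPI processes

method              simple;     // or hierarchical, metis, scotch, ...

simpleCoeffs
{
    n       (2 2 1);            // 2 x 2 x 1 split of the domain
    delta   0.001;
}

Running decomposePar with such a dictionary in place should then create processor0 ... processor3 directories.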

Then you need to launch OF in parallel mode; how to do this depends on the environment. For example, in my case I have to invoke mpirun.
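As an illustration only (the solver name below is a placeholder, since it depends on which tutorial you are running), a typical launch looks something like:

Code:
decomposePar                                     # writes processor0 ... processorN-1
mpirun -np 4 <solverName> -parallel > log 2>&1   # -np must equal numberOfSubdomains
# reconstructPar                                 # merge the results afterwards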

November 12, 2018, 09:59   #4
Senior Member
AliE
Join Date: Dec 2017
Posts: 153
Quote:
Originally Posted by LuckyTran View Post (see post #3)
Yes, this is another possibility. In my answer I assumed that you had already gone through the procedure suggested by LuckyTran, so hopefully you have just missed some steps.

November 12, 2018, 21:40   #5
New Member
kinakamichibun
Join Date: Nov 2018
Posts: 5
Quote:
Originally Posted by LuckyTran View Post (see post #3)
I definitely know how to run foam in parallel; actually, I seldom do serial simulations. I have run several cases in parallel with this installation, so the problem should not come from my installation. What is somewhat interesting is that the error message about the undefined keywords numberOfSubdomains and method points to the dynamicMeshDict file.
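For completeness, the workaround described in post #1 would look roughly like this (purely illustrative values; put the entries wherever the "keyword ... is undefined" message says the dictionary is being read, which in this case is constant/dynamicMeshDict, and keep them consistent with decomposeParDict):

Code:
// constant/dynamicMeshDict (excerpt, illustrative only): the two keywords
// the solver complained about, with example values matching decomposeParDict
numberOfSubdomains  4;
method              simple;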

November 13, 2018, 04:32   #6
Member
vesp
Join Date: Aug 2018
Posts: 77
Quote:
Originally Posted by kinakamichibun View Post (see post #5)


Well, if you rule out an input error on your part, it is likely a bug in the code. Just work it out like you would any other bug...
If I had a dollar for every time I ran into an MPI issue, I would be rich. Just dig in and get dirty, or talk to support if you are not a programmer.

November 13, 2018, 10:05   #7
Senior Member
LuckyTran (Lucky)
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,682
What even is openfoam extend 4.1? Can you send me the GitHub link?

November 13, 2018, 10:48   #8
Senior Member
akidess (Anton Kidess)
Join Date: May 2009
Location: Germany
Posts: 1,377
[removed...]

