
Message truncated, error stack: MPI_Recv(224).......................: MPI_Recv(buf=0x


March 14, 2025, 21:24   #1
New Member
 
Vivekananda
Join Date: May 2017
Location: Cardiff, UK
Posts: 29
I am facing a strange error.

I am running my case with 80 processors.

Now, since I am working with particles and a rotating AMI (arbitrary mesh interface) patch, OpenFOAM requires the AMI faces to sit entirely on a single processor.

I did that with the topoSet utility.
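
For context, the single-processor constraint was set up roughly like this; the faceSet name AMIFaces and the patch name pattern are placeholders, not the actual case names. A minimal system/topoSetDict to collect the AMI faces:

Code:
actions
(
    {
        name    AMIFaces;        // placeholder faceSet name
        type    faceSet;
        action  new;
        source  patchToFace;
        sourceInfo
        {
            name    "AMI.*";     // assumed pattern for the AMI patch(es)
        }
    }
);

and the matching constraint in system/decomposeParDict (syntax as in the annotated decomposeParDict shipped with OpenFOAM; check your version):

Code:
numberOfSubdomains  80;
method              scotch;

constraints
{
    singleProcessorFaceSets
    {
        type    singleProcessorFaceSets;
        // keep all faces in AMIFaces on one processor;
        // -1 lets the decomposition method pick which one
        singleProcessorFaceSets  ((AMIFaces -1));
    }
}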

Now, when I try to run in parallel with the scotch decomposition method, I get the following error.

Fatal error in MPI_Recv: Message truncated, error stack:
MPI_Recv(224).......................: MPI_Recv(buf=0x7fff962cc038, count=8, MPI_BYTE, src=26, tag=1, MPI_COMM_WORLD, sta$
MPIDI_CH3U_Request_unpack_uebuf(618): Message truncated; 24 bytes received but buffer size is 8
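
As I understand it, this message means one rank posted a receive buffer smaller than the message actually sent to it (8 bytes posted, 24 bytes arriving), which usually points to mismatched send/receive sizes between ranks. A standalone MPI reproducer of the same failure, just to illustrate the mechanism (my own sketch, nothing to do with OpenFOAM internals):

Code:
// compile: mpicxx trunc.cpp -o trunc ; run: mpirun -np 2 ./trunc
#include <mpi.h>

int main(int argc, char** argv)
{
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0)
    {
        char sendBuf[24] = {0};   // sender ships 24 bytes
        MPI_Send(sendBuf, 24, MPI_BYTE, 1, 1, MPI_COMM_WORLD);
    }
    else if (rank == 1)
    {
        char recvBuf[8];          // receiver posts only 8 bytes
        // fails with MPI_ERR_TRUNCATE: "Message truncated;
        // 24 bytes received but buffer size is 8"
        MPI_Recv(recvBuf, 8, MPI_BYTE, 0, 1, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
    }

    MPI_Finalize();
    return 0;
}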

Strangely, when I reduced the processor count to 40 and changed the decomposition method to simple, it worked.

But with 40 processors and the scotch or hierarchical method, it throws the same error again.

In summary:

Processors   Decomposition         Error
80           scotch                yes
80           simple                yes
40           scotch/hierarchical   yes
40           simple                no
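
The 40-processor run that worked used a decomposeParDict along these lines (the n split is an assumption; any factorization of 40 should behave the same):

Code:
numberOfSubdomains  40;
method              simple;

simpleCoeffs
{
    n       (4 5 2);    // 4*5*2 = 40
    delta   0.001;      // default cell-skew tolerance
}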

I think the issue might be with MPI.

I checked the following thread:

Message truncated, error stack: MPIDI_CH3U_Receive_data_found

There, the issue was solved by changing the comms type to nonBlocking in /apps/material/Openfoam/el7/../.../etc/controlDict.

I checked mine as well and found that it is already set to nonBlocking.
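
For reference, the relevant entry in the installation's etc/controlDict is:

Code:
OptimisationSwitches
{
    commsType       nonBlocking;   // other options: blocking, scheduled
}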

Can anyone help?


Tags
ami patches, mpi errors, parallel, particle cloud



