#1 |
Member
Andrew Coughtrie
Join Date: May 2011
Posts: 51
Rep Power: 14
Hi everyone,
I'm uncertain whether this is a bug or whether parallel running hasn't been implemented for the finiteArea method in 1712 yet. Whenever I try to run one of the finiteArea solvers in parallel I get the following error (I know it isn't at all helpful, but it's all I've got):

Code:
[ceg-w617:9451] *** An error occurred in MPI_Recv
[ceg-w617:9451] *** reported by process [2378104833,0]
[ceg-w617:9451] *** on communicator MPI_COMM_WORLD
[ceg-w617:9451] *** MPI_ERR_TRUNCATE: message truncated
[ceg-w617:9451] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[ceg-w617:9451] *** and potentially your MPI job)
[ceg-w617:9452] *** An error occurred in MPI_Recv
[ceg-w617:9452] *** reported by process [2378104833,1]
[ceg-w617:9452] *** on communicator MPI_COMM_WORLD
[ceg-w617:9452] *** MPI_ERR_TRUNCATE: message truncated
[ceg-w617:9452] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[ceg-w617:9452] *** and potentially your MPI job)

Thanks,
Andy
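Edit: for reference, the error shows up with the usual parallel workflow; a minimal sketch (the solver name and core count below are placeholders, not my exact setup):

Code:
# decompose the case, then launch the solver under MPI
decomposePar
# liquidFilmFoam stands in for any of the finiteArea solvers;
# the error appears as soon as the parallel run starts
mpirun -np 2 liquidFilmFoam -parallel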
#2 |
Member
Andrew Coughtrie
Join Date: May 2011
Posts: 51
Rep Power: 14
Seems to work fine when I recompile with 32-bit labels.
Andy
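Edit: for anyone hitting the same thing, the rebuild was roughly as follows; a sketch assuming a standard source install of v1712, where the WM_LABEL_SIZE switch in etc/bashrc controls the label width:

Code:
# re-source the environment with 32-bit labels, then rebuild
source $HOME/OpenFOAM/OpenFOAM-v1712/etc/bashrc WM_LABEL_SIZE=32
cd $WM_PROJECT_DIR
./Allwmake    # full rebuild; everything now uses 32-bit labels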
Tags |
finite area method, mpi error, parallel |
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post |
Explicitly filtered LES | saeedi | Main CFD Forum | 16 | October 14, 2015 11:58 |
simpleFoam parallel | AndrewMortimer | OpenFOAM Running, Solving & CFD | 12 | August 7, 2015 18:45 |
Is there a bug when running createBaffles in parallel??? | zfaraday | OpenFOAM Pre-Processing | 1 | May 12, 2015 13:32 |
Parallel Moving Mesh Bug for Multi-patch Case | albcem | OpenFOAM Bugs | 17 | April 28, 2013 23:44 |
Parallel Moving Mesh Bug for Multi-patch Case | albcem | OpenFOAM | 0 | May 21, 2009 00:23 |