problems with turbDyMEngineFoam using dynamicTopoFvMesh and load balancing |
April 19, 2014, 17:13
problems with turbDyMEngineFoam using dynamicTopoFvMesh and load balancing
#1
New Member
Join Date: Mar 2014
Posts: 9
Rep Power: 12
Hi,
I am running an internal combustion engine case with turbDyMEngineFoam using a dynamicTopoFvMesh approach. The case runs fine in serial, but in parallel it cannot load the decomposition libraries. Every 200 time steps a dynamic parallel load-balancing step is performed with parMetis.
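For reference, the decomposition settings amount to the following (a minimal sketch, assuming the load-balancing step picks the method up from system/decomposeParDict; the coefficient sub-dictionaries are left out):
Code:
// system/decomposeParDict (minimal sketch)
numberOfSubdomains  4;

// the method the solver reports below ("Selecting decompositionMethod parMetis")
method              parMetis;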
This is what the terminal says during the balancing step:
Code:
--> FOAM Warning :
    From function decompositionMethod::loadExternalLibraries()
    in file decompositionMethod/decompositionMethod.C at line 453
    Loading of decomposition library libscotchDecomp.so unsuccesful. Some decomposition methods may not be available

--> FOAM Warning :
    From function decompositionMethod::loadExternalLibraries()
    in file decompositionMethod/decompositionMethod.C at line 453
    Loading of decomposition library libmetisDecomp.so unsuccesful. Some decomposition methods may not be available

--> FOAM Warning :
    From function decompositionMethod::loadExternalLibraries()
    in file decompositionMethod/decompositionMethod.C at line 453
    Loading of decomposition library libparMetisDecomp.so unsuccesful. Some decomposition methods may not be available

Selecting decompositionMethod parMetis
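From what I can tell these warnings come from decompositionMethod::loadExternalLibraries(), which tries to dlopen() the listed libraries, but the warning does not show the loader's actual error message. A quick stand-alone check I can run inside the sourced OpenFOAM environment would be something like this (a hypothetical helper, not part of OpenFOAM):
Code:
// checkLib.cpp - hypothetical stand-alone check, not part of OpenFOAM.
// It dlopen()s one of the decomposition libraries and prints dlerror(),
// which usually names the missing dependency or the wrong search path.
#include <dlfcn.h>
#include <cstdio>

int main(int argc, char* argv[])
{
    const char* lib = (argc > 1) ? argv[1] : "libparMetisDecomp.so";

    // RTLD_NOW resolves every symbol immediately, so unresolved
    // dependencies (MPI, metis, scotch, ...) show up right away.
    void* handle = dlopen(lib, RTLD_NOW | RTLD_GLOBAL);

    if (!handle)
    {
        std::printf("dlopen failed: %s\n", dlerror());
        return 1;
    }

    std::printf("%s loaded fine\n", lib);
    dlclose(handle);
    return 0;
}
Compiled with g++ checkLib.cpp -ldl -o checkLib and run once per library name, it should say whether the problem is the library path or a missing dependency.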
Finally, the case crashes almost at the end when run on 4 subdomains (with 2 subdomains it reaches the end without the following error), showing this error:
Code:
[0] --> FOAM FATAL ERROR:
[0] Mapping for inserted boundary face is incorrect. Found an empty masterObjects list.
    Face: 46884
    Patch: liner
[0]
[0]     From function void topoPatchMapper::calcInsertedFaceAddressing() const
[0]     in file fieldMapping/topoPatchMapper.C at line 137
[0]
FOAM parallel run aborting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

Thank you very much.