hwloc encountered an error during parallel run on my cluster
#1
Super Moderator
Tobias Holzmann
Join Date: Oct 2010
Location: Bad Wörishofen
Posts: 2,716
Blog Entries: 6
Rep Power: 53
Hi all,

after a while I reactivated my cluster and realized that I get the following error while running snappyHexMesh in parallel: Code:
cfd@OpenFOAM:~/OpenFOAM/cfd-dev/run/Columbia/detailedCFDAnalysis$ mpirun -np 20 snappyHexMesh -parallel
****************************************************************************
* hwloc 1.11.2 has encountered what looks like an error from the operating system.
*
* L3 (cpuset 0x000003f0) intersects with NUMANode (P#0 cpuset 0x0000003f) without inclusion!
* Error occurred in topology.c line 1046
*
* The following FAQ entry in the hwloc documentation may help:
* What should I do when hwloc reports "operating system" warnings?
* Otherwise please report this error message to the hwloc user's mailing list,
* along with the output+tarball generated by the hwloc-gather-topology script.
****************************************************************************
/*---------------------------------------------------------------------------*\
  =========                 |
  \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox
   \\    /   O peration     | Website:  https://openfoam.org
    \\  /    A nd           | Version:  dev
     \\/     M anipulation  |
\*---------------------------------------------------------------------------*/
Build : dev-adea51c53ccb
Exec : snappyHexMesh -parallel
Date : Dec 18 2018
Time : 21:49:53
Host : "OpenFOAM"
PID : 2091
I/O : uncollated
Case : /home/cfd/OpenFOAM/cfd-dev/run/Gibraltar/detailedCFDAnalysis
nProcs : 20
Slaves :
19
(
"OpenFOAM.2092"
"OpenFOAM.2093"
"OpenFOAM.2094"
"OpenFOAM.2095"
"OpenFOAM.2096"
"OpenFOAM.2097"
"OpenFOAM.2098"
"OpenFOAM.2099"
"OpenFOAM.2100"
"OpenFOAM.2101"
"OpenFOAM.2102"
"OpenFOAM.2103"
"OpenFOAM.2104"
"OpenFOAM.2105"
"OpenFOAM.2106"
"OpenFOAM.2107"
"OpenFOAM.2108"
"OpenFOAM.2109"
"OpenFOAM.2110"
)
Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking
polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 10)
allowSystemOperations : Allowing user-supplied system call operations
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
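In case it helps: this is how I understand the diagnostics that the hwloc message asks for (just a sketch, assuming the hwloc command-line utilities are installed on the node; "mycluster" is only a placeholder for the output prefix): Code:
# Show the topology hwloc detects on this node; the L3/NUMANode
# mismatch from the warning should be visible here:
lstopo --of console

# Collect the output + tarball that the error message asks for;
# this writes mycluster.output and mycluster.tar.bz2:
hwloc-gather-topology mycluster
If I read the FAQ correctly, those two files are what the hwloc mailing list wants attached to a report.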
Any suggestions?
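For now, the FAQ entry named in the message seems to only describe hiding the warning rather than repairing the topology; since the solver header still prints normally afterwards, that might be good enough for meshing. A sketch of the two workarounds I would try first (HWLOC_HIDE_ERRORS comes from the hwloc documentation; --bind-to none is an Open MPI option and purely my own guess that the process-binding code is what trips over the broken topology): Code:
# Hide (not fix) hwloc's operating-system warnings for this run:
export HWLOC_HIDE_ERRORS=1
mpirun -np 20 snappyHexMesh -parallel

# Alternatively, disable process binding so the broken part of the
# topology is (hopefully) never consulted:
mpirun --bind-to none -np 20 snappyHexMesh -parallel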
__________________
Keep foaming, Tobias Holzmann
Similar Threads

| Thread | Thread Starter | Forum | Replies | Last Post |
| MPI error in parallel application | usv001 | OpenFOAM Programming & Development | 2 | September 14, 2017 12:30 |
| chtMultiRegionSimpleFoam: crash on parallel run | student666 | OpenFOAM Running, Solving & CFD | 3 | April 20, 2017 12:05 |
| Pinning the processors in cluster to run a solver | coolcrasher | OpenFOAM Running, Solving & CFD | 0 | November 5, 2015 07:11 |
| OF22x with mvapich2 on redhat cluster doesn't run parallel job | mmmn036 | OpenFOAM Programming & Development | 0 | January 29, 2014 22:42 |
| run parallel distributed in cluster | fevi84 | FLUENT | 0 | June 18, 2012 18:41 |