parallel Grief: BoundaryFields ok in single CPU but NOT in Parallel
April 8, 2013, 17:01
parallel Grief: BoundaryFields ok in single CPU but NOT in Parallel
#1
Senior Member
Jose Rey
Join Date: Oct 2012
Posts: 134
Rep Power: 17
I have a case that runs perfectly on a single CPU, but when I try to run it in parallel it gives me an error. The boundary fields for my STL patch are not being added in each of the processor directories, something that works just fine when not running in parallel. That is, the boundaryField sections of the files k, nut, omega, p and U contain the boundary definitions for my STL patch (ducted_vent_horn0_Mesh) in the 0 directory, but not in each of the processor*/0 directories.
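To illustrate, the entry that exists in 0/p but is missing from every processor*/0/p looks something like this (a sketch only; the actual type differs per field, zeroGradient here is just an example):
Code:
boundaryField
{
    // patch created by snappyHexMesh from the STL surface
    ducted_vent_horn0_Mesh
    {
        type            zeroGradient;   // example entry; k/nut/omega use wall functions
    }
    // ... remaining blockMesh patches ...
}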
I am using OpenFOAM 2.1.1. This is what I do when I run on a single CPU:
Code:
blockMesh | tee log/blockMesh.log
snappyHexMesh -overwrite | tee log/snappyHexMesh.log
simpleFoam | tee log/simpleFoam.log
And this is what I do when I run in parallel:
Code:
blockMesh | tee log/blockMesh.log
decomposePar
mpirun -np 4 snappyHexMesh -overwrite -parallel | tee log/snappyHexMesh.log
mpirun -np 4 simpleFoam -parallel | tee log/simpleFoam.log
This is the error I get from simpleFoam:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.1.1-221db2718bbb
Exec   : simpleFoam -parallel
Date   : Apr 08 2013
Time   : 16:37:52
Host   : "admin1-VirtualBox"
PID    : 5116
Case   : /home/admin1/OpenFOAM/models/ductedVent12Par
nProcs : 4
Slaves : 3 ( "admin1-VirtualBox.5117" "admin1-VirtualBox.5118" "admin1-VirtualBox.5119" )
Pstream initialized with:
    floatTransfer     : 0
    nProcsSimpleSum   : 0
    commsType         : nonBlocking
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading field p

[0] --> FOAM FATAL IO ERROR:
[0] keyword ducted_vent_horn0_Mesh is undefined in dictionary "/home/admin1/OpenFOAM/models/ductedVent12Par/processor0/0/p::boundaryField"
[0]
[0] file: /home/admin1/OpenFOAM/models/ductedVent12Par/processor0/0/p::boundaryField from line 26 to line 61.
[0]
[0]     From function dictionary::subDict(const word& keyword) const
[0]     in file db/dictionary/dictionary.C at line 461.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
(ranks [1], [2] and [3] report the identical FATAL IO ERROR for
processor1/0/p, processor2/0/p and processor3/0/p, then exit)
--------------------------------------------------------------------------
mpirun has exited due to process rank 2 with PID 5118 on
node admin1-VirtualBox exiting without calling "finalize". This may
have caused other processes in the application to be terminated by
signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[admin1-VirtualBox:05114] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[admin1-VirtualBox:05114] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
This is my snappyHexMeshDict:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version  2.0;
    format   ascii;
    class    dictionary;
    location system;
    object   snappyHexMeshDict;
}

castellatedMesh true;
snap            true;
addLayers       true;

geometry
{
    ducted_vent_horn0.stl
    {
        type triSurfaceMesh;
        name ducted_vent_horn0;
        patchInfo { type wall; }
    }
    boxFront  { type searchableBox; min (-2.6 -2.0 2.0); max (-2.4 2.0 6.0); }
    boxBack   { type searchableBox; min ( 2.4 -2.0 2.0); max ( 2.6 2.0 6.0); }
    boxMid    { type searchableBox; min (-0.1 -0.5 4.0); max ( 0.1 0.5 5.0); }
    boxNozzle { type searchableBox; min (-3.0 -2.5 1.6); max ( 3.0 2.5 6.4); }
    boxBig    { type searchableBox; min (-20.0 -4.5 0.0); max (20.0 4.5 8.0); }
}

castellatedMeshControls
{
    features ( );
    refinementSurfaces
    {
        ducted_vent_horn0
        {
            regions { }
            level (3 4);
        }
    }
    refinementRegions
    {
        boxFront  { mode inside; levels ((1.0e15 3)); }
        boxBack   { mode inside; levels ((1.0e15 3)); }
        boxMid    { mode inside; levels ((1.0e15 3)); }
        boxNozzle { mode inside; levels ((1.0e15 2)); }
        boxBig    { mode inside; levels ((1.0e15 1)); }
    }
    locationInMesh (5.0 0.0 0.0);
    maxLocalCells 1000000;
    maxGlobalCells 8000000;
    minRefinementCells 0;
    nCellsBetweenLevels 1;
    resolveFeatureAngle 30;
    allowFreeStandingZoneFaces false;
}

snapControls
{
    nSolveIter 30;
    nSmoothPatch 3;
    tolerance 4.0;
    nRelaxIter 5;
    nFeatureSnapIter 10;
}

addLayersControls
{
    layers
    {
        ducted_vent_horn0_Mesh
        {
            nSurfaceLayers 2;
        }
    }
    relativeSizes true;
    expansionRatio 1.0;
    finalLayerThickness 0.3;
    minThickness 0.1;
    nGrow 1;
    featureAngle 60;
    nRelaxIter 5;
    nSmoothSurfaceNormals 1;
    nSmoothNormals 3;
    nSmoothThickness 10;
    maxFaceThicknessRatio 0.5;
    maxThicknessToMedialRatio 0.3;
    minMedianAxisAngle 130;
    nBufferCellsNoExtrude 0;
    nLayerIter 50;
    nRelaxedIter 20;
}

meshQualityControls
{
    maxNonOrtho 65;
    maxBoundarySkewness 20;
    maxInternalSkewness 4;
    maxConcave 80;
    minFlatness 0.5;
    minVol 1.00E-13;
    minTetQuality -1e30;
    minArea -1;
    minTwist 0.05;
    minDeterminant 0.001;
    minFaceWeight 0.05;
    minVolRatio 0.01;
    minTriangleTwist -1;
    nSmoothScale 4;
    errorReduction 0.75;
    relaxed { maxNonOrtho 75; }
}

debug 0;
mergeTolerance 1E-6;
And this is my decomposeParDict:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version  2.0;
    format   ascii;
    class    dictionary;
    location system;
    object   decomposeParDict;
}

numberOfSubdomains 4;
method hierarchical;

hierarchicalCoeffs
{
    n     (2 2 1);
    delta 0.001;
    order xyz;
}

distributed false;
roots ( );
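For what it's worth, a quick grep makes the problem obvious (my own check, using the patch name from this case):
Code:
# the patch entry is present in the undecomposed field...
grep -c "ducted_vent_horn0_Mesh" 0/p
# ...but absent (count 0) from every decomposed copy
grep -c "ducted_vent_horn0_Mesh" processor*/0/p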
April 8, 2013, 19:03
parallel SnappyHexMesh correct BoundaryFields Workflow
#2
Senior Member
Jose Rey
Join Date: Oct 2012
Posts: 134
Rep Power: 17
I found the answer: this was reported as a bug against OF 1.7, and I can confirm the same problem in 2.1.1 and 2.2.0. Presumably decomposePar drops boundaryField entries for patches that do not exist in the mesh it is decomposing, and at that point the mesh is still the plain blockMesh mesh without the STL patch. If you ever encounter the same problem, here is the solution:
Code:
blockMesh | tee log/blockMesh.log
decomposePar
mpirun -np 4 snappyHexMesh -overwrite -parallel | tee log/snappyHexMesh.log
reconstructParMesh -constant
rm -rf ./processor*
decomposePar
mpirun -np 4 simpleFoam -parallel | tee log/simpleFoam.log
reconstructPar
http://www.openfoam.org/mantisbt/view.php?id=162
Does anybody know if I caused the problem by doing something wrong?
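In case it helps anyone else, here is a quick sanity check to run after the second decomposePar (again using the patch name from my case):
Code:
# every processor copy of p should now define the STL patch;
# expect all four files to be listed
grep -l "ducted_vent_horn0_Mesh" processor*/0/p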
April 19, 2013, 16:49
#3
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,980
Blog Entries: 45
Rep Power: 128
Greetings Jose,
Unfortunately I've been busy lately and haven't been able to pay much attention to the forum... Nonetheless, I spotted this thread through another post of yours! This topic (meshing in parallel with snappyHexMesh and then running the simulation) was discussed some time ago, and it was reviewed again a little more recently:
Best regards, Bruno
Tags: openfoam 2.1.x, parallel