wrong zero boundary dir in parallel after snappyHexMesh |
March 8, 2017, 03:41 | #1
New Member
Hagen
Join Date: Nov 2016
Posts: 16
Rep Power: 9
Hi everyone,
I am probably missing something basic, but I have no clue what. I am running a parallel computation with OpenFOAM-v1612+ on 3 processors. The procedure is:
Code:
blockMesh
mpirun -np 3 redistributePar -decompose -parallel
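Since redistributePar -decompose reads system/decomposeParDict, the decomposition itself is just a plain 3-subdomain dictionary, roughly like this (sketch only; the scotch method shown here is just an example):
Code:
// system/decomposeParDict (sketch)
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  3;

method              scotch;   // example; hierarchical or simple would work too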
Code:
mpirun -np 3 snappyHexMesh -overwrite -parallel
Code:
mpirun -np 3 simpleFoam -parallel
The simpleFoam run then fails immediately while reading the fields:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  v1612+                                |
|   \\  /    A nd           | Web:      www.OpenFOAM.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : v1612+
Exec   : simpleFoam -parallel
Date   : Mar 08 2017
Time   : 09:30:46
Host   : "simmachine"
PID    : 20842
Case   : /home/hagen/OpenFOAM/hagen-v1612+/run/stenose/RAS/kOmegaSST/long/steady
nProcs : 3
Slaves : 2("simmachine.20843" "simmachine.20844")
Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 10)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

SIMPLE: convergence criteria
    field p  tolerance 0.01
    field U  tolerance 0.001
    field "(k|epsilon|omega)"  tolerance 0.001

Reading field p

[0] --> FOAM FATAL IO ERROR:
[0] size 0 is not equal to the given value of 784
[0]
[0] file: /home/hagen/OpenFOAM/hagen-v1612+/run/stenose/RAS/kOmegaSST/long/steady/processor0/0/p.boundaryField.inlet from line 26 to line 28.
[0]
[0]     From function Foam::Field<Type>::Field(const Foam::word&, const Foam::dictionary&, Foam::label) [with Type = double; Foam::label = int]
[0]     in file /home/hagen/OpenFOAM/OpenFOAM-v1612+/src/OpenFOAM/lnInclude/Field.C at line 304.
[0]
FOAM parallel run exiting

[1] --> FOAM FATAL IO ERROR:
[1] size 0 is not equal to the given value of 1704
[1]
[1] file: /home/hagen/OpenFOAM/hagen-v1612+/run/stenose/RAS/kOmegaSST/long/steady/processor1/0/p.boundaryField.outlet from line 32 to line 33.
[1]
[1]     From function Foam::Field<Type>::Field(const Foam::word&, const Foam::dictionary&, Foam::label) [with Type = double; Foam::label = int]
[1]     in file /home/hagen/OpenFOAM/OpenFOAM-v1612+/src/OpenFOAM/lnInclude/Field.C at line 304.
[1]
FOAM parallel run exiting

[2] --> FOAM FATAL IO ERROR:
[2] Cannot find patchField entry for procBoundary2to0
[2]
[2] file: /home/hagen/OpenFOAM/hagen-v1612+/run/stenose/RAS/kOmegaSST/long/steady/processor2/0/p.boundaryField from line 26 to line 42.
[2]
[2]     From function void Foam::GeometricField<Type, PatchField, GeoMesh>::Boundary::readField(const Foam::DimensionedField<TypeR, GeoMesh>&, const Foam::dictionary&) [with Type = double; PatchField = Foam::fvPatchField; GeoMesh = Foam::volMesh]
[2]     in file /home/hagen/OpenFOAM/OpenFOAM-v1612+/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 191.
[2]
FOAM parallel run exiting

--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[simmachine:20840] 2 more processes have sent help message help-mpi-api.txt / mpi-abort
[simmachine:20840] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
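The message from rank 2 suggests that processor2/0/p is missing the entry for the processor patch created by the decomposition. As far as I understand, the boundaryField in processor2/0/p would normally contain something like the following (illustration only, not taken from my case):
Code:
// expected entry in the boundaryField of processor2/0/p (illustration)
procBoundary2to0
{
    type            processor;
    value           uniform 0;
}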
What I have tried so far is to reconstruct the case and decompose it again after snappyHexMesh:
Code:
mpirun -np 3 redistributePar -reconstruct -constant -parallel
mpirun -np 3 redistributePar -decompose -parallel
If I only decompose again after snappyHexMesh without reconstructing first, I end up with only the original blockMesh mesh. I appreciate any recommendations. Thank you.
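PS: to check which patches each processor mesh actually ends up with after the parallel snappyHexMesh run, I look at the decomposed boundary files with a quick grep (sketch; inlet and outlet are just the patch names from my case):
Code:
# list the processor and physical patches in every decomposed mesh
for d in processor*/constant/polyMesh
do
    echo "== $d =="
    grep -E 'procBoundary|inlet|outlet' "$d/boundary"
done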
Tags |
boundary, parallel, snappyhexmesh |