CFD Online Discussion Forums

CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Meshing & Mesh Conversion (https://www.cfd-online.com/Forums/openfoam-meshing/)
-   -   [snappyHexMesh] shm in parallel with simple decomposition (https://www.cfd-online.com/Forums/openfoam-meshing/120202-shm-parallel-simple-decomposition.html)

mihaipruna July 2, 2013 10:36

shm in parallel with simple decomposition
 
1 Attachment(s)
Hi, I need some help getting SHM to run in parallel on OF 2.1.1

Here is my script:
Code:

#!/bin/sh
# Source tutorial run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions
echo Started At
date
blockMesh
surfaceFeatureExtract -includedAngle 150 -writeObj constant/triSurface/capri.stl capri
decomposePar
mpirun -np 4 snappyHexMesh -overwrite -parallel
reconstructPar
decomposePar
mpirun -np 4 rhoSimplecFoam -parallel
reconstructPar
sample
sample -dict sampleDictSTL
ptot
echo Finished At
date

decomposeParDict file:

Code:

/*--------------------------------*- C++ -*----------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
//assume 4 cores
numberOfSubdomains 4;

method          simple;


simpleCoeffs
{
    n              (4 1 1);
    delta          0.001;
}




// ************************************************************************* //
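As a sanity check on the dictionary above: with the simple method, the product of the simpleCoeffs n entries must equal numberOfSubdomains (here 4 x 1 x 1 = 4), or decomposePar refuses to run. A minimal shell sketch of that check (the variable names are illustrative, not OpenFOAM syntax):

```shell
# Illustrative consistency check for decomposeParDict: the product of
# the simpleCoeffs "n" entries must equal numberOfSubdomains.
nx=4; ny=1; nz=1          # the n (4 1 1) entries from decomposeParDict
subdomains=4              # numberOfSubdomains from decomposeParDict
product=$((nx * ny * nz))
if [ "$product" -eq "$subdomains" ]; then
    echo "decomposeParDict is consistent ($product subdomains)"
else
    echo "mismatch: n gives $product pieces, expected $subdomains" >&2
fi
```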

and the errors:

Code:

--> FOAM FATAL ERROR:
No times selected

    From function reconstructPar
    in file reconstructPar.C at line 139.

FOAM exiting



--> FOAM FATAL ERROR:
Case is already decomposed with 4 domains, use the -force option or manually
remove processor directories before decomposing. e.g.,
    rm -rf /media/data/sduct1mil-parallel/processor*


    From function decomposePar
    in file decomposePar.C at line 253.

FOAM exiting

[0] [1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] keyword vol1face1 is undefined in dictionary "/media/data/sduct1mil-parallel/processor1/0/p::boundaryField"
[1]
[1] file: /media/data/sduct1mil-parallel/processor1/0/p::boundaryField from line 26 to line 57.
[1]
[1]    From function dictionary::subDict(const word& keyword) const
[1]    in file db/dictionary/dictionary.C at line 461.
[1]
FOAM parallel run exiting
[1]
[2]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] keyword vol1face1 is undefined in dictionary "/media/data/sduct1mil-parallel/processor2/0/p::boundaryField"
[2]
[2] file: /media/data/sduct1mil-parallel/processor2/0/p::boundaryField from line 26 to line 57.
[2]
[2]    From function dictionary::subDict(const word& keyword) const
[2]    in file db/dictionary/dictionary.C at line 461.
[2]
FOAM parallel run exiting
[2]
[3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] keyword vol1face1 is undefined in dictionary "/media/data/sduct1mil-parallel/processor3/0/p::boundaryField"
[3]
[3] file: /media/data/sduct1mil-parallel/processor3/0/p::boundaryField from line 26 to line 52.
[3]
[3]    From function dictionary::subDict(const word& keyword) const
[3]    in file db/dictionary/dictionary.C at line 461.
[3]
FOAM parallel run exiting
[3]

--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0]
[0] --> FOAM FATAL IO ERROR:
[0] keyword vol1face1 is undefined in dictionary "/media/data/sduct1mil-parallel/processor0/0/p::boundaryField"
[0]
[0] file: /media/data/sduct1mil-parallel/processor0/0/p::boundaryField from line 26 to line 52.
[0]
[0]    From function dictionary::subDict(const word& keyword) const
[0]    in file db/dictionary/dictionary.C at line 461.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
mpirun has exited due to process rank 3 with PID 23606 on
node ubuntu exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[ubuntu:23602] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[ubuntu:23602] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages


--> FOAM FATAL ERROR:
No times selected

    From function reconstructPar
    in file reconstructPar.C at line 139.

FOAM exiting

I attached the log as well

Aurelien Thinat July 3, 2013 08:05

Hello,

You should try the command "reconstructParMesh" instead of "reconstructPar".

Regards,

Aurelien

mihaipruna July 3, 2013 09:21

Quote:

Originally Posted by Aurelien Thinat (Post 437545)
Hello,

You should try the command "reconstructParMesh" instead of "reconstructPar".

Regards,

Aurelien

Hi Aurelien, actually I tried that, but it only works if I give it the constant folder as a parameter, and it does not recreate the files in time 0.
Running it with -time 0 as a parameter does not work.

Aurelien Thinat July 3, 2013 09:31

You can't recreate the folder 0 from the parallel output of snappyHexMesh (or at least I'm not aware of such a capability in OpenFOAM). You have to rebuild it by hand before the second call to decomposePar.


blockMesh
surfaceFeatureExtract -includedAngle 150 -writeObj constant/triSurface/capri.stl capri
decomposePar

(you may need to copy the capri.eMesh file into each of the processor* folders)

mpirun -np 4 snappyHexMesh -overwrite -parallel
reconstructParMesh -constant
(Not sure about the -constant option; this command lets you rebuild the whole mesh in the folder ./constant/polyMesh )

At this point, check that your folder ./0 is OK.

decomposePar
mpirun -np 4 rhoSimplecFoam -parallel
reconstructPar -latestTime (this option is optional)
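Putting Aurelien's steps together, a consolidated sketch of the whole sequence might look like this (untested; assumes an OpenFOAM 2.1.x environment is sourced, and the eMesh copy step is the hypothetical fix mentioned above):

```shell
#!/bin/sh
# Consolidated sketch of the suggested sequence (not tested end-to-end):
# mesh in parallel, reconstruct the mesh only, then decompose again for
# the solver run.

blockMesh
surfaceFeatureExtract -includedAngle 150 -writeObj constant/triSurface/capri.stl capri
decomposePar

# the capri.eMesh feature file may need to exist in every processor dir
for d in processor*; do
    [ -d "$d" ] && mkdir -p "$d"/constant/triSurface \
        && cp constant/triSurface/capri.eMesh "$d"/constant/triSurface/
done

mpirun -np 4 snappyHexMesh -overwrite -parallel
reconstructParMesh -constant    # merge the mesh back into constant/polyMesh

# check ./0 against the new mesh (e.g. patch names) before re-decomposing
decomposePar -force
mpirun -np 4 rhoSimplecFoam -parallel
reconstructPar -latestTime
```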

Artur July 10, 2013 03:40

Not sure if the previous answers solved your problem, but I had the same error when trying to decompose a case with processor0, processor1, etc. folders already in it. Removing them fixed it for me.
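For reference, that cleanup is just the following (run from the case directory; this permanently deletes the decomposed data):

```shell
# Remove leftover processor directories from a previous decomposition
# before running decomposePar again (this deletes their contents for good).
case_dir=.                        # path to the OpenFOAM case
rm -rf "$case_dir"/processor*
```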

colinB July 10, 2013 04:25

if I got your problem right, the processor folders are causing
the error messages, so you can use the -force flag to avoid deleting them
separately; they will automatically be overwritten:

decomposePar -force

for further hints on what flags are available
type:

decomposePar -help
regards

louisgag July 16, 2015 04:55

Take care: the -force option will delete all your processor* directories even if the times to decompose do not overlap those already decomposed.
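One cautious alternative (a sketch, not an OpenFOAM feature): move the processor* directories aside instead of letting -force delete them, so any already-decomposed time data survives:

```shell
# Move existing processor directories into a backup folder instead of
# letting "decomposePar -force" delete them outright.
backup=processor-backup           # illustrative backup location
for d in processor*; do
    [ -d "$d" ] || continue       # nothing to do if none exist
    mkdir -p "$backup"
    mv "$d" "$backup/"
done
# decomposePar can now run cleanly; restore from $backup if needed.
```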

