CFD Online Discussion Forums > OpenFOAM Pre-Processing
empty decomposePar (http://www.cfd-online.com/Forums/openfoam-pre-processing/122238-empty-decomposepar.html)

immortality August 15, 2013 01:17

empty decomposePar
 
decomposePar produces processor folders that contain only the constant directory, with no field files. Why does it behave this way now?
Code:

[0] [2] --> FOAM FATAL IO ERROR:
[2]
[2] --> FOAM FATAL IO ERROR:
[2] cannot find file
[2]
[2] file: /home/ehsan/Desktop/WR_3/processor2/0/p at line 0.
[2]
[2]    From function regIOobject::readStream()
[2]    in file db/regIOobject/regIOobjectRead.C at line 73.
[2]
FOAM parallel run exiting
[2]

[0] cannot find file
[0]
[0] file: /home/ehsan/Desktop/WR_3/processor0/0/p at line 0.
[0]
[0] [3]    From function
[3]
[3] --> FOAM FATAL IO ERROR:
[3] cannot find file
[3]
[3] file: /home/ehsan/Desktop/WR_3/processor3/0/p at line 0.
[3]
[3]    From function regIOobject::readStream()
[3]    in file db/regIOobject/regIOobjectRead.C at line 73.
[3]
FOAM parallel run exiting
[3]
regIOobject::readStream()
[0]    in file db/regIOobject/regIOobjectRead.C at line 73.
[1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] cannot find file
[1]
[1] file: /home/ehsan/Desktop/WR_3/processor1/0/p at line 0.
[1]
[1]    From function regIOobject::readStream()
[1]    in file db/regIOobject/regIOobjectRead.C at line 73.
[1]
FOAM parallel run exiting
[1]
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 3 with PID 27486 on
node Ehsan-com exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[Ehsan-com:27480] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[Ehsan-com:27480] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Killing PID 27479
 PyFoam WARNING on line 232 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/FoamThread.py : Process 27479 was already dead
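
A quick way to confirm what decomposePar actually wrote, before launching the parallel run, is to list the processor directories. This is only a sketch, assuming the case layout shown in the log above (case root /home/ehsan/Desktop/WR_3, four processor directories):
Code:

cd /home/ehsan/Desktop/WR_3
ls processor0        # should show constant/ plus a time directory such as 0/
ls processor0/0      # should show the decomposed fields (p, U, T, ...)

# if the time directories are empty or missing, clear the old decomposition and redo it
rm -rf processor*
decomposePar -force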


immortality August 15, 2013 02:01

It even happens with a serial run, but the p file is in the time folder along with the other fields, just as before. :confused:
Code:

/*---------------------------------------------------------------------------*\
| =========                |                                                |
| \\      /  F ield        | OpenFOAM: The Open Source CFD Toolbox          |
|  \\    /  O peration    | Version:  2.2.0                                |
|  \\  /    A nd          | Web:      www.OpenFOAM.org                      |
|    \\/    M anipulation  |                                                |
\*---------------------------------------------------------------------------*/
Build  : 2.2.0-b363e8d14789
Exec  : rhoCentralFoamGasCont
Date  : Aug 15 2013
Time  : 10:25:54
Host  : "Ehsan-com"
PID    : 2529
Case  : /home/ehsan/Desktop/WR_3
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0.0103469

Reading thermophysical properties

Selecting thermodynamics package
{
    type            hePsiThermo;
    mixture        pureMixture;
    transport      sutherland;
    thermo          janaf;
    equationOfState perfectGas;
    specie          specie;
    energy          sensibleEnthalpy;
}



--> FOAM FATAL IO ERROR:
cannot find file

file: /home/ehsan/Desktop/WR_3/0.0103469/p at line 0.

    From function regIOobject::readStream()
    in file db/regIOobject/regIOobjectRead.C at line 73.

FOAM exiting
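
When this happens in a serial run, one thing worth checking is whether a directory named exactly as in the error message exists on disk, since a mismatch between the printed time name and the directory name would lead to exactly this kind of error. A hypothetical check for the case above:
Code:

cd /home/ehsan/Desktop/WR_3
ls -d 0.0103*                  # actual time directory names on disk
ls 0.0103469 2>/dev/null || echo "no directory named exactly 0.0103469"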


immortality August 15, 2013 02:55

I deleted that time folder, moved on to a later one, and used:
Code:

decomposePar -time '0.01034593' -force
but an error like the one before occurred again.
Code:

[0] [1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] cannot find file
[1]
[1] file: /home/ehsan/Desktop/WR_3/processor1/0.0103459/p at line 0.
[1]
[1]    From function regIOobject::readStream()
[1]    in file db/regIOobject/regIOobjectRead.C at line 73.
[1]
FOAM parallel run exiting
[1]

[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0] --> FOAM FATAL IO ERROR:
[0] cannot find file
[0]
[0] file: /home/ehsan/Desktop/WR_3/processor0/0.0103459/p at line 0.
[0]
[0]    From function regIOobject::readStream()
[0]    in file db/regIOobject/regIOobjectRead.C at line 73.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 5374 on
node Ehsan-com exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[Ehsan-com:05372] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[Ehsan-com:05372] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Killing PID 5369
 PyFoam WARNING on line 232 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/FoamThread.py : Process 5369 was already dead

Why can't the solver read the p file from the time folder when it is right there?
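
A hedged check, run from the case root, of whether the value passed to decomposePar -time lines up with the time directories that exist on disk and with what decomposePar actually wrote into processor1:
Code:

ls -d 0.0103*                  # undecomposed time directories in the case root
ls processor1                  # time directories that decomposePar wrote
ls processor1/0.0103459 2>/dev/null || echo "processor1 has no 0.0103459 directory"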

immortality August 21, 2013 07:27

It happened again after a while in the new case I set up:
Code:

/*---------------------------------------------------------------------------*\
| =========                |                                                |
| \\      /  F ield        | OpenFOAM: The Open Source CFD Toolbox          |
|  \\    /  O peration    | Version:  2.2.0                                |
|  \\  /    A nd          | Web:      www.OpenFOAM.org                      |
|    \\/    M anipulation  |                                                |
\*---------------------------------------------------------------------------*/
Build  : 2.2.0-b363e8d14789
Exec  : decomposePar
Date  : Aug 21 2013
Time  : 15:50:11
Host  : "Ehsan-com"
PID    : 28022
Case  : /home/ehsan/Desktop/WR_4
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time



Decomposing mesh region0

Create mesh

Calculating distribution of cells
Selecting decompositionMethod simple

Finished decomposition in 0.01 s

Calculating original mesh data

Distributing cells to processors

Distributing faces to processors

Distributing points to processors

Constructing processor meshes

Processor 0
    Number of cells = 9450
    Number of faces shared with processor 1 = 54
    Number of processor patches = 1
    Number of processor faces = 54
    Number of boundary faces = 19304

Processor 1
    Number of cells = 9450
    Number of faces shared with processor 0 = 54
    Number of faces shared with processor 2 = 54
    Number of processor patches = 2
    Number of processor faces = 108
    Number of boundary faces = 19250

Processor 2
    Number of cells = 9450
    Number of faces shared with processor 1 = 54
    Number of faces shared with processor 3 = 54
    Number of processor patches = 2
    Number of processor faces = 108
    Number of boundary faces = 19250

Processor 3
    Number of cells = 9450
    Number of faces shared with processor 2 = 54
    Number of processor patches = 1
    Number of processor faces = 54
    Number of boundary faces = 19304

Number of processor faces = 162
Max number of cells = 9450 (0% above average 9450)
Max number of processor patches = 2 (33.333333333333% above average 1.5)
Max number of faces between processors = 108 (33.333333333333% above average 81)

Time = 0.0111974

Processor 0: field transfer
Processor 1: field transfer
Processor 2: field transfer
Processor 3: field transfer

End.

Apparently decomposePar works fine, but when I want to run the solver, this error is shown:
Code:

[1]    From function regIOobject::readStream()
[1]    in file db/regIOobject/regIOobjectRead.C at line 73.
[1]
FOAM parallel run exiting
[1]
[3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] cannot find file
[3]
[3] file: /home/ehsan/Desktop/WR_4/processor3/0/p at line 0.
[3]
[3]    From function regIOobject::readStream()
[3]    in file db/regIOobject/regIOobjectRead.C at line 73.
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 28183 on
node Ehsan-com exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[Ehsan-com:28181] 2 more processes have sent help message help-mpi-api.txt / mpi-abort
[Ehsan-com:28181] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Killing PID 28177
 PyFoam WARNING on line 232 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/FoamThread.py : Process 28177 was already dead
Getting LinuxMem: [Errno 2] No such file or directory: '/proc/28177/status'

What may be the reason for such decomposePar issues, and how can they be solved?
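
The decomposePar log above reports transferring time 0.0111974, while the solver then looks for processor3/0/p. A hedged check of where the run is told to start versus what the processor directories actually contain (run from /home/ehsan/Desktop/WR_4):
Code:

grep -E 'startFrom|startTime' system/controlDict   # where the run is told to start
ls processor0                                      # time directories written by decomposePar
ls processor0/0.0111974 2>/dev/null || echo "processor0 has no 0.0111974 directory"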

immortality August 21, 2013 08:16

It's resolved. :)
I don't know why the decomposePar command wanted to decompose the 0.0111974 time folder when the correct name of the last time folder was 0.01119737.
I used:
Code:

decomposePar -latestTime -force
and then decomposePar worked and copied the time folders into the processor folders.
But when I wanted to run, it again said that the field files weren't in folder 0.0111974.
Then I changed
Code:

timePrecision  7;
from 6 to 7 in controlDict, and it started working!
I'm not sure about the reason for all of this, because I hadn't changed anything in controlDict and everything was exactly as it was before the error!
Is it because of the number of digits in the time folder names?
I had time folders like 0.004665 (4 significant digits) and 0.00658536 (6 significant digits), and the last time folder had 7 significant digits (0.01119737).
So how should I set the time precision to be sure this won't happen in the future? Is a large number like 10 better?
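
For what it's worth, with timeFormat general the time directory name is written with timePrecision significant digits, which matches what happened here: 0.01119737 printed with 6 significant digits becomes 0.0111974. A minimal illustration using plain printf (not OpenFOAM itself, but the same general floating-point formatting), plus the controlDict entries involved:
Code:

printf '%.6g\n'  0.01119737   # 0.0111974  -> the name used while timePrecision was 6
printf '%.7g\n'  0.01119737   # 0.01119737 -> the name of the folder actually on disk
printf '%.10g\n' 0.01119737   # 0.01119737 -> a larger precision is harmless here

# the controlDict entries involved (the fix above raised timePrecision from 6 to 7)
grep -E 'timeFormat|timePrecision' system/controlDict

So choosing a timePrecision large enough to represent your write times exactly (7 here, or something larger like 10 if in doubt) should avoid this kind of name mismatch.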

