CFD Online Forums > OpenFOAM Pre-Processing

empty decomposePar

August 15, 2013, 01:17 | #1
empty decomposePar
immortality (Ehsan), Senior Member, Joined: Oct 2012, Location: Iran, Posts: 2,186
decomposePar produces processor directories that contain only the constant folder, with no field files. Why is it behaving this way now?
Code:
[0] [2] --> FOAM FATAL IO ERROR:
[2]
[2] --> FOAM FATAL IO ERROR:
[2] cannot find file
[2]
[2] file: /home/ehsan/Desktop/WR_3/processor2/0/p at line 0.
[2]
[2]     From function regIOobject::readStream()
[2]     in file db/regIOobject/regIOobjectRead.C at line 73.
[2]
FOAM parallel run exiting
[2]

[0] cannot find file
[0]
[0] file: /home/ehsan/Desktop/WR_3/processor0/0/p at line 0.
[0]
[0] [3]     From function
[3]
[3] --> FOAM FATAL IO ERROR:
[3] cannot find file
[3]
[3] file: /home/ehsan/Desktop/WR_3/processor3/0/p at line 0.
[3]
[3]     From function regIOobject::readStream()
[3]     in file db/regIOobject/regIOobjectRead.C at line 73.
[3]
FOAM parallel run exiting
[3]
regIOobject::readStream()
[0]     in file db/regIOobject/regIOobjectRead.C at line 73.
[1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] cannot find file
[1]
[1] file: /home/ehsan/Desktop/WR_3/processor1/0/p at line 0.
[1]
[1]     From function regIOobject::readStream()
[1]     in file db/regIOobject/regIOobjectRead.C at line 73.
[1]
FOAM parallel run exiting
[1]
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 3 with PID 27486 on
node Ehsan-com exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[Ehsan-com:27480] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[Ehsan-com:27480] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Killing PID 27479
 PyFoam WARNING on line 232 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/FoamThread.py : Process 27479 was already dead
__________________
Injustice anywhere is a threat to justice everywhere. (Martin Luther King)
To be or not to be, that's the question!
The only stupid question is the one that goes unasked.

August 15, 2013, 02:01 | #2
immortality
It even occurs in a serial run, although the p file is in the time folder along with the other fields, just as before.
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.2.0-b363e8d14789
Exec   : rhoCentralFoamGasCont
Date   : Aug 15 2013
Time   : 10:25:54
Host   : "Ehsan-com"
PID    : 2529
Case   : /home/ehsan/Desktop/WR_3
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0.0103469

Reading thermophysical properties

Selecting thermodynamics package 
{
    type            hePsiThermo;
    mixture         pureMixture;
    transport       sutherland;
    thermo          janaf;
    equationOfState perfectGas;
    specie          specie;
    energy          sensibleEnthalpy;
}



--> FOAM FATAL IO ERROR: 
cannot find file

file: /home/ehsan/Desktop/WR_3/0.0103469/p at line 0.

    From function regIOobject::readStream()
    in file db/regIOobject/regIOobjectRead.C at line 73.

FOAM exiting

August 15, 2013, 02:55 | #3
immortality
I deleted that time folder, moved to a later one, and used it:
Code:
decomposePar -time '0.01034593' -force
But an error like the one before occurred again:
Code:
[0] [1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] cannot find file
[1]
[1] file: /home/ehsan/Desktop/WR_3/processor1/0.0103459/p at line 0.
[1]
[1]     From function regIOobject::readStream()
[1]     in file db/regIOobject/regIOobjectRead.C at line 73.
[1]
FOAM parallel run exiting
[1]

[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0] --> FOAM FATAL IO ERROR:
[0] cannot find file
[0]
[0] file: /home/ehsan/Desktop/WR_3/processor0/0.0103459/p at line 0.
[0]
[0]     From function regIOobject::readStream()
[0]     in file db/regIOobject/regIOobjectRead.C at line 73.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 5374 on
node Ehsan-com exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[Ehsan-com:05372] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[Ehsan-com:05372] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Killing PID 5369
 PyFoam WARNING on line 232 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/FoamThread.py : Process 5369 was already dead
Why can't the solver read the p file in the time folder when it is right there?

August 21, 2013, 07:27 | #4
immortality
It happened again after a while in the new case I set up:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.2.0-b363e8d14789
Exec   : decomposePar
Date   : Aug 21 2013
Time   : 15:50:11
Host   : "Ehsan-com"
PID    : 28022
Case   : /home/ehsan/Desktop/WR_4
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time



Decomposing mesh region0

Create mesh

Calculating distribution of cells
Selecting decompositionMethod simple

Finished decomposition in 0.01 s

Calculating original mesh data

Distributing cells to processors

Distributing faces to processors

Distributing points to processors

Constructing processor meshes

Processor 0
    Number of cells = 9450
    Number of faces shared with processor 1 = 54
    Number of processor patches = 1
    Number of processor faces = 54
    Number of boundary faces = 19304

Processor 1
    Number of cells = 9450
    Number of faces shared with processor 0 = 54
    Number of faces shared with processor 2 = 54
    Number of processor patches = 2
    Number of processor faces = 108
    Number of boundary faces = 19250

Processor 2
    Number of cells = 9450
    Number of faces shared with processor 1 = 54
    Number of faces shared with processor 3 = 54
    Number of processor patches = 2
    Number of processor faces = 108
    Number of boundary faces = 19250

Processor 3
    Number of cells = 9450
    Number of faces shared with processor 2 = 54
    Number of processor patches = 1
    Number of processor faces = 54
    Number of boundary faces = 19304

Number of processor faces = 162
Max number of cells = 9450 (0% above average 9450)
Max number of processor patches = 2 (33.333333333333% above average 1.5)
Max number of faces between processors = 108 (33.333333333333% above average 81)

Time = 0.0111974

Processor 0: field transfer
Processor 1: field transfer
Processor 2: field transfer
Processor 3: field transfer

End.
Apparently decomposePar is doing its job, but when I try to run the solver, this error is shown:
Code:
[1]     From function regIOobject::readStream()
[1]     in file db/regIOobject/regIOobjectRead.C at line 73.
[1]
FOAM parallel run exiting
[1]
[3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] cannot find file
[3]
[3] file: /home/ehsan/Desktop/WR_4/processor3/0/p at line 0.
[3]
[3]     From function regIOobject::readStream()
[3]     in file db/regIOobject/regIOobjectRead.C at line 73.
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 28183 on
node Ehsan-com exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[Ehsan-com:28181] 2 more processes have sent help message help-mpi-api.txt / mpi-abort
[Ehsan-com:28181] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Killing PID 28177
 PyFoam WARNING on line 232 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/FoamThread.py : Process 28177 was already dead 
Getting LinuxMem: [Errno 2] No such file or directory: '/proc/28177/status'
What may be the cause of such decomposePar issues, and how can they be solved?

August 21, 2013, 08:16 | #5
immortality
It's resolved.
I don't know why the decomposePar command wanted to decompose the 0.0111974 time folder while the correct name of the last time folder was 0.01119737.
I used:
Code:
decomposePar -latestTime -force
decomposePar then worked and copied the time folders into the processor directories.
But when I tried to run, it again said that the field files were not in folder 0.0111974.
Then I changed
Code:
timePrecision   7;
from 6 to 7 in controlDict and it started working!
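The fix above edits a single keyword in system/controlDict. As a sketch of the relevant fragment (only timePrecision was changed in this thread; writePrecision is shown for context and its value here is illustrative):

```
// system/controlDict (fragment)
writePrecision  7;   // significant digits used when writing field data
timePrecision   7;   // significant digits used to name time directories
```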
I'm not sure of the reason for all this, because I hadn't changed anything in controlDict; everything was exactly as it was before the error!
Is it because of the number of digits in the time folder names?
I had time folders like 0.004665 (4 significant digits) and 0.00658536 (6 significant digits), and the last time folder had 7 significant digits (0.01119737).
So how should I set the time precision to be sure this won't occur in the future? Is a large number like 10 better?
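The symptom is consistent with significant-digit formatting of the time value. Assuming the time directory name is formed from the current time using timePrecision significant digits in general format (which matches the behaviour described above), a small Python sketch reproduces the mismatch; time_dir_name is a hypothetical helper for illustration, not an OpenFOAM function:

```python
# Format a time value with a fixed number of significant digits, using
# Python's "general" float formatting. With 6 significant digits the time
# 0.01119737 becomes "0.0111974", so the solver searches for a directory
# name that does not match the folder written as "0.01119737".
def time_dir_name(t: float, precision: int) -> str:
    return f"{t:.{precision}g}"

print(time_dir_name(0.01119737, 6))  # -> 0.0111974   (name the solver looked for)
print(time_dir_name(0.01119737, 7))  # -> 0.01119737  (actual folder name)
```

So raising timePrecision to at least the number of significant digits appearing in the existing time folder names makes the generated name match the folder on disk again.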
