a laminar case requests k!

August 16, 2013, 12:10 | #1
Senior Member
Ehsan
Join Date: Oct 2012
Location: Iran
Posts: 2,208
Rep Power: 26
I have used:
Code:
simulationType laminar; //RASModel;
and got the following error in the parallel run:
Code:
[2] --> FOAM FATAL IO ERROR:
[0] --> FOAM FATAL IO ERROR:
[0] cannot find file
[0] file: /home/ehsan/Desktop/HeatExchanger/processor0/0/k at line 0.
[0] From function regIOobject::readStream()
[0] in file db/regIOobject/regIOobjectRead.C at line 73.
[0] FOAM parallel run exiting
[1] --> FOAM FATAL IO ERROR:
[1] cannot find file
[1] file: /home/ehsan/Desktop/HeatExchanger/processor1/0/k at line 0.
--> Upgrading k to employ run-time selectable wall functions
[1] From function regIOobject::readStream()
[1] in file db/regIOobject/regIOobjectRead.C at line 73.
[1] FOAM parallel run exiting
[3] --> FOAM FATAL IO ERROR:
[3] cannot find file
[3] file: /home/ehsan/Desktop/HeatExchanger/processor3/0/k at line 0.
[3] From function regIOobject::readStream()
[3] in file db/regIOobject/regIOobjectRead.C at line 73.
[3] FOAM parallel run exiting
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[2] cannot find file
[2] file: /home/ehsan/Desktop/HeatExchanger/processor2/0/k at line 0.
[2] From function regIOobject::readStream()
[2] in file db/regIOobject/regIOobjectRead.C at line 73.
[2] FOAM parallel run exiting
--------------------------------------------------------------------------
mpirun has exited due to process rank 3 with PID 20838 on node Ehsan-com
exiting without calling "finalize". This may have caused other processes
in the application to be terminated by signals sent by mpirun
(as reported here).
--------------------------------------------------------------------------
[Ehsan-com:20829] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[Ehsan-com:20829] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Killing PID 20828
PyFoam WARNING on line 232 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/FoamThread.py : Process 20828 was already dead
Getting LinuxMem: [Errno 2] No such file or directory: '/proc/20828/status'
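The log above shows every processor directory aborting for the same reason: the solver expects a `k` field that is not there. A quick way to see this kind of mismatch is to compare what the turbulence dictionaries request against the fields actually present in the case. The snippet below is only a mock-up sketch (the `demo/` case layout, file contents, and keywords are hypothetical, modeled on a standard OpenFOAM 2.x case, not Ehsan's exact files):

```shell
# Hypothetical mock-up of the mismatch that can trigger this error:
# turbulenceProperties says laminar, but a k-based RAS model is still
# named elsewhere, so the solver goes looking for a 0/k field.
mkdir -p demo/constant demo/0
printf 'simulationType  laminar;\n' > demo/constant/turbulenceProperties
printf 'RASModel        kEpsilon;\n' > demo/constant/RASProperties

# Check both dictionaries, and whether a k field exists at all:
grep simulationType demo/constant/turbulenceProperties
grep RASModel demo/constant/RASProperties
[ -e demo/0/k ] || echo "no 0/k field in this case"
```

If the two dictionaries disagree like this, either remove the k-based model or supply the missing field before decomposing and re-running in parallel.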
__________________
Injustice Anywhere is a Threat to Justice Everywhere. Martin Luther King. To Be or Not To Be, That's the Question! The Only Stupid Question Is the One that Goes Unasked.
August 16, 2013, 12:34 | #2
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,975
Blog Entries: 45
Rep Power: 128
Hi Ehsan,

Are you aware that turbulence settings in OpenFOAM are usually defined in two files, namely "constant/RASProperties" (or "constant/LESProperties") and "constant/turbulenceProperties"?

Best regards,
Bruno
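For reference, in OpenFOAM 2.x a purely laminar run is usually selected as shown below; these file contents follow the stock tutorials, so treat them as a sketch rather than the exact settings of Ehsan's case:
Code:
// constant/turbulenceProperties
simulationType  laminar;

// constant/RASProperties (only consulted when simulationType is RASModel)
RASModel        laminar;
turbulence      off;
printCoeffs     off;
With simulationType set to laminar in constant/turbulenceProperties, the solver should not try to read a k field at all.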
August 16, 2013, 12:50 | #3
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,975
Blog Entries: 45
Rep Power: 128
Ehsan sent me the case, and these are the problems:
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post
Test Case: Laminar Flow Between Rotating Concentric Cylinders | ebertmp | OpenFOAM | 4 | December 3, 2012 11:54 |
OpenFoam/FLUENT difference in cilinder case | RuiVO | OpenFOAM Running, Solving & CFD | 2 | December 12, 2011 14:26 |
SimpleFoam: Laminar vs. Turbulent Convergence | JasonG | OpenFOAM | 0 | June 2, 2011 08:29 |
Laminar field as initial state for turbulent two phase pipe flow | kjetil | OpenFOAM Running, Solving & CFD | 3 | July 21, 2009 09:15 |
Laminar or turbulent case ? | Bharath | FLUENT | 1 | December 6, 2002 03:55 |