HELP NEEDED with TURBFOAM


September 25, 2008, 07:49 — #61
Vishal Jambhekar (Member, University Stuttgart, Stuttgart, Germany)

Hi Foamers,

I am running sonicTurbFoam on the prism case under supersonic flow conditions, but the mesh provided with the prism example is graded exactly the wrong way. I have tried to refine it a lot, but I am not able to set a proper grading ratio; blockMesh fails with an error due to a mismatch among faces. Can anyone please tell me how to refine the grid near the prism to get a reduced y+?

Cheers, Vishal Jambhekar — "Simulate the way ahead......!!!"
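For reference, wall refinement in blockMesh is controlled by the expansion ratios in simpleGrading. A minimal sketch of the relevant part of blockMeshDict is below; the block numbering, cell counts, and ratios are illustrative, not taken from the actual prism tutorial. Note that blockMesh reports a face mismatch when two neighbouring blocks carry different cell counts on a shared face, so counts must be kept consistent across adjacent blocks when refining.

```cpp
// constant/polyMesh/blockMeshDict (sketch -- hypothetical block definition)
blocks
(
    // hex vertices, then cell counts (nx ny nz), then grading
    hex (0 1 2 3 4 5 6 7) (60 40 1)
    simpleGrading
    (
        0.1   // x: expansion ratio (last cell / first cell) < 1
              //    clusters cells toward the low-x end, i.e. the wall
        1     // y: uniform spacing
        1     // z: single cell for a 2-D case
    )
);
```

Inverting a ratio (e.g. 10 instead of 0.1) flips which end of the block the fine cells sit at, which is one way a tutorial mesh can end up graded "exactly the wrong way" for a modified geometry.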

November 10, 2008, 07:46 — #62
emilianyassenov (Guest)

Hello,

Creating field DpDt
Courant Number mean: 0 max: 25.4
Starting time loop
Courant Number mean: 0 max: 25.4
Time = 0.01
diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for Ux, Initial residual = 1, Final residual = 9.68187e-06, No Iterations 30
DILUPBiCG: Solving for Uy, Initial residual = 1, Final residual = 7.55278e-06, No Iterations 30
DILUPBiCG: Solving for Uz, Initial residual = 1, Final residual = 8.92605e-06, No Iterations 36
DILUPBiCG: Solving for h, Initial residual = 0.582762, Final residual = 8.10937e-06, No Iterations 32
DICPCG: Solving for p, Initial residual = 1, Final residual = 9.23182e-07, No Iterations 920
diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 1.16199e-08, global = -4.76424e-12, cumulative = -4.76424e-12
DILUPBiCG: Solving for h, Initial residual = 1, Final residual = 8.3393e-06, No Iterations 88
DICPCG: Solving for p, Initial residual = 0.636047, Final residual = 9.56417e-07, No Iterations 917
diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 4.08715e-08, global = -1.30096e-13, cumulative = -4.89434e-12
DILUPBiCG: Solving for omega, Initial residual = 0.228422, Final residual = 8.70597e-06, No Iterations 83
bounding omega, min: -1759.9 max: 1362.81 average: 697.513
DILUPBiCG: Solving for k, Initial residual = 1, Final residual = 7.36061e-06, No Iterations 48
ExecutionTime = 197.08 s ClockTime = 199 s

I have run the case, but it is very slow because the Courant number is very high. What can I do about it to make the time stepping faster? Thank you in advance, Emo
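The Courant number is Co = |U| Δt / Δx, so with a fixed time step the only levers are a smaller deltaT or a coarser mesh in the high-velocity region. A controlDict sketch is below; the values are illustrative, and whether your particular solver build honours adjustTimeStep/maxCo is an assumption to verify (only solvers that include the adjustable-time-step machinery read those entries — if yours does not, reduce deltaT directly).

```cpp
// system/controlDict (sketch -- illustrative values, not tuned for this case)
deltaT          1e-05;   // smaller fixed step lowers Co in direct proportion

adjustTimeStep  yes;     // only read by solvers built with time-step adaptation
maxCo           0.5;     // target maximum Courant number
maxDeltaT       1e-03;   // cap on the adapted step
```

With Co currently at 25.4 and deltaT = 0.01, holding Co near 0.5 would mean a step roughly 50 times smaller, so each time step gets cheaper to converge but there are many more of them; the pressure solves at 900+ iterations suggest the large step, not the linear solver, is the bottleneck.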

November 11, 2008, 06:05 — #63
emilianyassenov (Guest)

I have included a transport equation, but the Courant number is still high. Can someone help me? Thanks, Emo

November 11, 2008, 06:44 — #64
Pattyn Eric (Member)

Hi, the problem could come from the kOmegaSST turbulence model. It seems that you have bounding errors:

bounding omega, min: -1759.9 max: 1362.81 average: 697.513

Maybe you should choose better-adapted initial values for omega and/or k.
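For a freestream estimate, a common recipe is k = 1.5 (I |U|)² and ω = √k / (C_μ^{1/4} L), with C_μ = 0.09, turbulence intensity I and length scale L. For example, |U| = 100 m/s and I = 1% give k = 1.5 · (0.01 · 100)² = 1.5 m²/s², and with L = 0.01 m, ω ≈ 1.22 / (0.548 · 0.01) ≈ 224 s⁻¹. The numbers here are purely illustrative, not a recommendation for this case; they would go into the 0/ field files along these lines:

```cpp
// 0/omega (sketch -- the uniform value is an illustrative estimate only)
dimensions      [0 0 -1 0 0 0 0];   // 1/s
internalField   uniform 224;

// 0/k would analogously carry dimensions [0 2 -2 0 0 0 0] and uniform 1.5
```

Initial ω values that are far too small (or zero) are a frequent cause of kOmegaSST driving ω negative and triggering the bounding messages above.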

June 22, 2010, 09:58 — turboFoam problem — #65
asmi (New Member)

Hello, I am new to OpenFOAM and I have a problem with mpirun. When I type in the terminal:

mpirun -np 30 -machinefile machinefile turbFoam -parallel | tee 2>&1 log/turb.log

it runs, but after a while it stops with this message:

Create time
Create mesh for time = 4000
Reading field p
Reading field U
Reading/calculating face flux field phi
[23] keyword PISO is undefined in dictionary "/media/OpenFoam/Travaux/p-habitacle/foamProMesh-pisoFoam/turbtest/turbFoam/processor23/system/fvSolution"
[23] file: /media/OpenFoam/Travaux/p-habitacle/foamProMesh-pisoFoam/turbtest/turbFoam/processor23/system/fvSolution from line 3 to line 31.
[23] From function dictionary::subDict(const word& keyword) const
[23] in file db/dictionary/dictionary.C at line 271.
[23] FOAM parallel run exiting
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 23 in communicator MPI_COMM_WORLD with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
mpirun has exited due to process rank 23 with PID 30795 on node 10.0.0.7 exiting without calling "finalize". This may have caused other processes in the application to be terminated by signals sent by mpirun (as reported here).

But when I replace turbFoam with simpleFoam in the mpirun command, it runs correctly. Does anyone know what the problem is? Thanks
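The error itself points at the cause: turbFoam is a transient PISO solver and looks up a PISO subdictionary in system/fvSolution, while simpleFoam looks up SIMPLE — which is why the same case runs under simpleFoam. A sketch of the missing block is below; the values are typical defaults, not tuned for this case. Edit the master system/fvSolution and re-run decomposePar (or copy the file into each processor*/system/) so every rank, including processor23, sees the change.

```cpp
// system/fvSolution (sketch -- add alongside the existing "solvers" block)
PISO
{
    nCorrectors              2;   // pressure corrector loops per time step
    nNonOrthogonalCorrectors 0;   // extra loops for non-orthogonal meshes
    pRefCell                 0;   // pressure reference (closed domains)
    pRefValue                0;
}
```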



