
April 18, 2013, 05:16 
Extremely slow simulation with interDyMFoam

#1 
Member
Anon
Join Date: Oct 2012
Posts: 33
Rep Power: 6 
I am running a simulation with interDyMFoam (air/water, AMI with rotating and stationary mesh) which runs extremely slowly, and I am looking for ways to improve the performance. The main reason seems to be the very small time steps (~1e-8 s) needed to keep the Courant number < 1, but I think it should also be possible to speed up the calculation of each time step. I am looking for comments on the residual targets/tolerances set in fvSolution, and maybe on my fvSchemes file as well. I recently increased the tolerance of the pressure terms from 1e-8 to 1e-6 to speed up the simulation; could this be increased further? I am running decomposed on 128 cores, by the way. Any tips on how to improve the performance will be greatly appreciated. Regards, Jone Some output: Code:
Interface Courant Number mean: 6.455324963e-06 max: 0.5005681575
Courant Number mean: 0.0002116964997 max: 0.5882487797
deltaT = 5.569042488e-08
Time = 0.006919137503

solidBodyMotionFunctions::rotatingMotion::transformation(): Time = 0.006919137503 transformation: ((0 0 0) (0.9738608726 (0 0.2271453297 0)))
AMI: Creating addressing and weights between 4453 source faces and 4453 target faces
AMI: Patch source weights min/max/average = 0.9985002144, 1.001921397, 1.000047858
AMI: Patch target weights min/max/average = 0.9990857617, 1.001532226, 1.000018088
Execution time for mesh.update() = 0.07 s
MULES: Solving for alpha1
Phase-1 volume fraction = 0.00295762538  Min(alpha1) = 1.735479458e-18  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 0.002957636622  Min(alpha1) = 1.731270876e-18  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 0.002957647861  Min(alpha1) = 1.732182027e-18  Max(alpha1) = 1
GAMG:  Solving for p_rgh, Initial residual = 0.02411778836, Final residual = 8.942364364e-07, No Iterations 40
time step continuity errors : sum local = 2.893569178e-13, global = 4.456094558e-14, cumulative = 9.7491866e-11
GAMG:  Solving for p_rgh, Initial residual = 0.01070900624, Final residual = 8.263136668e-07, No Iterations 17
time step continuity errors : sum local = 2.632099118e-13, global = 7.978696644e-15, cumulative = 9.74838873e-11
ExecutionTime = 61236.58 s  ClockTime = 61388 s

Interface Courant Number mean: 6.448986145e-06 max: 0.5055292675
Courant Number mean: 0.0002114044481 max: 0.5878898938
deltaT = 5.504594754e-08
Time = 0.006919192549

solidBodyMotionFunctions::rotatingMotion::transformation(): Time = 0.006919192549 transformation: ((0 0 0) (0.9738604585 (0 0.2271471051 0)))
AMI: Creating addressing and weights between 4453 source faces and 4453 target faces
AMI: Patch source weights min/max/average = 0.9985011005, 1.001921476, 1.00004786
AMI: Patch target weights min/max/average = 0.9990854226, 1.001532248, 1.000018089
Execution time for mesh.update() = 0.07 s
MULES: Solving for alpha1
Phase-1 volume fraction = 0.002957658974  Min(alpha1) = 1.733118704e-18  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 0.002957670084  Min(alpha1) = 1.733538446e-18  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 0.002957681198  Min(alpha1) = 1.734437289e-18  Max(alpha1) = 1
GAMG:  Solving for p_rgh, Initial residual = 0.02904518995, Final residual = 8.139273874e-07, No Iterations 38
time step continuity errors : sum local = 2.577957076e-13, global = 5.558341087e-14, cumulative = 9.742830389e-11
GAMG:  Solving for p_rgh, Initial residual = 0.01185015756, Final residual = 8.340824853e-07, No Iterations 24
time step continuity errors : sum local = 2.717511422e-13, global = 4.323848088e-14, cumulative = 9.747154237e-11
ExecutionTime = 61237.22 s  ClockTime = 61389 s
Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      fvSolution;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

solvers
{
    pcorr
    {
        solver          GAMG;
        tolerance       1e-06;   // 1e-08
        relTol          0;
        smoother        DIC;
        nPreSweeps      0;
        nPostSweeps     2;
        nFinestSweeps   2;
        cacheAgglomeration true;
        nCellsInCoarsestLevel 10;
        agglomerator    faceAreaPair;
        mergeLevels     1;
    }

    p_rgh
    {
        solver          GAMG;
        tolerance       1e-06;   // 1e-08
        relTol          0;
        smoother        DIC;
        nPreSweeps      0;
        nPostSweeps     2;
        nFinestSweeps   2;
        cacheAgglomeration true;
        nCellsInCoarsestLevel 10;
        agglomerator    faceAreaPair;
        mergeLevels     1;
    }

    p_rghFinal
    {
        solver          GAMG;
        tolerance       1e-06;   // 1e-08
        relTol          0;
        smoother        DIC;
        nPreSweeps      0;
        nPostSweeps     2;
        nFinestSweeps   2;
        cacheAgglomeration true;
        nCellsInCoarsestLevel 10;
        agglomerator    faceAreaPair;
        mergeLevels     1;
    }

    U
    {
        solver          smoothSolver;
        smoother        GaussSeidel;
        tolerance       1e-06;
        relTol          0;
        nSweeps         1;
    }

    UFinal
    {
        solver          smoothSolver;
        smoother        GaussSeidel;
        tolerance       1e-06;
        relTol          0;
        nSweeps         1;
    }
}

PIMPLE
{
    momentumPredictor no;   // yes
    nCorrectors     2;
    nNonOrthogonalCorrectors 0;
    nAlphaCorr      1;
    nAlphaSubCycles 3;
    cAlpha          1.5;
    correctPhi      no;
    /*
    pRefPoint       (0.0013 0.0017 0.0017);
    pRefValue       1e5;
    */
}

relaxationFactors
{
    fields
    {
    }

    equations
    {
        "U.*"           1;
    }
}

// ************************************************************************* //
Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      fvSchemes;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

ddtSchemes
{
    default         Euler;
}

gradSchemes
{
    default         Gauss linear;
}

divSchemes
{
    div(rho*phi,U)   Gauss limitedLinearV 1;
    div(phi,alpha)   Gauss vanLeer01;
    div(phirb,alpha) Gauss interfaceCompression;
    // Following added because of crash on Vilje
    div((muEff*dev(T(grad(U))))) Gauss linear;
    div((nuEff*dev(T(grad(U))))) Gauss linear;
}

laplacianSchemes
{
    default         Gauss linear limited 1.0;
}

interpolationSchemes
{
    default         linear;
}

snGradSchemes
{
    default         limited 1.0;
}

fluxRequired
{
    default         no;
    p_rgh;
    pcorr;
    alpha;
}

// ************************************************************************* //
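A quick way to sanity-check the time-step behaviour in the log above: solvers in the interFoam family with adjustTimeStep rescale deltaT towards the Courant target each step, with the growth rate clamped. The sketch below uses the deltaT and max Co values from the log; the clamping factor of 1.2 is my assumption for illustration, not read from this case's controlDict:

```python
# Sketch of Courant-based time-step adjustment, using numbers from the log above.
# The growth clamp (1.2x per step) is an assumed illustrative value.

def next_delta_t(delta_t, max_co_seen, max_co_target=1.0, max_growth=1.2):
    """Scale deltaT so the predicted max Courant number hits the target,
    but never grow deltaT by more than max_growth per step."""
    scale = max_co_target / max_co_seen
    return delta_t * min(scale, max_growth)

dt = 5.569042488e-08       # deltaT from the log
co_max = 0.5882487797      # max Courant number from the log
print(next_delta_t(dt, co_max))
```

Because max Co is already close to the target, the step can only creep upwards; the real solver additionally re-evaluates Co after the mesh motion, which is why deltaT in the log even shrinks slightly.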

April 22, 2013, 11:56 

#2 
Senior Member
Vieri Abolaffio
Join Date: Jul 2010
Location: Always on the move.
Posts: 308
Rep Power: 10 
Increasing the tolerance is definitely not what I would recommend: you are trading a small speedup for accuracy. Definitely a no-no.
Any chance you can use a coarser mesh? The Courant number is set by the cell size and the flow velocity, so increasing the cell size lowers the Courant number and thus allows a larger time step. Also, check your mesh; maybe it is just a cell or two that are very small and are slowing down the whole simulation. I don't know if this strategy is doable for your case, though. Another thing you might want to try is to initialise the case with the mesh held steady, maybe even using LTSInterFoam, even without turbulent quantities, just so you do not start your simulation with large local velocities caused by the mesh motion (I'm just guessing: something like a propeller rotating at 300 rpm with an inflow velocity of 10 m/s gives a big relative speed between the flow and the mesh motion, because the flow hasn't picked it up yet...)
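Vieri's point about cell size can be made concrete with the Courant relation Co = U·Δt/Δx, i.e. Δt_max = Co_max·Δx/U: the smallest cell that the flow crosses sets the global time step. The velocity and cell sizes below are made-up illustrative numbers, not taken from this case:

```python
# Courant condition: Co = U * dt / dx  =>  dt_max = Co_max * dx / U.
# Illustrative numbers only (not from this case).

def dt_max(dx, u, co_max=1.0):
    """Largest time step keeping the cell Courant number below co_max."""
    return co_max * dx / u

u = 10.0                   # assumed local flow speed, m/s
print(dt_max(1e-4, u))     # typical cell (0.1 mm)
print(dt_max(1e-6, u))     # a single rogue tiny cell: 100x smaller dt
```

This is why one or two degenerate cells can throttle the whole run: the global deltaT follows the worst cell, not the average one.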

April 22, 2013, 12:45 

#3 
Senior Member
Niels Gjoel Jacobsen
Join Date: Mar 2009
Location: Deltares, Delft, The Netherlands
Posts: 1,749
Rep Power: 29 
Hi Jone,
From my experience with moving meshes and VOF methods, you really have to consider the following: Code:
momentumPredictor yes;   // keyword is momentumPredictor, value yes/no
nCorrectors     3;       // at least; I sometimes use as many as 5
nNonOrthogonalCorrectors <larger>?;
In addition, I have read some threads about poor performance of the AMI on a large number of processors. You might reconsider the need for 128 processors, now that one iteration only takes 0.64 s. Kind regards, Niels
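Applied to the fvSolution posted above, Niels' suggestion would look roughly like this; a sketch only, and the nNonOrthogonalCorrectors value is a placeholder to be tuned for the actual mesh non-orthogonality: Code:
PIMPLE
{
    momentumPredictor yes;   // was "no" in the posted fvSolution
    nCorrectors       3;     // at least; up to 5 for moving meshes
    nNonOrthogonalCorrectors 1;  // placeholder; raise for skewed/non-orthogonal meshes
    // remaining entries (nAlphaCorr, nAlphaSubCycles, cAlpha, ...) as before
}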

April 22, 2013, 14:21 

#4  
Senior Member
Vieri Abolaffio
Join Date: Jul 2010
Location: Always on the move.
Posts: 308
Rep Power: 10 
Quote:


April 22, 2013, 16:55 

#5 
Member
Anon
Join Date: Oct 2012
Posts: 33
Rep Power: 6 
Vieri and Niels, thank you very much for your answers! I will look into them tomorrow morning.
My mesh is around 1 million cells, so not that big. In CFX it takes less than 1/10 of the time with 4 million cells, so I really don't get it. I agree that the mesh might be the reason; I actually think there are a couple of very small cells limiting the time step. Maybe I can find a way to limit the minimum cell size in Ansys Meshing to compare the speed. Initialisation is a very good idea that I will look into. Have a good evening! Regards, Jone

April 22, 2013, 17:02 

#6 
Senior Member
Vieri Abolaffio
Join Date: Jul 2010
Location: Always on the move.
Posts: 308
Rep Power: 10 
128 cores for a 1 million cell mesh is really too much. I think you are getting too much MPI communication overhead.
Try a more reasonable number, say 20-40 cores at most.
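A rough sanity check of the decomposition, assuming the often-quoted rule of thumb of roughly 50k cells per core (a heuristic, not a hard rule, and hardware dependent):

```python
# Rule-of-thumb decomposition check: around 50,000 cells per core tends to
# keep MPI communication overhead reasonable for typical OpenFOAM cases.

def suggested_cores(n_cells, cells_per_core=50_000):
    """Suggested core count under the cells-per-core heuristic."""
    return max(1, n_cells // cells_per_core)

n_cells = 1_000_000
print(n_cells // 128)            # current setup: ~7800 cells/core, far too few
print(suggested_cores(n_cells))  # heuristic suggestion: 20 cores
```

At ~7800 cells per core, each processor spends a large fraction of every iteration exchanging halo data rather than computing, which matches Vieri's diagnosis.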

April 23, 2013, 04:50 

#7 
Member
Anon
Join Date: Oct 2012
Posts: 33
Rep Power: 6 
First of all, I also tried with a coarser mesh. It was OK in checkMesh, but it still crashed with the following error:
Code:
#0  Foam::error::printStack(Foam::Ostream&) in "/opt/openfoam211/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
#1  Foam::sigFpe::sigHandler(int) in "/opt/openfoam211/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
#2  in "/lib/x86_64-linux-gnu/libc.so.6"
#3  void Foam::MULES::limiter<Foam::geometricOneField, Foam::zeroField, Foam::zeroField>(Foam::Field<double>&, Foam::geometricOneField const&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::zeroField const&, Foam::zeroField const&, double, double, int) in "/opt/openfoam211/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
#4  void Foam::MULES::limit<Foam::geometricOneField, Foam::zeroField, Foam::zeroField>(Foam::geometricOneField const&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh>&, Foam::zeroField const&, Foam::zeroField const&, double, double, int, bool) in "/opt/openfoam211/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
#5  Foam::MULES::explicitSolve(Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh>&, double, double) in "/opt/openfoam211/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
#6  in "/opt/openfoam211/platforms/linux64GccDPOpt/bin/interDyMFoam"
#7  __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
#8  in "/opt/openfoam211/platforms/linux64GccDPOpt/bin/interDyMFoam"
Code:
Interface Courant Number mean: 1.597370001e-13 max: 3.367889415e-08
Courant Number mean: 8.492343262e-05 max: 47.93150517
deltaT = 1.542054914e-14
Time = 5.124667456e-05

solidBodyMotionFunctions::rotatingMotion::transformation(): Time = 5.124667456e-05 transformation: ((0 0 0) (0.9999985598 (0 0.001697167211 0)))
AMI: Creating addressing and weights between 1860 source faces and 1860 target faces
AMI: Patch source weights min/max/average = 0.9999999855, 1.000273407, 1.00002072
AMI: Patch target weights min/max/average = 0.9999986967, 1.000284245, 1.000020412
Execution time for mesh.update() = 0.25 s
--> FOAM Warning :
    From function Time::operator++()
    in file db/Time/Time.C at line 1010
    Increased the timePrecision from 10 to 11 to distinguish between timeNames at time 5.124667455e-05
MULES: Solving for alpha1
Phase-1 volume fraction = 2.340101492e-05  Min(alpha1) = 2.94026316e-21  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 2.340101492e-05  Min(alpha1) = 2.940263151e-21  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 2.340101493e-05  Min(alpha1) = 2.940263144e-21  Max(alpha1) = 1
smoothSolver:  Solving for Ux, Initial residual = 0.03289536739, Final residual = 5.931840904e-09, No Iterations 22
smoothSolver:  Solving for Uy, Initial residual = 0.02652586093, Final residual = 6.490399401e-09, No Iterations 21
smoothSolver:  Solving for Uz, Initial residual = 0.06385161287, Final residual = 9.329710362e-09, No Iterations 22
GAMG:  Solving for p_rgh, Initial residual = 0.8168173646, Final residual = 5.329552555e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5691559964, Final residual = 4.333370438e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6077577617, Final residual = 5.096296929e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5588035124, Final residual = 4.970930311e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6553288826, Final residual = 6.082410996e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6012161847, Final residual = 5.743818456e-09, No Iterations 20
time step continuity errors : sum local = 2.767907786e-13, global = 6.720125451e-14, cumulative = 4.563931424e-06
GAMG:  Solving for p_rgh, Initial residual = 0.7547558975, Final residual = 8.26595506e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.4767521403, Final residual = 4.33189641e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.500681991, Final residual = 5.307664532e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.4631253002, Final residual = 4.796368045e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5959068421, Final residual = 6.221265876e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5300655458, Final residual = 5.512629022e-09, No Iterations 20
time step continuity errors : sum local = 6.317765168e-13, global = 1.567796998e-13, cumulative = 4.563931267e-06
GAMG:  Solving for p_rgh, Initial residual = 0.6942039022, Final residual = 7.554443004e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5165126574, Final residual = 5.254193207e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6475607497, Final residual = 6.787541225e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5574720976, Final residual = 5.810655172e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.7423454876, Final residual = 7.734423209e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6165483516, Final residual = 6.41331222e-09, No Iterations 20
time step continuity errors : sum local = 1.508767972e-12, global = 3.749318923e-13, cumulative = 4.563930892e-06
GAMG:  Solving for p_rgh, Initial residual = 0.8276795512, Final residual = 8.954560476e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.558609031, Final residual = 5.530478839e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6826850609, Final residual = 7.158367366e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5769959496, Final residual = 5.989697425e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.7749635251, Final residual = 8.06470252e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6336308451, Final residual = 6.581617802e-09, No Iterations 20
time step continuity errors : sum local = 3.559817726e-12, global = 8.850757951e-13, cumulative = 4.563930007e-06
GAMG:  Solving for p_rgh, Initial residual = 0.8547424795, Final residual = 9.199343842e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5841017513, Final residual = 5.844350229e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.7306124828, Final residual = 7.63158262e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6069032171, Final residual = 6.294821239e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.8194731067, Final residual = 8.514487236e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6594746017, Final residual = 6.843822577e-09, No Iterations 20
time step continuity errors : sum local = 8.411599725e-12, global = 2.092559032e-12, cumulative = 4.563927914e-06
ExecutionTime = 501.96 s  ClockTime = 502 s

Interface Courant Number mean: 1.770494893e-13 max: 4.980492024e-08
Courant Number mean: 0.0002330663221 max: 139.7058538
deltaT = 1.103786901e-15
Time = 5.1246674565e-05

solidBodyMotionFunctions::rotatingMotion::transformation(): Time = 5.124667456e-05 transformation: ((0 0 0) (0.9999985598 (0 0.001697167211 0)))
AMI: Creating addressing and weights between 1860 source faces and 1860 target faces
AMI: Patch source weights min/max/average = 0.9999999855, 1.000273407, 1.00002072
AMI: Patch target weights min/max/average = 0.9999986967, 1.000284245, 1.000020412
Execution time for mesh.update() = 0.23 s
--> FOAM Warning :
    From function Time::operator++()
    in file db/Time/Time.C at line 1010
    Increased the timePrecision from 11 to 12 to distinguish between timeNames at time 5.124667456e-05
MULES: Solving for alpha1
Phase-1 volume fraction = 2.340101493e-05  Min(alpha1) = 2.940263142e-21  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 2.340101493e-05  Min(alpha1) = 2.94026314e-21  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 2.340101493e-05  Min(alpha1) = 2.940263136e-21  Max(alpha1) = 1
smoothSolver:  Solving for Ux, Initial residual = 0.06477118372, Final residual = 8.642332518e-09, No Iterations 26
smoothSolver:  Solving for Uy, Initial residual = 0.02933841898, Final residual = 6.713449257e-09, No Iterations 25
smoothSolver:  Solving for Uz, Initial residual = 0.1070822693, Final residual = 8.780816192e-09, No Iterations 27
GAMG:  Solving for p_rgh, Initial residual = 0.8893907442, Final residual = 6.03093516e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.2999964081, Final residual = 4.400563944e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.2001912473, Final residual = 4.323988997e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.1991798232, Final residual = 4.856585103e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.2166203943, Final residual = 5.449721145e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.2325097647, Final residual = 5.987364552e-09, No Iterations 19
time step continuity errors : sum local = 2.31991253e-13, global = 4.653817902e-14, cumulative = 4.563927868e-06
GAMG:  Solving for p_rgh, Initial residual = 0.2530189377, Final residual = 6.73790163e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.2376544809, Final residual = 6.337231792e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.2508918284, Final residual = 7.096666627e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.2671706917, Final residual = 7.384764598e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.3142132031, Final residual = 8.694850032e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.3269917566, Final residual = 8.999909757e-09, No Iterations 19
time step continuity errors : sum local = 4.947432806e-13, global = 1.061610488e-13, cumulative = 4.563927762e-06
GAMG:  Solving for p_rgh, Initial residual = 0.3896779346, Final residual = 4.01177967e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.3602979394, Final residual = 9.679480202e-09, No Iterations 19
GAMG:  Solving for p_rgh, Initial residual = 0.4086001377, Final residual = 4.23695469e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.4096242098, Final residual = 4.220163954e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.4891555194, Final residual = 5.018675493e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.4798885981, Final residual = 4.903916905e-09, No Iterations 20
time step continuity errors : sum local = 4.304573434e-13, global = 9.361216275e-14, cumulative = 4.563927668e-06
GAMG:  Solving for p_rgh, Initial residual = 0.5731626992, Final residual = 5.785088066e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.4692696901, Final residual = 4.65822784e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5222446816, Final residual = 5.323751511e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5051875075, Final residual = 5.107623445e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6048712058, Final residual = 6.10837163e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5734690483, Final residual = 5.770318327e-09, No Iterations 20
time step continuity errors : sum local = 9.866099684e-13, global = 2.125564627e-13, cumulative = 4.563927455e-06
GAMG:  Solving for p_rgh, Initial residual = 0.6854054465, Final residual = 6.829505031e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.537832268, Final residual = 5.286631514e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5994082542, Final residual = 6.028535219e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.5680692402, Final residual = 5.673560985e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6801833379, Final residual = 6.789791194e-09, No Iterations 20
GAMG:  Solving for p_rgh, Initial residual = 0.6322346784, Final residual = 6.292624607e-09, No Iterations 20
time step continuity errors : sum local = 2.267217457e-12, global = 4.844195478e-13, cumulative = 4.563926971e-06
ExecutionTime = 540.1 s  ClockTime = 540 s

Interface Courant Number mean: 1.178773889e-13 max: 3.493308291e-08
Courant Number mean: 8.800536524e-05 max: 51.12002034
deltaT = 2.159206695e-16
Time = 5.12466745648e-05

solidBodyMotionFunctions::rotatingMotion::transformation(): Time = 5.124667456e-05 transformation: ((0 0 0) (0.9999985598 (0 0.001697167211 0)))
AMI: Creating addressing and weights between 1860 source faces and 1860 target faces
AMI: Patch source weights min/max/average = 0.9999999855, 1.000273407, 1.00002072
AMI: Patch target weights min/max/average = 0.9999986967, 1.000284245, 1.000020412
Execution time for mesh.update() = 0.25 s
About the number of cores you are right, it is too many. But anyway, it has been running far faster with 128 cores than with 8. I heard before that there should be around 50k cells per core, so 20 could be a good number of cores. I am trying to initialize with potentialFoam by adding these lines to fvSolution: Code:
potentialFlow
{
    nNonOrthogonalCorrectors 10;
}

April 23, 2013, 07:55 

#8 
Senior Member
Vieri Abolaffio
Join Date: Jul 2010
Location: Always on the move.
Posts: 308
Rep Power: 10 
Your simulation is beyond recovery well before the crash. With the automatic time stepping you are already down to deltaT = 1e-14 s, with a max Courant number around 45-48.
I don't know what you changed, or if it is the mesh, but something went horribly wrong earlier. I'd suggest disabling the automatic time step, at least for the tests; that way, if there is an issue, the simulation crashes instantly instead of dragging on and wasting time. Consider it euthanasia. To initialise the solution, change the time scheme in your fvSchemes and just run LTSInterFoam instead of interDyMFoam; it will tell you if something is amiss. If you want to be really cool, add an MRF zone for your rotating region; then you can bring the developed field into your simulation.
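For the LTSInterFoam initialisation Vieri describes, local time stepping is selected via the ddt scheme; a sketch, with the keywords as used in the stock OpenFOAM 2.1.x LTSInterFoam tutorials (worth double-checking against your installed version): Code:
ddtSchemes
{
    default         localEuler rDeltaT;   // local time stepping for LTSInterFoam
}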

April 23, 2013, 08:01 

#9  
Member
Anon
Join Date: Oct 2012
Posts: 33
Rep Power: 6 
Quote:
So should I run the full simulation with LTSInterFoam, or just the first time step(s)? And how do I apply the result so that it initialises interDyMFoam? As for this particular simulation that crashed, it must be something wrong with the mesh. I made a coarser mesh just for testing, but I guess it resolves the flow very poorly.

April 23, 2013, 10:14 

#10 
Senior Member
Vieri Abolaffio
Join Date: Jul 2010
Location: Always on the move.
Posts: 308
Rep Power: 10 
It is very problem dependent, but I'd say run not necessarily until convergence, but not far from it either; 1000-3000 iterations, maybe?
Use mapFields to move the data, or just edit the BCs manually if needed.
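A typical mapFields invocation for this would look something like the sketch below; the case path is illustrative, and -consistent assumes the source and target cases share the same geometry and boundary conditions (otherwise a mapFieldsDict is needed): Code:
# Map the converged LTSInterFoam fields onto the interDyMFoam case
# (run from the target case directory; the source path is illustrative)
mapFields ../ltsInitCase -sourceTime latestTime -consistent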
