overPimpleDyMFoam: runs in serial but stuck in parallel |
January 9, 2022, 09:51
overPimpleDyMFoam: runs in serial but stuck in parallel
|
#1 |
New Member
Stefano Negri
Join Date: Jan 2022
Location: Italy
Posts: 2
Rep Power: 0
Hi guys!
I am a new OpenFOAM user (using v2012 version) and I'm working on the flow around a moving train. I am working on a simplified case with overPimpleDyMFoam (overset mesh), to check that the case is set in the correct way. The mesh is made by merging (mergeMeshes) a background mesh with an overset mesh; the screenshot is attached. I tried to run the simulation in serial and it seemed to work, but when I decompose the domain and try to run in parallel the simulation starts but remains stuck at the first iteration, precisely when the pressure equations are solved (no error appears, the processors work but the simulation does not procede). Parallel case log: // * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * // Create time Create mesh for time = 0.7 Selecting dynamicFvMesh dynamicOversetFvMesh Selecting motion solver: multiSolidBodyMotionSolver Applying solid body motion to entire mesh Selecting solid-body motion function linearMotion Applying solid body motion linearMotion to 15411 points of cellZone oversetZone PIMPLE: no residual control data found. Calculations will employ 10 corrector loops Reading field p Reading field U Reading/calculating face flux field phi Creating cellMask field to block out hole cells --> FOAM Warning : From bool Foam::oversetPolyPatch::master() const in file oversetPolyPatch/oversetPolyPatch.C at line 151 The master overset patch is not the first patch. Generally the first patch should be an overset patch to guarantee consistent operation. Creating interpolatedCells field Selecting incompressible transport model Newtonian Selecting turbulence model type RAS Selecting RAS turbulence model kEpsilon RAS { RASModel kEpsilon; turbulence on; printCoeffs on; Cmu 0.09; C1 1.44; C2 1.92; C3 0; sigmak 1; sigmaEps 1.3; } Reading/calculating face velocity Uf No MRF models present No finite volume options present Courant Number mean: 0.000929265723051 max: 0.0706134337253 Starting time loop Courant Number mean: 0.000929265723051 max: 0.0706134337253 deltaT = 0.000123915737299 Time = 0.700124 inverseDistance : detected 2 mesh regions zone:0 nCells:24000 voxels:(22 22 22) bb:(42.9999919961 4.99999199609 3.74999199609) (48.0000080039 10.0000080039 7.50000800391) zone:1 nCells:11666 voxels:(22 22 22) bb:(22.5009964727 -3.00018015675 -4.50032015675) (39.5087767862 3.00042015675 4.50108015675) Overset analysis : nCells : 35666 calculated : 33048 interpolated : 2310 (interpolated from local:0 mixed local/remote:0 remote:2310) hole : 308 DICPCG: Solving for pcorr, Initial residual = 1, Final residual = 0.0143876348735, No Iterations 60 PIMPLE: iteration 1 smoothSolver: Solving for Ux, Initial residual = 0.00212050916343, Final residual = 9.48591688485e-06, No Iterations 4 smoothSolver: Solving for Uy, Initial residual = 0.00110282091759, Final residual = 5.20908926019e-06, No Iterations 4 smoothSolver: Solving for Uz, Initial residual = 0.00716174027334, Final residual = 3.28175448585e-05, No Iterations 4 I also attached the Serial case log and the system folder to let you have more information. I think the problem is in the parallel run, so I tried to change decomposition method (from Hierarchical to Scotch) without results; since the simulation seems to stop at pressure calculation, I also tried to change the pressure solver (and modify tolerances) in fvSolution from PBiCGStab to PCG or GAMG, but even in this case the parallel simulation still gets stuck at the same point. If someone have experienced this problem or know how to help, please let me know! 
Thanks a lot,
Stefano

Attached: MeshScreenshot.jpg, Attachments.zip
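[Editor's note] For readers retracing the same troubleshooting steps, below is a minimal sketch of a system/decomposeParDict switched to the scotch method mentioned above. The subdomain count and the hierarchical coefficients are illustrative assumptions, not values taken from the attached case:

FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

numberOfSubdomains  4;          // assumed processor count

method              scotch;     // no coefficients required for scotch

// Only needed if switching back to the hierarchical method tried above
hierarchicalCoeffs
{
    n               (2 2 1);    // example split; must multiply to numberOfSubdomains
    order           xyz;
}

// ************************************************************************* //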
|
January 15, 2022, 07:26
|
#2 |
New Member
Stefano Negri
Join Date: Jan 2022
Location: Italy
Posts: 2
Rep Power: 0
Hi again,
I found the reason for the problem with the parallel run: in the PIMPLE section of fvSolution I had "massFluxInterpolation" activated, and it turns out this setting caused the parallel simulation to hang, while the serial case ran without problems. I am still not sure why, but deactivating massFluxInterpolation seems to solve the problem. Hope this may help someone.

Stefano
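[Editor's note] For reference, here is a minimal sketch of how the PIMPLE sub-dictionary in system/fvSolution could look with that entry disabled. Apart from massFluxInterpolation and the 10 outer correctors visible in the parallel log above, the entries are typical overset-solver settings chosen for illustration, not values from Stefano's attached case:

PIMPLE
{
    momentumPredictor           yes;    // illustrative
    nOuterCorrectors            10;     // matches the "10 corrector loops" in the log
    nCorrectors                 2;      // illustrative
    nNonOrthogonalCorrectors    0;      // illustrative

    // Turning this off resolved the parallel hang reported in this thread
    massFluxInterpolation       off;

    ddtCorr                     yes;    // illustrative
    correctPhi                  yes;    // illustrative
    oversetAdjustPhi            no;     // illustrative
}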
|
December 26, 2022, 02:59
|
#3 |
New Member
Himanshu Banait
Join Date: Dec 2022
Posts: 2
Rep Power: 0
Thank You! It Worked.
Himanshu
|
Tags |
overpimpledymfoam, overset, parallel, serial |