EpsilonWallFunction crashes with two connected walls

October 26, 2021, 05:46   #1
New Member | Join Date: May 2020 | Posts: 23 | Rep Power: 5

Dear all,

I am implementing my own epsilonWallFunction for a k-epsilon simulation. Inside the wall function I would like to compute both the strain-rate and vorticity invariants. Here is the structure of the code:

Code:
```
void Foam::MyepsilonWallFunctionFvPatchScalarField::calculate
(
    const turbulenceModel& turbModel,
    const List<scalarField>& cornerWeights,
    const fvPatch& patch,
    scalarField& G0,
    scalarField& epsilon0
)
{
    const volVectorField& Ut = turbModel.U();

    /* Strain rate invariant */
    volSymmTensorField S = symm(fvc::grad(Ut));

    /* Vorticity invariant */
    volTensorField W = skew(fvc::grad(Ut));

    ...
}
```

Consider a rectangular wind tunnel: the wall function works well for a single wall (say, the ground) and for two separate, unconnected walls (e.g. the ground and the top wall), but it crashes for two connected walls (e.g. the ground and a side wall). The errors are:

Code:
```
Starting time loop

Time = 1

smoothSolver: Solving for Ux, Initial residual = 1, Final residual = 0.000410734772, No Iterations 5
smoothSolver: Solving for Uy, Initial residual = 0.999853259, Final residual = 0.000909577149, No Iterations 4
smoothSolver: Solving for Uz, Initial residual = 0.999796967, Final residual = 0.000793379745, No Iterations 4
GAMG: Solving for p, Initial residual = 1, Final residual = 0.000988755628, No Iterations 23
time step continuity errors : sum local = 4.17199017e-05, global = 1.25994915e-05, cumulative = 1.25994915e-05
[c06c4f9bf5aa:13379] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[c06c4f9bf5aa:13379] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
[c06c4f9bf5aa:13385] *** An error occurred in MPI_Waitall
[c06c4f9bf5aa:13385] *** reported by process [1137901569,0]
[c06c4f9bf5aa:13385] *** on communicator MPI COMMUNICATOR 3 SPLIT FROM 0
[c06c4f9bf5aa:13385] *** MPI_ERR_TRUNCATE: message truncated
[c06c4f9bf5aa:13385] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[c06c4f9bf5aa:13385] *** and potentially your MPI job)
```

NB:
• After some tests, the error occurs at the gradient computation line;
• It runs in serial;
• It runs on 5 processors but not on 15.

I would truly appreciate any help or comments! Thanks in advance!
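One detail worth noting about the structure above: fvc::grad(Ut) evaluates the gradient over the whole mesh, including processor boundaries, so in a parallel run every rank has to execute it the same number of times with matching exchanges. Since calculate() is invoked once per wall-function patch, ranks that own a different number of wall patches after decomposition (more likely when two connected walls are split across processors) can issue mismatched sends and receives, which would be consistent with the MPI_ERR_TRUNCATE in MPI_Waitall. Below is a minimal, untested sketch of one way to restructure this: compute grad(U) at most once per time step, register it on the mesh database, and let every patch reuse it. The field name "MyEpsilonWF_gradU", the registry caching, and the static time-index guard are illustrative assumptions, not part of the original code.

Code:
```
void Foam::MyepsilonWallFunctionFvPatchScalarField::calculate
(
    const turbulenceModel& turbModel,
    const List<scalarField>& cornerWeights,
    const fvPatch& patch,
    scalarField& G0,
    scalarField& epsilon0
)
{
    const volVectorField& Ut = turbModel.U();
    const fvMesh& mesh = patch.boundaryMesh().mesh();

    // Hypothetical name for the cached gradient field (any unique name works)
    const word gradUName("MyEpsilonWF_gradU");

    // Shared between all instances/patches: time index of the last refresh
    static label lastUpdateIndex = -1;

    if (!mesh.foundObject<volTensorField>(gradUName))
    {
        // First call: create the cached field and hand ownership to the
        // object registry so it is cleaned up with the mesh
        volTensorField* gradUPtr = new volTensorField
        (
            IOobject
            (
                gradUName,
                mesh.time().timeName(),
                mesh,
                IOobject::NO_READ,
                IOobject::NO_WRITE
            ),
            fvc::grad(Ut)
        );
        gradUPtr->store();

        lastUpdateIndex = mesh.time().timeIndex();
    }

    // lookupObject returns a const reference; cast it away so the field
    // can be refreshed once per time step below
    volTensorField& gradU = const_cast<volTensorField&>
    (
        mesh.lookupObject<volTensorField>(gradUName)
    );

    // Refresh at most once per time step; any further wall patches handled
    // on this rank reuse the cached field instead of calling fvc::grad again
    if (lastUpdateIndex != mesh.time().timeIndex())
    {
        gradU = fvc::grad(Ut);
        lastUpdateIndex = mesh.time().timeIndex();
    }

    /* Strain rate invariant */
    volSymmTensorField S = symm(gradU);

    /* Vorticity invariant */
    volTensorField W = skew(gradU);

    // ... remainder of the wall-function update ...
}
```

Whether this actually removes the crash depends on where else the wall function (or its base class) communicates, so it is only a sketch to experiment with. Printing patch.name() and Pstream::myProcNo() inside calculate() for the 5- and 15-processor decompositions would help confirm whether the ranks really make different numbers of gradient calls.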

Tags: openfoam 7, wall function
