CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   error on linearUpwind (https://www.cfd-online.com/Forums/openfoam-solving/118204-error-linearupwind.html)

immortality May 22, 2013 16:33

error on linearUpwind
 
What does this error mean when using linearUpwind instead of upwind? Why can't I use linearUpwind?
Code:

[2]    in file /opt/openfoam220/src/finiteVolume/lnInclude/surfaceInterpolationScheme.C at line 82.
[2]
FOAM parallel run exiting
[2]

[0]
[0] --> FOAM FATAL IO ERROR:
[0] Unknown discretisation scheme linearUpwind

Valid schemes are :

42
(
CoBlended
Gamma
MUSCL
Minmod
OSPRE
QUICK
SFCD
SuperBee
UMIST
biLinearFit
blended
clippedLinear
cubic
cubicUpwindFit
downwind
filteredLinear
filteredLinear2
filteredLinear3
fixedBlended
limitWith
limitedCubic
limitedLinear
limiterBlended
linear
linearFit
linearPureUpwindFit
localBlended
localMax
localMin
midPoint
outletStabilised
pointLinear
quadraticFit
quadraticLinearFit
quadraticLinearUpwindFit
quadraticUpwindFit
reverseLinear
skewCorrected
upwind
vanAlbada
vanLeer
weighted
)
[0]
[0]
[0] file: /home/ehsan/Desktop/Central/nonUniformMesh/WR_Main_Central_172*10_nonMesh/processor0/../system/fvSchemes.divSchemes.div(tauMC) at line 55.
[0]
[0]    From function surfaceInterpolationScheme<Type>::New(const fvMesh&, Istream&)
[0]    in file /opt/openfoam220/src/finiteVolume/lnInclude/surfaceInterpolationScheme.C at line 82.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 3 with PID 10639 on
node Ehsan-com exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[Ehsan-com:10633] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[Ehsan-com:10633] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Killing PID 10627
 PyFoam WARNING on line 232 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/FoamThread.py : Process 10627 was already dead
Getting LinuxMem: [Errno 2] No such file or directory: '/proc/10627/status'


wyldckat May 22, 2013 17:56

Hi Ehsan,

Summarizing the error message:
Code:

Unknown discretisation scheme linearUpwind
...
system/fvSchemes.divSchemes.div(tauMC) at line 55.
...
surfaceInterpolationScheme<Type>::New(const fvMesh&, Istream&)

"tauMC" from "rhoCentralFoam" is a tensor field. AFAIK, "linearUpwind" is meant to be used with scalar and volume fields. Therefore, you can't use "linearUpwind" with "tauMC".

I could be (and probably am) wrong, but from the little I know of this topic, not every field needs to be discretised with a second-order scheme for the simulation to be considered valid.

Best regards,
Bruno

immortality May 22, 2013 18:05

Hi Bruno,
Thanks, it's resolved. Is tauMC related to the viscous terms, or how can I find its formula?

wyldckat May 22, 2013 18:20

Hi Ehsan,

I ran this command inside one of your modified solvers:
Code:

grep "tauMC" *
Which gave me this:
Code:

rhoCentralFoamModified.C:        volTensorField tauMC("tauMC", muEff*dev2(Foam::T(fvc::grad(U))));
rhoCentralFoamModified.C:              - fvc::div(tauMC)
rhoCentralFoamModified.C:              + (mesh.Sf() & fvc::interpolate(tauMC))

So it's only a matter of opening the file and searching for them from inside the text editor ;)
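
For reference, reading off that first grep hit and assuming the standard rhoCentralFoam definitions (where dev2(T) = T - (2/3) tr(T) I), tauMC is the explicit part of the viscous stress tensor:
Code:

\tau_{MC} = \mu_{\mathrm{eff}}\,\mathrm{dev2}\!\left[(\nabla\mathbf{U})^{T}\right]
          = \mu_{\mathrm{eff}}\left[(\nabla\mathbf{U})^{T} - \tfrac{2}{3}(\nabla\cdot\mathbf{U})\,\mathbf{I}\right]

In the standard rhoCentralFoam momentum equation the remaining \mu_{\mathrm{eff}}\nabla\mathbf{U} part is handled implicitly by fvm::laplacian(muEff, U), so together they make up the full viscous stress. In other words, tauMC does belong to the viscous terms.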

Best regards,
Bruno

