CFD Online Discussion Forums

CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   Solver breaks down on finer mesh (https://www.cfd-online.com/Forums/openfoam-solving/172185-solver-breaks-down-finer-mesh.html)

KingKraut May 25, 2016 11:34

Solver breaks down on finer mesh
 
5 Attachment(s)
Dear all,

I have a problem with a custom solver that gives fine results on a mesh of a certain size (~3 million cells). When I try to run a simulation on a finer mesh of the same geometry (~6 million cells), the solver breaks down at a certain point with the following error message:

Quote:

#0 Foam::error::printStack(Foam::Ostream&) at ??:?
#1 Foam::sigFpe::sigHandler(int) at ??:?
#2
at sigaction.c:?
#3 void Foam::divide<Foam::fvPatchField, Foam::Vector<double> >(Foam::FieldField<Foam::fvPatchField, Foam::Vector<double> >&, Foam::FieldField<Foam::fvPatchField, Foam::Vector<double> > const&, Foam::FieldField<Foam::fvPatchField, double> const&) at ??:?
#4 Foam::tmp<Foam::GeometricField<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh> > Foam::operator/<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh>(Foam::tmp<Foam::GeometricField<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh> > const&, Foam::tmp<Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> > const&) at ??:?
#5 Foam::simpleFilter::operator()(Foam::tmp<Foam::GeometricField<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh> > const&) const at ??:?
#6 Foam::incompressible::LESModels::dynamicSmagorinsky::cD(Foam::GeometricField<Foam::SymmTensor<double>, Foam::fvPatchField, Foam::volMesh> const&) const at ??:?
#7 Foam::incompressible::LESModels::dynamicSmagorinsky::updateSubGridScaleFields(Foam::GeometricField<Foam::SymmTensor<double>, Foam::fvPatchField, Foam::volMesh> const&) at ??:?
#8 Foam::incompressible::LESModels::dynamicSmagorinsky::dynamicSmagorinsky(Foam::GeometricField<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::transportModel&, Foam::word const&, Foam::word const&) at ??:?
#9 Foam::incompressible::LESModel::adddictionaryConstructorToTable<Foam::incompressible::LESModels::dynamicSmagorinsky>::New(Foam::GeometricField<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::transportModel&, Foam::word const&) at ??:?
#10 Foam::incompressible::LESModel::New(Foam::GeometricField<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::transportModel&, Foam::word const&) at ??:?
#11 Foam::incompressible::turbulenceModel::addturbulenceModelConstructorToTable<Foam::incompressible::LESModel>::NewturbulenceModel(Foam::GeometricField<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::transportModel&, Foam::word const&) at ??:?
#12 Foam::incompressible::turbulenceModel::New(Foam::GeometricField<Foam::Vector<double>, Foam::fvPatchField, Foam::volMesh> const&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::transportModel&, Foam::word const&) at ??:?
#13
at ??:?
#14 __libc_start_main at ??:?
#15
at ??:?
I have attached both log files: the one with the error message for the larger mesh (log_heapFoam_Intermediate) and the one for the smaller case, which I only attached to show that it keeps running. I also attached the corresponding checkMesh outputs for both grids. Both meshes are somewhat lacking in terms of non-orthogonality, but I tackled that with some non-orthogonality correctors in the fvSolution file.

All the other input files (boundary conditions for k, nuSgs, U, p and so on) are completely identical for both cases! From various posts I found in the forum I deduced that this error can be caused by badly chosen BCs (a division by zero, for example). However, since the same BCs work on the coarser grid, I don't think that can be the reason? I played around with them a little anyway, but to no avail...
(I will attach these in a second post right now.)
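For reference, a hypothetical fvSolution fragment showing where such non-orthogonality correctors go (the numbers here are illustrative, not the poster's actual settings):

```
PISO
{
    nCorrectors              2;
    // extra pressure-equation sweeps to compensate for mesh non-orthogonality
    nNonOrthogonalCorrectors 2;
}
```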

I am really stuck here at the moment. Can anybody tell me in which direction I need to look for a fix?
By the way, I tried the case with simpler solvers, too: icoFoam and pisoFoam fail as well...

I guess this is an issue with the mesh, which bothers me a lot... It was hard enough to produce this one with ICEM. Maybe there is a way to cure this with some further changes in fvSchemes or fvSolution?

Thanks a lot to anyone looking into this! Any help is highly appreciated.
I hope I did not miss out any information. If so, please let me know!

Thanks a lot!

Johannes

KingKraut May 25, 2016 11:35

k, nuSgs, p, U
 
4 Attachment(s)
As promised the input files for k, nuSgs, p, U in the "0" folder.

All help is appreciated!!

Thanks again!

arsalan.dryi May 25, 2016 15:07

Hi Johannes,

As the log file shows, the finer mesh crashes at time 0.8 s. Did you run the coarser one up to that time, or are you using the results of the coarser mesh as initial values for the finer one?!

Regards.

KingKraut May 26, 2016 05:58

Dear arsalan,

thanks a lot for your reply.
I have not used the results from the coarser mesh as initial conditions for the finer mesh.
I tried both cases with start times of 0 and 0.8 seconds, but the outcome was the same: the coarser mesh runs fine, while on the finer mesh the solver crashes.
Sorry for the mix-up with the startTime in the log files; those were just the ones I grabbed.

But now that I think about it, maybe this is not a bad idea. Maybe I can run the coarser mesh from time 0.7 and then use the results computed at time 0.8 as initial conditions on the finer mesh? OpenFOAM does offer this option, does it not? I will try this as soon as I get back to it next Monday. Thanks a lot for the idea :-)
However, I am not too optimistic about this, since, as I said, with the exact same initial conditions the coarser mesh works fine while the finer mesh fails...
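The mapping idea could look roughly like this (directory names are placeholders; mapFields is run from inside the target case and takes the source case as its argument):

```shell
# run from inside the fine-mesh case directory
cd fineMeshCase
# map the coarse-mesh result at t = 0.8 s onto the fine mesh
mapFields ../coarseMeshCase -sourceTime 0.8
```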

So if you (or anybody else) have another idea I could check, please let me know.


Anyway thanks a lot for looking into this!!!

Best regards
Johannes

KingKraut May 30, 2016 04:58

mapFields crashes
 
2 Attachment(s)
Dear all,

I tried mapping the results from the coarse mesh, on which the computations work, onto the finer mesh. However, the mapFields command crashes; the log file and terminal output are attached. In the end a segmentation fault occurs...

Does this mean my mesh is bogus? Apparently, the mapFields command crashes while mapping the field nu. Maybe only this field is "bad"?

As I said, I am really stuck on this right now! :-( checkMesh says the mesh is OK, and the same BCs work on the coarser grid...

Anybody any other idea?!

Thanks a lot again for any time you dedicate into this!!

Best regards
Johannes

KingKraut May 30, 2016 06:01

checkMesh -allGeometry
 
2 Attachment(s)
Dear all,

from the warning in the mapFields logfile
Quote:

FOAM Warning :
From function Foam::List<Foam::tetIndices> Foam::polyMeshTetDecomposition::faceTetIndices(const polyMesh&, label, label)
in file meshes/polyMesh/polyMeshTetDecomposition/polyMeshTetDecomposition.C at line 570
No base point for face 830513, 4(1571 1651 1661 1581), produces a valid tet decomposition.
I found in other forum threads, that the command checkMesh -allGeometry might give more clues about the mesh quality.
I attached the output of this command for both meshes. As I expected, the larger mesh is worse than the smaller one; however, checkMesh fails for the same two reasons on both, so I don't think this is the cause of the failing computations.

Anyone with any further ideas on this...??? :-(

Thanks again!!

Best regards
Johannes

TobM May 31, 2016 02:31

In my experience you have to get rid of the face-tets error. Also, in some solvers, like interFoam, small determinants can produce an error like this. The non-orthogonality is not great either, but I don't think that's your main problem.
checkMesh produces face- and cellSets from its quality report, so you can visualize where in your mesh the problem is.
You have to put significantly more work into your mesh. Maybe try another meshing tool, cfMesh for example.
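To visualize the offending faces/cells, something like the following could work (the set names shown are examples only; the actual names depend on which checks fail, and option availability depends on the OpenFOAM version):

```shell
# failing faces/cells are written as sets under constant/polyMesh/sets/
checkMesh -allGeometry -allTopology
# convert a reported faceSet / cellSet for inspection in ParaView
foamToVTK -faceSet nonOrthoFaces
foamToVTK -cellSet highAspectRatioCells
```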

KingKraut May 31, 2016 03:30

Dear TobM,

thanks a lot for your reply.

I understand that the mesh has flaws, and these get worse as the geometry becomes more complex and the cell count grows, which also makes the meshing more difficult. These meshes already took quite some time! To be honest, I am not sure it is even possible to further optimize the mesh for these blood-vessel geometries. The meshes were created with ANSYS ICEM. I have not yet had time to look into cfMesh, but I am not sure it will produce higher-quality meshes for this purpose than ICEM does... :-(
Thanks anyway for the remark. I will surely give it a try!!

With regard to this problem of the solver code just breaking off, I found an answer in this thread:
http://www.cfd-online.com/Forums/ope...ctingfoam.html
I used the command
Quote:

unset FOAM_SIGFPE
from within the terminal from which I started the simulation, and now it starts off fine. The computed results also appear sensible so far. Including the command in a batch job script also worked fine on a cluster.

I had found this thread a couple of days ago while searching for a solution, but did not want to use the command because I was a little suspicious - which I still am. I could not find a good explanation of what problems this command might cause in the end. I understand the signal as a warning that at some point I MIGHT be dividing by zero (or taking the square root of a negative number, or performing some other forbidden mathematical operation), but that this does not necessarily happen. Or how should I understand that the computations start off fine once I unset FOAM_SIGFPE?
Does anyone have a good explanation for this behaviour?
I would be glad if anyone could shed some light on this!!

Thanks again for all remarks and suggestions.

Best regards
Johannes

akidess May 31, 2016 03:39

Your mesh is really bad, and blood vessels are typically fairly simple geometries (you might want to include a picture to convince me otherwise). I'm absolutely convinced you can do better.

TobM May 31, 2016 03:42

I know that meshing can take a lot of time.
Different meshing tools mesh with respect to different quality criteria. I don't know a lot about ICEM, so I can't help you there.
It is often possible to run a simulation on a mesh that is not good by using limited schemes, non-orthogonal correction, small time steps and so on. Normally it is worth putting more time into the mesh...
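As an illustration of the limiting mentioned above, a hypothetical fvSchemes fragment for a marginal mesh (the coefficients are examples, not a recommendation):

```
gradSchemes
{
    default         cellLimited Gauss linear 1;  // limit gradients on bad cells
}
snGradSchemes
{
    default         limited 0.5;                 // partially limited non-orthogonal correction
}
laplacianSchemes
{
    default         Gauss linear limited 0.5;
}
```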

KingKraut May 31, 2016 04:06

2 Attachment(s)
Dear all,

thanks again for your remarks.

In the attachment you find pictures of the vessel geometries we meshed. These are coronary vessels of sizes down to 0.1 micrometers.
They were not simple to mesh!
The smaller model took around 50 hours and the larger model around 200 hours to reach the mesh quality shown in the checkMesh files I uploaded.

In order to get optimized results for the directed flows in the tube-like regions, a purely hexahedral meshing approach was applied, with further refinement at the vessel walls. This takes time because things get very complicated, especially at the bifurcations (often with more than two or three branches at one bifurcation)!
We are now aiming at a hybrid mesh with tet cells at the bifurcations and hex cells in the longer vessels without bifurcations. This will reduce the accuracy of the results, since hex meshes are better for such directed flows, but the meshing simply takes too long at the moment; to get a good mesh with ICEM, many things have to be done by hand...

You are right, the mesh could be done better, but we simply don't have the time to do this...

Anyway thanks again for the remarks!!
I will have a look into cfMesh. Maybe this can give some more clues...

Best regards
Johannes

akidess May 31, 2016 04:35

Honestly, it doesn't look too bad. Do you have an STL of the geometry? Have you tried snappyHexMesh (cfMesh is indeed also worth checking out)? I think you can get a better mesh in less time.

KingKraut May 31, 2016 04:44

Thanks for your answer.

We gave snappyHexMesh a try. However, the problem is that the generated hex cells are not aligned with the direction of flow (which they are with ICEM's approach). For good computational results the cell faces should ideally be perpendicular to the flow direction.

We do have STLs of these geometries, but for confidentiality reasons I cannot share them...

akidess May 31, 2016 06:38

Well, since you are considering using tetras you are giving flow alignment away anyway, and your mesh quality will also impact the accuracy ;) If you add layers the mesh will at least be aligned with the flow at the walls.

KingKraut May 31, 2016 07:33

That's true. =) I am not too happy with the hybrid tet-hex approach either...

However, with this approach we would still keep well-aligned hex cells in the tube-like regions of the model. These cells are usually quite fast to mesh and of high quality.
It is only at the bifurcations (which cause the problems with hexahedral meshing) that we would use tets, to speed up the meshing process at the cost of some computational accuracy...
in the hope that this is a good compromise...

Thanks again! Your suggestions are well appreciated!!

