Home > Forums > OpenFOAM Native Meshers: snappyHexMesh and Others

snappyhexmesh: Running out of memory (without reason?)

Old   February 25, 2013, 08:10
Default snappyhexmesh: Running out of memory (without reason?)
  #1
New Member
 
Marco Müller
Join Date: Feb 2013
Posts: 5
Hi,

I'm using snappyHexMesh (SHM) from OpenFOAM 2.1.1, out of the box, in a VirtualBox machine on a Windows 7 host with 48 GB RAM and 8 physical cores. 40 GB and all cores are assigned to VirtualBox.

My model runs fine with 2 or 4 cores. When I switch to 8 cores, SHM aborts at random points (while not even 10% of the RAM is in use), like this:

Code:
Shell refinement iteration 1
----------------------------

Marked for refinement due to refinement shells    : 10644 cells.
Determined cells to refine in = 2.02 s
Selected for internal refinement : 39957 cells (out of 1056197)
(5): ERROR: dgraphFold2: out of memory (2)
(4): ERROR: dgraphFold2: out of memory (2)
--------------------------------------------------------------------------
mpirun has exited due to process rank 5 with PID 3976 on
I cannot even find this error in the forum. Any hints?

Thanks
Marco

Old   February 27, 2013, 19:23
Default
  #2
Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 9,659
Blog Entries: 39
Hi Marco,

A few questions:
  1. Which decomposition method are you using while snappyHexMesh is running?
  2. Which OpenFOAM version are you using?
  3. What Linux distribution and architecture are you using?
Best regards,
Bruno

Old   March 6, 2013, 11:03
Default
  #3
New Member
 
Marco Müller
Join Date: Feb 2013
Posts: 5
Hi Bruno,

Thanks for your answer.

1. I tried ptscotch; with the simple method there is no such error, so for now that is an acceptable workaround.
2. 2.1.1.
3. The Ubuntu version this OpenFOAM release was officially built on.
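For anyone landing here: the decomposition method is selected in system/decomposeParDict. A minimal sketch of the simple (geometric) setup that avoided the error above; the 2 x 2 x 2 split and the delta value are assumptions for an 8-core run, not taken from Marco's case:

```cpp
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  8;

// "simple" splits the domain geometrically and bypasses the Scotch library
method              simple;

simpleCoeffs
{
    n               (2 2 2);    // subdomains in x, y, z; product must equal 8
    delta           0.001;      // cell skew factor (common default)
}
```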

Marco

Old   March 6, 2013, 17:31
Default
  #4
Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 9,659
Blog Entries: 39
Hi Marco,

Which architecture of Ubuntu? i386/i686 or x86_64/amd64? You can check by running:
Code:
uname -m
If you are using i386/i686, it's only natural that it crashes when a process reaches the 2 GiB address-space limit.

Best regards,
Bruno

Old   May 11, 2013, 16:31
Default snappyHexMesh - ERROR: dgraphFold2: out of memory (2)
  #5
New Member
 
caduqued
Join Date: Apr 2009
Location: UK
Posts: 15
Well, now I have encountered the same problem. While trying to generate the mesh with snappyHexMesh, I got:

Quote:
...
Feature refinement iteration 24
------------------------------

Marked for refinement due to explicit features : 92 cells.
Determined cells to refine in = 13.28 s
Selected for feature refinement : 136 cells (out of 10647165)
Edge intersection testing:
Number of edges : 36562736
Number of edges to retest : 8226
Number of intersected edges : 4230508
Refined mesh in = 2.76 s
After refinement feature refinement iteration 24 : cells:10648117 faces:36562736 points:15512183
Cells per refinement level:
0 1149402
1 228530
2 967512
3 2859801
4 5442872
(14): ERROR: dgraphFold2: out of memory (2)
(13): (16): ERROR: (9): (15): ERROR: dgraphFold2: out of memory (3)
dgraphFold2: out of memory (2)ERROR: [17]
dgraphFold2: out of memory (2)ERROR:
dgraphFold2: out of memory (2)#[11] #
0 0 Foam::error: printStack(Foam::Ostream&)Foam::error: printStack(Foam::Ostream&)--------------------------------------------------------------------------
mpirun has exited due to process rank 9 with PID 23598 on
...

The only information I can find seems to relate to the Scotch library, and there is not much of it.

In my case, I am running snappyHexMesh on a 64-bit (x86_64) cluster with 36 cores, requesting 4 GB per core (more than reasonable for this type of mesh)... and still, surprise: an out-of-memory problem. Does anyone know how to solve this?

Regards,

Old   May 11, 2013, 16:48
Default
  #6
Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 9,659
Blog Entries: 39
Greetings caduqued,

If you ran in parallel, then the library "ptscotch" does have some limitations.

By the way, which OpenFOAM version are you using? Versions older than OpenFOAM 2.2.0 use Scotch 5.1.1, while OpenFOAM 2.2.0 and later use Scotch 6.0.0, which is allegedly (missing reference) a lot better!

Best regards,
Bruno

Old   May 11, 2013, 16:58
Default
  #7
New Member
 
caduqued
Join Date: Apr 2009
Location: UK
Posts: 15
Hi Bruno,

Thanks for your quick response. Yes, I am now using OpenFOAM 2.2.0, which in fact ships Scotch 6.0.0. The decomposition method I am using is scotch (not ptscotch, as it used to be called), but I assume that is irrelevant, as it is just the name of the method; the underlying library is always Scotch. With this in mind, I can (sadly!) confirm that the new Scotch version also exhibits this problem, although I confess I didn't know about it. So, while waiting for a fix, I assume it will be safer to just stick to one of the traditional methods, right?

Regards,

Carlos

Old   May 11, 2013, 17:08
Default
  #8
Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 9,659
Blog Entries: 39
Hi Carlos,

I forgot that they did rename "ptscotch" to "scotch" when used in parallel, so that the same dictionary could be used for the decomposition.

The only other thing I can think of is the "multilevel" decomposition strategy... I mentioned it some time ago here: SnappyHexmesh crashes with many processes - post #8. It's not a silver bullet, but used correctly it might give you the best of both worlds...
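For reference, the multiLevel method mentioned above decomposes in stages, e.g. a coarse geometric split first and scotch within each piece. A rough sketch of the decomposeParDict fragment, written from memory, so treat the level counts and coefficient layout as assumptions and check the annotated example decomposeParDict shipped with your OpenFOAM version:

```cpp
numberOfSubdomains  8;          // total = product of the per-level counts

method              multiLevel;

multiLevelCoeffs
{
    level0                      // first split: 2 large chunks, geometric
    {
        numberOfSubdomains  2;
        method              simple;
        simpleCoeffs
        {
            n       (2 1 1);
            delta   0.001;
        }
    }
    level1                      // then scotch inside each chunk: 2 x 4 = 8
    {
        numberOfSubdomains  4;
        method              scotch;
    }
}
```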

Best regards,
Bruno

Old   May 11, 2013, 20:31
Default
  #9
New Member
 
caduqued
Join Date: Apr 2009
Location: UK
Posts: 15
Hi Bruno,

Thanks again. I did not know about the multilevel strategy; thanks a lot for bringing it to my attention! (With OpenFOAM, every day is a learning day!)

I will try it...

Regards,

Carlos

Old   October 13, 2014, 00:00
Default about dgraphFold2: out of memory
  #10
New Member
 
Join Date: Mar 2013
Posts: 14
Hello caduqued, have you managed to solve the "dgraphFold2: out of memory" error? I have run into this problem and couldn't find a solution.

Old   October 18, 2014, 14:50
Default
  #11
Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 9,659
Blog Entries: 39
Greetings Slanth,

You'll have to provide more specific details about the case you're meshing, because the error message:
Code:
dgraphFold2: out of memory
can occur for a lot of reasons.
It is even possible that the problem is:
  1. With your base mesh.
  2. With the decomposition method being used.
  3. With the number of cores being used.
  4. With the OpenFOAM version you're using.
Best regards,
Bruno

Old   October 21, 2014, 05:33
Default
  #12
New Member
 
Chris
Join Date: May 2011
Posts: 12
I likewise encounter this error frequently. I'm meshing on a 32-core Opteron machine, using 'scotch' for decomposition, with OF 2.3.0. I get messages like:

Quote:
(16): ERROR: dgraphFold2: out of memory (2)
(17): ERROR: dgraphFold2: out of memory (2)
(28): (29): ERROR: dgraphFold2: out of memory (2)
(31): ERROR: dgraphFold2: out of memory (2)
ERROR: dgraphFold2: out of memory (2)
in about 4 out of 10 runs. I've watched this happen with top running, and as other commenters have noted, there is no issue with available RAM. It would be great to work out a solution, as this is the most unreliable aspect of using OF.

Other details:
uname -a:
Quote:
Linux sim2 3.13.0-24-generic #47-Ubuntu SMP Fri May 2 23:30:00 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
ldd /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/bin/snappyHexMesh:
Quote:
linux-vdso.so.1 => (0x00007fff095b1000)
libfiniteVolume.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/libfiniteVolume.so (0x00007fefdfe65000)
libdecompositionMethods.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/libdecompositionMethods.so (0x00007fefdfc10000)
libptscotchDecomp.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/openmpi-1.6.5/libptscotchDecomp.so (0x00007fefdfa01000)
libmeshTools.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/libmeshTools.so (0x00007fefdf461000)
libsurfMesh.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/libsurfMesh.so (0x00007fefdf168000)
libfileFormats.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/libfileFormats.so (0x00007fefdeef0000)
libdynamicMesh.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/libdynamicMesh.so (0x00007fefde962000)
libautoMesh.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/libautoMesh.so (0x00007fefde559000)
libOpenFOAM.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/libOpenFOAM.so (0x00007fefddc35000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fefdda16000)
libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007fefdd712000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fefdd40c000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007fefdd1f5000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fefdce2f000)
libPstream.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/openmpi-1.6.5/libPstream.so (0x00007fefdcc21000)
libtriSurface.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/libtriSurface.so (0x00007fefdc98e000)
libptscotch.so => /home/someuser/OpenFOAM/ThirdParty-2.3.0/platforms/linux64GccDPOpt/lib/openmpi-1.6.5/libptscotch.so (0x00007fefdc742000)
libptscotcherrexit.so => /home/someuser/OpenFOAM/ThirdParty-2.3.0/platforms/linux64GccDPOpt/lib/openmpi-1.6.5/libptscotcherrexit.so (0x00007fefdc53e000)
libscotch.so => /home/someuser/OpenFOAM/ThirdParty-2.3.0/platforms/linux64GccDPOpt/lib/openmpi-1.6.5/libscotch.so (0x00007fefdc2b5000)
librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007fefdc0ad000)
libextrudeModel.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/libextrudeModel.so (0x00007fefdbe96000)
liblagrangian.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/liblagrangian.so (0x00007fefdbc77000)
libedgeMesh.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/libedgeMesh.so (0x00007fefdba0e000)
libdistributed.so => /home/someuser/OpenFOAM/OpenFOAM-2.3.0/platforms/linux64GccDPOpt/lib/libdistributed.so (0x00007fefdb7ba000)
libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007fefdb5a0000)
/lib64/ld-linux-x86-64.so.2 (0x00007fefe128c000)
libmpi.so.1 => /home/someuser/OpenFOAM/ThirdParty-2.3.0/platforms/linux64Gcc/openmpi-1.6.5/lib64/libmpi.so.1 (0x00007fefdb1f8000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fefdafd9000)
libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007fefdadd6000)

Old   October 26, 2014, 11:35
Default
  #13
Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 9,659
Blog Entries: 39
Greetings capucsc,

Quote:
Originally Posted by capucsc View Post
I likewise encounter this error frequently. I'm meshing on a 32 core Opteron machine. I used 'scotch' for decomposition and I work with OF2.3.0. I get messages like:
[...]
about 4 out of 10 runs.
You're able to reproduce this error a lot more often than I was expecting!
Any chance one of the failing examples is small in file/case size and you are allowed to share it with us, either publicly or privately? That would make it a lot easier to diagnose the source of this problem!

Best regards,
Bruno

Old   October 27, 2014, 18:30
Default
  #14
New Member
 
Chris
Join Date: May 2011
Posts: 12
Hi Bruno,

The cases can be found here:

https://www.dropbox.com/s/4u4jk2ps6o...es.tar.gz?dl=0

Thanks for taking a look!
-Chris

Old   October 30, 2014, 13:16
Default Same issue
  #15
Member
 
Brock Lee
Join Date: Sep 2012
Location: Midwest
Posts: 38
Gentlemen,

I'm also seeing this same issue on a meshing case every once in a while. I'm on an Intel 16 core machine using OF 2.3.x and the scotch decomposition. Unfortunately, I don't have a case I can share at the moment, but please do let me know if a public bug report gets created for this so that I can keep an eye on it. If I come across a case I can share, I will do so!

Thanks!

Brock

Old   October 30, 2014, 13:28
Default
  #16
New Member
 
Chris
Join Date: May 2011
Posts: 12
I believe, although I have little in the way of evidence, that the occurrence of this error is tied to the number of cores used. I never encounter problems when running with 6 cores, but about 70% of runs fail when using 32 cores. If we were to file a bug report, where would it go? OpenFOAM, Scotch, or somewhere else?

Old   October 30, 2014, 14:29
Default
  #17
Member
 
Brock Lee
Join Date: Sep 2012
Location: Midwest
Posts: 38
The main place to report bugs for OpenFOAM is here...

http://www.openfoam.org/mantisbt/

But if you report something, make sure you give the developers everything they need to fix, or at least diagnose, the problem. It's best if you can isolate the problem down to a small model and then give them that model to debug with. I've had good luck with reporting; one of the bugs I reported was fixed within 24 hours!

Brock

Old   November 21, 2014, 11:12
Default
  #18
Member
 
Simon Arne
Join Date: May 2012
Posts: 42
Hey,
I encountered the same problem today, also using scotch decomposition, with 32 processors on a 64-bit cluster head node. I had previously run a similar case with the same geometry; since splitting up the .stl files, I get this error every time.

The case does run if I switch to 27 cores, though.

Code:
(16): ERROR: dgraphFold2: out of memory (2)
(17): ERROR: dgraphFold2: out of memory (2)
(28): (29): ERROR: dgraphFold2: out of memory (2)
(31): ERROR: dgraphFold2: out of memory (2)
ERROR: dgraphFold2: out of memory (2)
Is there a solution somewhere by now? I looked through the bug tracker but couldn't find a ticket.
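As an aside, when a run dies this way it can be useful to record which MPI ranks reported the failure, since that hints at whether it is always the same part of the decomposition. A small shell sketch; the log file name is an assumption, so adjust it to however you redirect snappyHexMesh output:

```shell
# List the MPI ranks that reported the Scotch out-of-memory error.
# Rank prefixes look like "(16): ERROR: ..."; the trailing "(2)" is the
# Scotch error code, which the colon in the pattern filters out.
grep 'dgraphFold2: out of memory' log.snappyHexMesh \
  | grep -o '([0-9]\{1,\}):' \
  | tr -d '():' \
  | sort -n -u
```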

Old   November 21, 2014, 11:39
Default
  #19
New Member
 
Chris
Join Date: May 2011
Posts: 12
I have yet to pinpoint the exact reason this happens, so I haven't submitted a bug report. Debugging this turns out to be rather tricky. I just use fewer processors to mesh.

Old   November 24, 2014, 03:38
Default
  #20
New Member
 
Join Date: Mar 2013
Posts: 14
I have tried many times and finally found a workaround. First, use the surfaceCheck utility to make sure your STL file is closed and does not have any unconnected parts. Even if the overall STL surface has been decomposed into several parts, it should be closed when all of them are combined.
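A sketch of that check, with hypothetical file names (surfaceCheck is the stock OpenFOAM utility; among other things it reports the number of connected parts and whether the surface is closed):

```shell
# check each piece, then the recombined surface
surfaceCheck wing.stl
surfaceCheck flap.stl

# ASCII STL files can simply be concatenated; the combined surface is
# what snappyHexMesh sees, so this is the one that must come out closed
cat wing.stl flap.stl > combined.stl
surfaceCheck combined.stl
```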
