ERROR #001100279 - (malloc) Not enough space
hi everyone, can someone point me to how to fix this error? The solver gives me this after a few hours of running. I already tried running with "double precision" and "large problem" enabled, and set a memory allocation factor of 2 in the partitioner, solver and interpolator settings. This has never happened before so I'm kind of lost. There is space left on the hard drive as well (although it says malloc, so it's a memory issue), and I also tried running on another machine.
thanks in advance :) |
I think the malloc error comes about when it has run out of system memory (RAM). Are you sure you have enough RAM for what you are trying to do?
Also consider running over multiple systems in multiprocessor mode, or over more nodes if you are already doing this. |
Quote:
I tried increasing the OS pagefile and it still happened. It has been running under Intel MPI Local, 4 cores (student version limitation). Trying now as serial to see if it works. |
Very interesting error.
May I ask which release version you are using? Which OS, Windows or Linux? Your model seems to have a lot of CAD parts; FS23416 is FaceSet 23411 in domain/zone 1. |
Quote:
|
I tried doubling the RAM to 32 GB, same error. Going crazy here.
|
Maybe share the complete output file (upload via the "Go advanced" menu)? This might give us more insight.
|
hi, here is the output.txt from my last run. thanks :)
edit: sorry, it says "output.txt: Your file of 427.5 KB bytes exceeds the forum's limit of 195.3 KB for this filetype." :( |
Quote:
https://we.tl/t-tQpV8f4hDt thanks! |
I do not download stuff from unknown sites.
Edit the output file to get it under the file size limit. If there are lots of iterations then chop out the iterations where nothing much happens. |
1 Attachment(s)
hey Glenn! I removed some stuff from the middle of the .txt and attached it :)
thanks! |
You are writing a results file every time step, and each results file contains the mesh and full particle track information. This is going to generate a lot of increasingly large transient results files.
Some suggestions:
* Do you really need to save a results file every time step?
* Do you need to include the particle track data every time step?
* Do you need to include the mesh every time step?
* Do you need to include all the variables every time step? (eg fluxes and all the other stuff which is rarely used)
If you can remove any of these things from the transient results file, or just save them less frequently, it will probably make a big difference.
Other points:
* This simulation is using 5GB on a 32GB machine. It should run OK.
* The mesh is quite small (197k nodes).
* Simulation is transient, laminar, particle tracking. Nothing too unusual here. |
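To get a feel for how quickly these choices multiply, here is a back-of-the-envelope sketch. All the sizes are hypothetical, not taken from this run; the point is only that per-file cost times number of files dominates.

```python
def estimate_total_gb(timesteps, base_mb, mesh_mb=0.0, particles_mb=0.0, save_every=1):
    """Rough cumulative disk usage of transient results files over a run.

    base_mb      -- solution variables per file (hypothetical number)
    mesh_mb      -- extra per file if the mesh is included
    particles_mb -- extra per file if full particle tracks are included
    save_every   -- write a transient results file every N time steps
    """
    n_files = timesteps // save_every
    return n_files * (base_mb + mesh_mb + particles_mb) / 1024.0

# Everything every step, vs. variables only every 10 steps:
everything = estimate_total_gb(5000, base_mb=20, mesh_mb=15, particles_mb=40)
trimmed = estimate_total_gb(5000, base_mb=20, save_every=10)
print(f"{everything:.0f} GB vs {trimmed:.0f} GB")  # roughly 366 GB vs 10 GB
```

With made-up but plausible per-file sizes, trimming the output is a ~40x difference, which is why it is the first thing to try.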
hey! thanks for the answer!
thanks!!!!!! |
You can select the transient results file options in CFX-Pre, in the Output tab (transient results file options).
Everything is pretty obvious, except that including the mesh has some implications:
* If you include the mesh then all transient results files are complete and can be loaded stand-alone, or used as initial conditions for other runs.
* If you do not include the mesh you save the space of the mesh in each file (which, if the mesh is stationary, makes a lot of sense), but it means you cannot use the no-mesh transient results for initial conditions, and you can only read the data from a time step if you later write a results file that includes the mesh. This happens by default at the end of a run. But if your run crashes before it gets to the end (like your run is doing now) then all your transient results files are unreadable, as there is no mesh information.
There are also a lot of options for the particle tracks: how many to include, how much resolution to write, what variables to include and so on. Between the transient results file options and the particle track results file options you will need to reduce the amount of data you are saving to avoid this malloc error. |
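One quick way to see which of these options dominates on disk is to list the transient results files from oldest to newest and watch how the per-file size grows. This is a generic sketch: the `.trn` extension and a flat run directory are assumptions about your setup.

```python
import glob
import os

def transient_file_sizes(run_dir, pattern="*.trn"):
    """List (filename, size in MB) for each transient results file,
    oldest first, so growth over the run is easy to spot."""
    paths = sorted(glob.glob(os.path.join(run_dir, pattern)),
                   key=os.path.getmtime)
    return [(os.path.basename(p), os.path.getsize(p) / 1e6) for p in paths]

# Example: for name, mb in transient_file_sizes("my_run_001"): print(f"{name}: {mb:.1f} MB")
```

If the sizes climb steadily, the accumulating particle track data is the likely culprit; if they are constant but large, the mesh or the full variable set is the bulk of each file.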
Quote:
I still haven't gone through the results to find out if I'll miss anything, but I'll post here as soon as I do. Now that it ran, would you guess at the reason why it won't survive under “standard”? thanks again, leo |
The file size of standard is much larger. So either it has filled your hard drive up, or the particle tracks are taking up so much space that it runs out of memory working it all out.
If you want to work out exactly why it is failing you are going to have to try a few options (eg is it the particle tracks? or the general results file?) and check things like memory usage and disk space during a run. Also, don't forget that as you appear to be a student you probably have a disk space limit. Maybe the CFX temporary files got big enough that they hit your disk space limit? All of these things are unique to your setup, so you are going to have to check them out. We cannot tell you what the problem is. |
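For the "check disk space during a run" part, a small stdlib-only script run alongside the solver is enough (a sketch; the path and sampling interval are placeholders, and on Windows, Task Manager plus Explorer gives the same information):

```python
import shutil
import time

def disk_free_gb(path="."):
    """Free space on the filesystem holding `path`, in GB."""
    return shutil.disk_usage(path).free / 1024**3

def log_free_space(path=".", interval_s=60, samples=10):
    """Print free space periodically; run this next to the solver and
    watch whether free space collapses just before the malloc error."""
    readings = []
    for _ in range(samples):
        readings.append(disk_free_gb(path))
        print(f"free: {readings[-1]:.1f} GB")
        time.sleep(interval_s)
    return readings
```

If free space stays healthy right up to the crash, the problem is RAM during particle track post-processing rather than disk, which narrows down which output option to cut.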
Quote:
best, leo |
CFX has no space limit. It has been scaled to billions of nodes and thousands of partitions. The student version does have a node and partition limit but if it hits those it should give an error clearly saying it has hit that limit.
|