ParaView: Free/Delete all memory
Hello,
I am having some difficulty removing all data from ParaView. After loading several time steps of a .pvd file (multi-block data) from a Python script, the PC runs out of memory and ParaView crashes. After each time step I delete all the variables with "Delete(var)" and "del var", but the problem remains. The same happens if I do Edit/Delete All: according to the Memory Inspector, the "client" still holds the memory... The only way I have found to free it is to kill the paraview process from a terminal, or to kill the client in the Memory Inspector, but both of these also close ParaView. Does anyone know how to release all the memory between the first time step and the second load? Maybe there is a way using server connections. My idea would be: 1) create a connection; 2) load the data and run a job; 3) close the server before opening a new one to compute the second load... Do you have a solution, suggestions or other ideas? Have a good day, Eric
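Eric's three-step idea (connect, load and run, disconnect) can be sketched with `paraview.simple`'s `Connect()` and `Disconnect()`. The function below is only a hypothetical sketch: `process_step`, the `dry_run` flag and the file name are my own illustration, and the ParaView calls only run under pvpython.

```python
def process_step(pvd_file, dry_run=False):
    """Sketch of: 1) create a connection, 2) load and run, 3) close the server."""
    if dry_run:
        # Pure-Python preview of the three phases, so the control flow
        # can be checked without a ParaView installation.
        return ["connect", "load:" + pvd_file, "disconnect"]
    # Requires pvpython (or a ParaView-enabled Python).
    from paraview.simple import Connect, Disconnect, OpenDataFile
    Connect()                        # fresh (builtin) server session
    reader = OpenDataFile(pvd_file)  # load this step's data
    # ... apply filters, export results ...
    Disconnect()                     # tear the whole session down,
                                     # releasing server-side memory
```

`Disconnect()` drops every proxy the session created, so it is a heavier hammer than deleting objects one by one, but it does guarantee the session's memory is returned.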
Greetings Eric,
I know that ParaView used to have memory-leak problems, but the last reports I've seen indicate that ParaView 3.98.0 no longer has this issue... it's here: http://www.cfd-online.com/Forums/ope...ls-my-ram.html Best regards, Bruno
Dear Bruno,
Thank you very much for your comment. Indeed, I had read that too, and I am currently using that version of ParaView (3.98) both because of this problem and because it has additional export options in vector formats... Nevertheless, I still get memory leaks with this version, perhaps because I am controlling ParaView from Python scripts. In fact, I use scripts because I could not find some capabilities in the GUI, such as automatically saving contours to new VTK files using filters, or computing and saving the area of surfaces or volumes, but I know how to do that from scripts... Thanks again and best regards, Eric
Dear Foamers,
this discussion is a little old, but maybe somebody is still interested in the topic. I am using ParaView 4.1.0 (the version that comes with OpenFOAM 2.4.0) and I am trying to batch-export some data (from VTK files to CSV files) using pvpython. The script loops over a few thousand files in several directories, and after 6000-7000 files a memory-leak error occurs, which is quite annoying considering that I am using a 32 GB workstation... :( In the loop I delete all the objects generated within a single iteration to try to free memory, but this does not help. Has anybody found a solution to this memory-leak issue? Thanks and best regards, Cristian
Quick answer: without example code that reproduces the error, it's pretty hard to diagnose the issue. Either way, have you seen this answer: http://stackoverflow.com/a/26514796 ?
Dear wyldckat,
thanks for your reply. I was already aware of the solution proposed in the link you posted; I actually found it on the main ParaView wiki: http://www.paraview.org/Wiki/ParaView/Python_Scripting I am not a Python expert, and maybe I made some mistakes with the Delete() and del functions. I report my script here (I hope I am doing it the right way): Code:
#Script used to export csv from vtk datafiles
Thanks for the help, best regards, Cristian
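The body of the script did not survive in the post. A minimal sketch of the per-file pattern it describes (create a reader and a writer, write the CSV, then delete both) might look like the following; the helper name `csv_path` and the file layout are my own assumptions, and the ParaView calls only run under pvpython.

```python
import os

def csv_path(vtk_path):
    """Map an input .vtk path to the .csv path written next to it."""
    root, _ext = os.path.splitext(vtk_path)
    return root + ".csv"

def export_one(vtk_path):
    # Requires pvpython; this is the create-then-delete-per-file pattern.
    from paraview.simple import OpenDataFile, CreateWriter, Delete
    reader = OpenDataFile(vtk_path)
    writer = CreateWriter(csv_path(vtk_path), reader)
    writer.UpdatePipeline()   # actually writes the CSV
    Delete(writer)            # remove the proxies from the session...
    Delete(reader)
    del writer, reader        # ...and drop the Python references too
```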
Hi Cristian,
Probably a better way to loop over the files is to build the pipeline once and then only change the reader's and writer's file names. Here is an example script: Code:
reader = OpenDataFile(files[0][0])
-Mikko
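Mikko's snippet was cut off in the post. A hedged reconstruction of the idea (build the pipeline once, then only swap file names inside the loop) follows; `io_pairs` and the `.vtk` to `.csv` naming are my own illustration, and note that some readers (e.g. the legacy VTK reader) expose a `FileNames` list rather than a single `FileName` property.

```python
def io_pairs(vtk_files):
    """Pure-Python part: pair each input file with its output name."""
    return [(f, f.replace(".vtk", ".csv")) for f in vtk_files]

def export_all(vtk_files):
    # Requires pvpython; reuse one reader and one writer for every file.
    from paraview.simple import OpenDataFile, CreateWriter, Delete
    pairs = io_pairs(vtk_files)
    reader = OpenDataFile(pairs[0][0])
    writer = CreateWriter(pairs[0][1], reader)
    for vtk_in, csv_out in pairs:
        reader.FileName = vtk_in   # legacy VTK readers may need FileNames
        writer.FileName = csv_out
        writer.UpdatePipeline()    # write this file's CSV
    Delete(writer)                 # clean up once, after the loop
    Delete(reader)
```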
Dear Mikko,
Thanks for your suggestion. Your snippet made my day! :) ParaView no longer leaks memory while the script is running! Maybe creating and deleting a reader and a writer on every iteration (as I used to do) left some garbage in memory... I report the new version of the script here, since it could be useful to others with similar needs/issues: Code:
#Script used to export csv from rho datafiles
Best regards, Cristian
I'm glad that it worked. :)
I just wanted to add that I don't think ParaView itself is leaking, because I've run the same kind of loop over big datasets and many files. I suspect you were not deleting the writer. At the end of your loop you should therefore delete all the sources in the reverse of the order in which you created them: Code:
# Delete the generated objects
Alternatively, you can delete every remaining source in one go: Code:
for key, value in GetSources().items():
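Both code fragments above were truncated in the post. Here is a sketch of the two clean-up approaches Mikko describes; the `deletion_order` helper is my own illustration of the reverse-order rule, and the ParaView loop only runs under pvpython.

```python
def deletion_order(created):
    """Delete objects in the reverse of their creation order, so each
    consumer is removed before its producer (writer before reader)."""
    return list(reversed(created))

def delete_everything():
    # Requires pvpython: remove every source the session still tracks.
    from paraview.simple import GetSources, Delete
    for _key, proxy in GetSources().items():
        Delete(proxy)
```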
Dear Mikko,
Thanks again for your reply. I actually tried several combinations of del / Delete(), deleting the pipeline objects in reverse order, but without success; in particular, ParaView complained about the "Delete(writer)" line... You can see one such attempt in my second post, where I only do "del writer". Probably I made a mistake somewhere... I prefer the solution you posted anyway, since re-pointing the reader inside the loop seems cleverer and more efficient than creating and deleting it every time. I wasn't aware of that possibility... good to know for my next scripts :) Thanks and best regards, Cristian