Computational resource requirements of file updates
Hello everyone!
Before I start explaining, I want to clarify that this is not about a problem I am having, but rather something interesting I have noticed.

How I came across this (can be skipped; I would put this in a spoiler wrapper, but I do not know if that works here)

I run a MATLAB script that executes SU2 in a loop, with several instances running at the same time and a new one starting whenever a previous one finishes. This left many command prompts open and idling after their runs had finished. To avoid that, I had MATLAB execute a batch script that runs SU2 and exits when it is done (a minimal sketch of this setup is below).

In doing so, I noticed that SU2 was about 30% faster (0.37 instead of 0.52 seconds per iteration). However, the history file was not being written (nor, as far as I could tell, any other file), so the run did not actually work properly. On closer examination, I found that my COMODO Internet Security was sandboxing the batch script, preventing it from writing to any real location on my computer. After setting up an exception, everything worked again, but the computation time was also back to roughly what it was without the batch script, so no gains there.

To summarise: a sandboxed SU2 is much faster than a normal SU2, but it does not write any of the vital files.
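For reference, here is a minimal sketch of the kind of setup described above. The file names (run_su2.bat, config.cfg), the log redirection, and the use of the SU2_CFD executable are my assumptions for illustration, not necessarily the actual files involved:

Code:
% Minimal sketch, assuming Windows, SU2_CFD on the PATH, and a
% configuration file named config.cfg in the working directory.
% Write a small batch wrapper that runs SU2 and then closes its prompt.
fid = fopen('run_su2.bat', 'w');
fprintf(fid, '@echo off\r\n');
fprintf(fid, 'SU2_CFD config.cfg > su2_log.txt\r\n');
fprintf(fid, 'exit\r\n');
fclose(fid);

% The trailing '&' makes system() return immediately on Windows,
% so several instances can be launched side by side from the loop.
system('run_su2.bat &');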
Further investigation

Questions
I am curious to read what more experienced and knowledgeable people think about this.
The last time I profiled the code (a few months back), the output routines barely registered (<< 1% of runtime).
Is it possible you are writing the output files on every iteration?
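In case it helps: if the solver is indeed writing solution files every iteration, the write frequency can be reduced in the SU2 configuration file. The exact option names depend on the SU2 version, so the excerpt below is an assumption meant to illustrate the idea rather than a verified snippet for any particular release:

Code:
% Hypothetical excerpt from an SU2 configuration file.
% Older (v6-era) configs expose separate write frequencies, roughly:
WRT_SOL_FREQ= 100   % volume/restart solution every 100 iterations
WRT_CON_FREQ= 1     % convergence history every iteration
% Newer (v7+) configs use a single output frequency option:
OUTPUT_WRT_FREQ= 100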