CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
SU2 (https://www.cfd-online.com/Forums/su2/)
.cfg not actually writing paraview vtk (or general outputs) (https://www.cfd-online.com/Forums/su2/176402-cfg-not-actually-writing-paraview-vtk-general-outputs.html)

Jasfss August 17, 2016 12:51

.cfg not actually writing paraview vtk (or general outputs)
 
I've been running the simple Euler NACA0012 test case to validate my SU2 installation (after having to rebuild and reinstall to enable MPI support), and I've noticed something odd. Even though I've specified "PARAVIEW" as the output format, and it is reported as such in the console output, nothing is being written as ParaView .vtk; the only files being written are the adjoint files (as .dat). I'm not really sure what to check or fix in this case. I can provide a copy of the .cfg and the terminal output if that would help.

hlk August 17, 2016 14:57

Quote:

Originally Posted by Jasfss (Post 614398)
I've been running the simple Euler NACA0012 test case to validate my SU2 installation (after having to rebuild and reinstall to enable MPI support), and I've noticed something odd. Even though I've specified "PARAVIEW" as the output format, and it is reported as such in the console output, nothing is being written as ParaView .vtk; the only files being written are the adjoint files (as .dat). I'm not really sure what to check or fix in this case. I can provide a copy of the .cfg and the terminal output if that would help.

Thanks for your question.
It sounds like this had previously worked in serial, so without seeing more detail my only guess is that something went wrong with the parallel installation.
A common symptom that would confirm a failed parallel installation is repeated lines of output, i.e. several lines with iteration #0, which happens when each MPI rank runs its own independent serial copy of the solver. Please check whether you have this problem, and re-check the installation instructions; if that still doesn't fix it, please post more detail about your output, especially any error messages.
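One quick sanity check, just as a sketch (the banner text "Output Information" is taken from typical SU2 console output and may differ between versions):
Code:

mpirun -np 4 SU2_CFD inv_NACA0012.cfg | tee su2.log
# a true parallel run prints each banner section once; a serial binary
# launched under mpirun prints one copy per rank, so a count above 1
# points to a failed MPI build
grep -c "Output Information" su2.log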

Jasfss August 17, 2016 15:21

Quote:

Originally Posted by hlk (Post 614411)
A common symptom that would confirm a failed parallel installation is repeated lines of output, i.e. several lines with iteration #0, which happens when each MPI rank runs its own independent serial copy of the solver.

This had happened in previous failed attempts to install with mpirun support, but the installation I have now does not have this issue, and I can confirm that it is successfully using MPI.

For example, I take the Euler test case for the NACA0012 and simply modify the .cfg to output to ParaView rather than Tecplot. The relevant line looks roughly like this (option name as in the 2016-era config template; check your own .cfg):
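Code:

% in inv_NACA0012.cfg -- the shipped test case has TECPLOT here
OUTPUT_FORMAT= PARAVIEW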
Then, after navigating to the correct directory, I run
Code:

mpirun -np 4 SU2_CFD inv_NACA0012.cfg
and I can see in the terminal output:

Quote:

-------------------------- Output Information ---------------------------
Writing a flow solution every 250 iterations.
Writing the convergence history every 1 iterations.
The output file format is Paraview ASCII (.vtk).
Convergence history file name: history.
Forces breakdown file name: forces_breakdown.dat.
Surface flow coefficients file name: surface_flow.
Flow variables file name: flow.
Restart flow file name: restart_flow.dat.
So it recognizes that the output files should be written as ParaView .vtk files, as I specified in the .cfg. After it runs through the iterations, I then get
Quote:

-------------------------- File Output Summary --------------------------
Writing comma-separated values (CSV) surface files.
Merging coordinates in the Master node.
Merging solution in the Master node.
Writing SU2 native restart file.
Writing the forces breakdown file.
But the flow and solution_flow outputs are not actually rewritten as .vtk. I still have older versions sitting in the directory from when I was running in serial and outputting to Tecplot, but nothing new has been written for these solutions.
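A quick way to see this is to sort the directory by modification time (file names taken from the output summary above):
Code:

ls -lt flow* surface_flow* restart_flow.dat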

talbring August 18, 2016 11:48

Hi Michael,

This information here should solve your problem (especially the bold text).

Tim

hlk August 18, 2016 12:36

Quote:

Originally Posted by talbring (Post 614576)
Hi Michael,

This information here should solve your problem (especially the bold text).

Tim

Thanks Tim!
I'd also note that if you use parallel_computation.py, the SU2_SOL routine is run automatically.
See the "Turbulent ONERA M6" tutorial.
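For reference, both routes look roughly like this (a sketch only: in parallel, SU2_CFD writes the native restart, and SU2_SOL produces the visualization files from it; exact flags may differ between SU2 versions):
Code:

# convert the restart written by SU2_CFD into the requested .vtk files
mpirun -np 4 SU2_SOL inv_NACA0012.cfg

# or let the wrapper script run SU2_CFD and then SU2_SOL in one go
parallel_computation.py -n 4 -f inv_NACA0012.cfg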

Jasfss August 22, 2016 12:33

Thanks Tim and Heather, all is good now. I appreciate all the help.

