November 12, 2015, 12:59 |
Segfault with a periodic domain in parallel
#1 |
New Member
Join Date: Nov 2012
Posts: 7
Rep Power: 13
Dear all.
I am trying to perform a computation on a periodic mesh (a 10 degree sector of a rotor). The mesh was generated with Ansys Meshing and exported to CGNS; I then run SU2_MSH convert_mesh.cfg to generate a .su2 mesh file with the ghost nodes. If I run SU2_CFD comp_o1.cfg it runs fine, but as soon as I run it in parallel I get a segfault. valgrind reports a first invalid write in the call to ParMETIS, and gdb gives a backtrace in CPhysicalGeometry::SetSendReceive.

The files are available here: https://drive.google.com/folderview?...GM&usp=sharing

Do you know what goes wrong?

Best regards,
Xavier
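For clarity, the sequence I follow is roughly this:

Code:
# 1. Convert the CGNS mesh and add the periodic ghost nodes
SU2_MSH convert_mesh.cfg

# 2. Serial run: works fine
SU2_CFD comp_o1.cfg

# 3. Parallel run: segfault (valgrind points at the ParMETIS call)
mpiexec -n 2 SU2_CFD comp_o1.cfg

And the relevant part of convert_mesh.cfg looks roughly like the excerpt below. The rotation axis and center are illustrative values for a 10 degree sector, not copied from my actual file, and I am guessing the CGNS file name here:

Code:
% ( periodic marker, donor marker, rot_center_x, rot_center_y, rot_center_z,
%   rotation_angle_x-axis, rotation_angle_y-axis, rotation_angle_z-axis,
%   translation_x, translation_y, translation_z )
MARKER_PERIODIC= ( Periodic1, Periodic2, 0.0, 0.0, 0.0, 10.0, 0.0, 0.0, 0.0, 0.0, 0.0 )
MESH_FORMAT= CGNS
MESH_FILENAME= RO37.cgns
MESH_OUT_FILENAME= mesh_out.su2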
December 3, 2015, 08:11 |
#2 |
New Member
Join Date: Nov 2012
Posts: 7
Rep Power: 13
I am sorry, the mesh file was missing. I have added it, as well as a smaller 2D example (folder LS89) for which the problem also appears.

The 2D case runs fine on 1 and 2 processes, but a segfault occurs with 4 processes. I am not sure it is related, but SU2_MSH reports mismatches between the periodic nodes of the order of 1e-8 or smaller for all nodes, while mpiexec -n 2 SU2_CFD LS89.cfg reports much larger values, for example "Bad match for point 9439. Nearest donor distance: 1.9809954119e-02".

Best regards
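For reference, the only thing that changes between these runs is the process count (file names as in the LS89 folder):

Code:
SU2_CFD LS89.cfg                 # serial: fine
mpiexec -n 2 SU2_CFD LS89.cfg    # runs, but reports large donor distances
mpiexec -n 4 SU2_CFD LS89.cfg    # segfault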
December 3, 2015, 11:58 |
#3 |
New Member
Join Date: Dec 2013
Location: Italy
Posts: 29
Rep Power: 12
Dear Xavier,
I tested comp_o1.cfg using the original CGNS grid, and both in serial and in parallel (up to 8 procs) there is no problem at all. When running SU2_MSH I get the same error as you: "Bad match for point 638439. Nearest donor distance: 3.7528684773e-06" when checking Periodic1 and Periodic2.

Comparing the two grids in .su2 format (the one from SU2_MSH and the one that SU2_CFD writes out when given the CGNS grid as input), they are identical. If I run the RO37.su2 grid on a single processor it is OK, but at the end of the run I get this message: "The surface element (5, 6096) doesn't have an associated volume element." So even if it runs, I don't know whether the solution is computed correctly. On two procs, instead, the code stops when calling the MPI communication; the same happens, as expected, if I load mesh_out.su2.

Unfortunately, I could not figure out where the problem is, so for now my suggestion is to read the CGNS grid directly in SU2_CFD and run without loading the SU2 format.
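In practice this just means pointing the config at the CGNS file instead of the converted mesh (a minimal excerpt; I am assuming the original CGNS file is named RO37.cgns):

Code:
% Load the original CGNS grid directly instead of the converted .su2 file
MESH_FORMAT= CGNS
MESH_FILENAME= RO37.cgns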
December 16, 2015, 07:44 |
#4 |
New Member
Join Date: Nov 2012
Posts: 7
Rep Power: 13
Dear Jiba,
As far as I understand, it is necessary to use SU2_MSH to convert the mesh to the .su2 format in order to add the ghost nodes / ghost cells; otherwise periodicity is not enforced. In the LS89 test case, for example, the computation on the CGNS mesh crashes (and if I only perform a few iterations, the solution is not periodic). Is it possible to run a periodic computation directly from the CGNS mesh, without any conversion?

Thank you very much for your help.

Best regards
Xavier
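To make the comparison concrete, the two setups I have tried differ only in how the mesh is loaded (excerpts; LS89.cgns is a guess at the original file name):

Code:
% Setup A: converted mesh with ghost nodes (serial OK, parallel segfaults)
MESH_FORMAT= SU2
MESH_FILENAME= mesh_out.su2

% Setup B: original CGNS mesh (runs, but the solution is not periodic)
MESH_FORMAT= CGNS
MESH_FILENAME= LS89.cgns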
June 16, 2016, 04:51 |
#5 |
New Member
Join Date: Jun 2016
Posts: 7
Rep Power: 10
Hi Xavier,
I am now running Rotor 37 with SU2, and I am facing the same segmentation fault when running in parallel. Do you have any idea how to solve it? Thanks!
June 29, 2016, 07:30 |
#6 |
New Member
Join Date: Nov 2012
Posts: 7
Rep Power: 13
No, I didn't figure out a way to solve this problem. Sorry!