Running CFD in parallel. DDC isn't working.
Hello guys! I tried to run the ONERA M6 case in parallel using parallel_computation.py and got an error reporting "There is no geometry file (GetnZone)!"
What might that be?

The command:
Code:
mpirun -np 8 -machinefile hosts /scratch/ramos/su2mpi/bin/SU2_CFD config_CFD.cfg

Code:
config_CFD.cfg
This error indicates that the program was not able to find the mesh file. From the error message, it is looking for "mesh_ONERAM6_inv_1.su2", so most likely that mesh file does not exist in the working directory.
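A quick way to confirm (assuming a Linux shell, run from the working directory):

Code:
# List the ONERA M6 mesh files present. If only the original
# mesh_ONERAM6_inv.su2 appears, the partitioned meshes
# (mesh_ONERAM6_inv_1.su2, ...) were never created.
ls mesh_ONERAM6_inv*.su2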
Based on your scratch file, it looks like what might be happening is that the mesh is never decomposed. The parallel computation looks for mesh files with a partition index appended to the name (e.g. "_1") because it expects that the mesh has already been split. The decomposition is executed automatically if you use the parallel_computation.py script, and you can also run SU2_DDC manually if you would prefer. Since it looks like neither of these is being executed, the mesh is never divided. I suggest using parallel_computation.py. For more details, please see http://adl-public.stanford.edu/docs/...ED/Running+SU2
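For reference, a typical invocation from the directory containing the mesh and config file might look like the sketch below. The -f and -p flags are assumptions based on the parallel_computation.py of this SU2 era; check the script's --help output for your version.

Code:
# Decompose the mesh and launch the parallel run in one step
# (-f: config file, -p: number of partitions; flags assumed, see --help)
parallel_computation.py -f config_CFD.cfg -p 8

# Or decompose manually with SU2_DDC, then run the solver yourself:
mpirun -np 8 -machinefile hosts /scratch/ramos/su2mpi/bin/SU2_DDC config_DDC.cfg
mpirun -np 8 -machinefile hosts /scratch/ramos/su2mpi/bin/SU2_CFD config_CFD.cfg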
Quote:
I realized that DDC wasn't creating new meshes. After correcting some other minor mistakes, I'm now facing a new error in which DDC assigns all of the original mesh points to the first domain only.

Code:
---------------------- Read grid file information -----------------------

Command:
Code:
mpirun -np 4 -machinefile hosts /scratch/ramos/su2mpi/bin/SU2_DDC config_DDC.cfg

Thanks!
I'm not certain (maybe someone else on the forum can jump in if they recognize this error), but as a first step I would suggest double-checking that the code was correctly configured for parallel execution.
For more details on configuring with the parallel tools: http://adl-public.stanford.edu/docs/...on+from+Source
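One quick sanity check (assuming a Linux system with dynamically linked binaries) is to confirm that the executables were actually built against your MPI library:

Code:
# If this prints no MPI libraries, the binary was likely
# built without MPI support and needs to be reconfigured.
ldd /scratch/ramos/su2mpi/bin/SU2_DDC | grep -i mpi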
Quote:
Well... After some googling, I think exit code 139 is the shell's convention for a segmentation fault (128 + signal 11, SIGSEGV). Another thing: DDC is sending all of the grid's points to the first domain (the first MPI rank), leaving nothing for the others, as you can see in this log.

Code:
Domain 1: 108396 points (0 ghost points). Comm buff: 21.98MB of 50.00MB.

Can you help me?
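For reference, that reading of 139 is correct: a POSIX shell reports 128 + the signal number when a process is killed by a signal, so 139 corresponds to signal 11 (SIGSEGV). A minimal demonstration (the Python one-liner is just an illustrative way to trigger the signal):

Code:
# Kill the process with SIGSEGV, then print the shell's exit status.
python -c 'import os, signal; os.kill(os.getpid(), signal.SIGSEGV)'
echo $?   # prints 139 (= 128 + 11)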
Hi CrashLaker,
I believe Heather's correct: the error is clearly in the partitioning of the mesh, and I would suspect it has something to do with the configuration of SU2 and/or its link to Metis. Please be absolutely sure you've run the SU2 configure script pointing at your MPI compiler and with Metis enabled. Also, please 'make clean' first to ensure that no old binaries with out-of-date configuration settings are left hanging around.

-Sean
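As a rough sketch of what that rebuild might look like (the --with-MPI and --with-Metis-* flag names are assumptions based on SU2 configure scripts of this era; run ./configure --help to confirm the exact options for your release):

Code:
# Rebuild from a clean tree with MPI and Metis enabled
# (flag names assumed; verify with ./configure --help)
make clean
./configure --prefix=/scratch/ramos/su2mpi \
            --with-MPI=mpicxx \
            --with-Metis-lib=/path/to/metis/lib \
            --with-Metis-include=/path/to/metis/include
make
make install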
Quote:
Isn't there anything else I should try? I've already recompiled it many times.