compiling with MPI, METIS, CGNS on iMac
Hi,
I am trying to compile on my Mac. I have downloaded and compiled OpenMPI and CGNS successfully, but I got a lot of warnings with METIS (I used v4.0.3, as suggested elsewhere on here). My configure command for SU2 is as specified on the website; the output from make is below. It looks like a problem with the METIS libraries? Any help much appreciated. Thanks.

Making all in trunk/Common/lib
/usr/local/bin/mpicxx -DPACKAGE_NAME=\"SU\^2\" -DPACKAGE_TARNAME=\"SU\^2\" -DPACKAGE_VERSION=\"2.0\" -DPACKAGE_STRING=\"SU\^2\ 2.0\" -DPACKAGE_BUGREPORT=\"susquared-dev@lists.stanford.edu\" -DPACKAGE=\"SU\^2\" -DVERSION=\"2.0\" -I. -I/Users/Simon/CFD/cgnslib_3.1.4/src/ -I/Users/Simon/CFD/metis-4.0.3/Lib -DMETIS_5 -DNO_TECIO -g -O2 -MT ../src/libSU2_a-config_structure.o -MD -MP -MF ../src/.deps/libSU2_a-config_structure.Tpo -c -o ../src/libSU2_a-config_structure.o `test -f '../src/config_structure.cpp' || echo './'`../src/config_structure.cpp
mv -f ../src/.deps/libSU2_a-config_structure.Tpo ../src/.deps/libSU2_a-config_structure.Po
/usr/local/bin/mpicxx -DPACKAGE_NAME=\"SU\^2\" -DPACKAGE_TARNAME=\"SU\^2\" -DPACKAGE_VERSION=\"2.0\" -DPACKAGE_STRING=\"SU\^2\ 2.0\" -DPACKAGE_BUGREPORT=\"susquared-dev@lists.stanford.edu\" -DPACKAGE=\"SU\^2\" -DVERSION=\"2.0\" -I. -I/Users/Simon/CFD/cgnslib_3.1.4/src/ -I/Users/Simon/CFD/metis-4.0.3/Lib -DMETIS_5 -DNO_TECIO -g -O2 -MT ../src/libSU2_a-dual_grid_structure.o -MD -MP -MF ../src/.deps/libSU2_a-dual_grid_structure.Tpo -c -o ../src/libSU2_a-dual_grid_structure.o `test -f '../src/dual_grid_structure.cpp' || echo './'`../src/dual_grid_structure.cpp
mv -f ../src/.deps/libSU2_a-dual_grid_structure.Tpo ../src/.deps/libSU2_a-dual_grid_structure.Po
/usr/local/bin/mpicxx -DPACKAGE_NAME=\"SU\^2\" -DPACKAGE_TARNAME=\"SU\^2\" -DPACKAGE_VERSION=\"2.0\" -DPACKAGE_STRING=\"SU\^2\ 2.0\" -DPACKAGE_BUGREPORT=\"susquared-dev@lists.stanford.edu\" -DPACKAGE=\"SU\^2\" -DVERSION=\"2.0\" -I. -I/Users/Simon/CFD/cgnslib_3.1.4/src/ -I/Users/Simon/CFD/metis-4.0.3/Lib -DMETIS_5 -DNO_TECIO -g -O2 -MT ../src/libSU2_a-geometry_structure.o -MD -MP -MF ../src/.deps/libSU2_a-geometry_structure.Tpo -c -o ../src/libSU2_a-geometry_structure.o `test -f '../src/geometry_structure.cpp' || echo './'`../src/geometry_structure.cpp
../src/geometry_structure.cpp: In constructor ‘CPhysicalGeometry::CPhysicalGeometry(CConfig*, std::string, short unsigned int, short unsigned int, short unsigned int)’:
../src/geometry_structure.cpp:1131: warning: format ‘%8i’ expects type ‘int’, but argument 3 has type ‘long int’
../src/geometry_structure.cpp:1131: warning: format ‘%8i’ expects type ‘int’, but argument 3 has type ‘long int’
../src/geometry_structure.cpp:1183: warning: format ‘%8i’ expects type ‘int’, but argument 3 has type ‘long int’
../src/geometry_structure.cpp:1183: warning: format ‘%8i’ expects type ‘int’, but argument 3 has type ‘long int’
../src/geometry_structure.cpp: In member function ‘virtual void CPhysicalGeometry::SetColorGrid(CConfig*)’:
../src/geometry_structure.cpp:4297: error: ‘METIS_NOPTIONS’ was not declared in this scope
../src/geometry_structure.cpp:4298: error: ‘options’ was not declared in this scope
../src/geometry_structure.cpp:4298: error: ‘METIS_SetDefaultOptions’ was not declared in this scope
../src/geometry_structure.cpp:4299: error: ‘METIS_OPTION_OBJTYPE’ was not declared in this scope
../src/geometry_structure.cpp:4299: error: ‘METIS_OBJTYPE_CUT’ was not declared in this scope
/Users/Simon/CFD/metis-4.0.3/Lib/proto.h:233: error: too many arguments to function ‘void METIS_PartMeshNodal(int*, int*, idxtype*, int*, int*, int*, int*, idxtype*, idxtype*)’
../src/geometry_structure.cpp:4300: error: at this point in file
make[1]: *** [../src/libSU2_a-geometry_structure.o] Error 1
make: *** [all-recursive] Error 1
Quote:
Could you please get rid of --with-Metis-version=5 in the configure command?

Cheers,
Francisco
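In other words, the build log shows `-DMETIS_5`, so SU2 was compiled against the METIS 5 API while linking METIS 4.0.3's headers; dropping the version flag resolves the mismatch. A sketch of what the corrected invocation might look like; the option names below (`--with-Metis-lib`, `--with-Metis-include`, `--with-CGNS-*`, `--with-MPI`) are my assumption based on the v2-era build system and should be checked against `./configure --help`. The include paths are the ones from the build log above; the lib paths are placeholders:

```
./configure --prefix=/usr/local \
            --with-MPI=/usr/local/bin/mpicxx \
            --with-Metis-lib=/path/to/metis-4.0.3 \
            --with-Metis-include=/Users/Simon/CFD/metis-4.0.3/Lib \
            --with-CGNS-lib=/path/to/cgns/lib \
            --with-CGNS-include=/Users/Simon/CFD/cgnslib_3.1.4/src
            # note: no --with-Metis-version=5
```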
Hi Francisco,
Thanks, removing --with-Metis-version=5 has done the trick. I forgot I had left it in the ./configure command after switching from METIS v5 to v4.

I've run the ONERA M6 tutorial following the parallel computation approach. I ran the parallel_computation.py script in /usr/local/bin, where the SU2 executables are, but it couldn't find the DDC or SU2 executables. Looking at the Python script, it was calling $SU2_HOME/SU2Py, which doesn't contain any of the SU2 executables (I have set my $SU2_HOME to the trunk folder as specified). I modified the Python script to just call SU2_DDC and CFD (calling from /usr/local/bin), and that has worked fine. I'm also curious, though, as there are also SU2_ executables in $SU2_HOME?

I also have a question on memory usage: running the ONERA M6 example in serial, memory usage is ~1150 MB, but in parallel on 4 cores it is ~450 MB per core, which is almost twice as much in total (4 × 450 = 1800 MB)?

Thanks for your help,
Simon.
Have you gone to the $SU2_RUN directory to confirm the location of the Python scripts?

It depends on which release you are using, but I noticed a change in the file structure when I started following the developer releases. For instance, my SU2_RUN stuff (duly added to my path) is all located under /usr/local/bin, but my SU2_HOME just points to SU2_v2.0.3, i.e. no trunk anymore.
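The path confusion above usually comes down to how the environment variables are set. A sketch of the relevant shell profile entries, using the locations mentioned in this thread; the SU2_HOME path is a placeholder for wherever your source tree actually lives:

```shell
# Hypothetical ~/.bash_profile entries; adjust paths to your install.
export SU2_RUN=/usr/local/bin       # where the compiled SU2_CFD, SU2_DDC live
export SU2_HOME=/path/to/SU2        # source tree (older releases used .../trunk)
export PATH=$PATH:$SU2_RUN
export PYTHONPATH=$PYTHONPATH:$SU2_RUN
```

With SU2_RUN on the PATH, the wrapper scripts can invoke the executables without hard-coded paths, which avoids having to edit parallel_computation.py by hand.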
I am using v2.0. It seems there are Python scripts in $SU2_HOME/trunk/SU2Py and also in $SU2_RUN (/usr/local/bin) alongside the compiled executables. There are also executables in $SU2_HOME/trunk dated 8th Jan, so I assume they are binaries compiled at the v2.0 release...

Does anyone have any guidance on memory usage? A rule of thumb for number of elements vs. RAM would be really handy.
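For what it's worth, higher total memory in parallel is expected: each partition stores halo (ghost) copies of the cells on its interfaces with neighboring partitions, so the partitions together hold more data than the serial mesh. A quick sanity check on the numbers reported earlier in this thread (these are the poster's observations for the ONERA M6 case, not general SU2 requirements):

```python
# Memory comparison using the figures reported in this thread:
# ~1150 MB serial, ~450 MB per core on 4 cores.
serial_mb = 1150.0      # observed serial footprint
per_core_mb = 450.0     # observed per-core footprint on 4 cores
cores = 4

total_parallel_mb = per_core_mb * cores
overhead = total_parallel_mb / serial_mb   # total parallel vs. serial

print(f"total parallel: {total_parallel_mb:.0f} MB")   # 1800 MB
print(f"overhead factor: {overhead:.2f}x")             # 1.57x
```

A ~1.5x factor on only 4 partitions is on the high side for halo duplication alone, so some of it may also be per-process fixed overhead; the overhead fraction normally grows with the number of partitions as the surface-to-volume ratio of each partition increases.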