HEG cylinder boundary conditions in NEMO

Old   March 22, 2024, 21:24
Default
  #21
Member
 
AN
Join Date: Jan 2024
Posts: 37
Rep Power: 2
KleinMoretti is on a distinguished road
Quote:
Originally Posted by CatarinaGarbacz View Post
For the explicit scheme the CFL will vary depending on the mesh, the state of the simulation, and whether you're using 1st or 2nd order, so there's no specific right or wrong answer.

It's the user's role to test different values, see what works, and gain their own sensitivity to this parameter. When explicit 1st-order simulations converge easily, CFL = 0.5-0.75 usually works. If using 2nd order or a mesh leading to more difficult convergence, CFL = 0.1 could be used. On some rare occasions, where the bow shock is very strong and may lead to the carbuncle problem, I have had to use CFL = 0.01 to get an initial developed flow, and then raised it again. It's a bit of a trial-and-error exercise that's very case dependent.
Thanks for your advice! After compiling the NEMO_AUSMPLUSM_rollback branch, I found that, running on a single core with the explicit scheme in the same situation, this branch is much slower than the 8.0.0 version, only about a tenth of the speed. Then, when I use 8 cores, it gets stuck in cmd. Does this branch not support parallel? Or is there something wrong with my compilation?
KleinMoretti is offline   Reply With Quote

Old   March 24, 2024, 16:12
Default
  #22
New Member
 
Catarina Garbacz
Join Date: Jun 2020
Posts: 23
Rep Power: 5
CatarinaGarbacz is on a distinguished road
Yes, this branch is much slower; it dates back a couple of years and is not up to date with recent performance developments. We are currently working on bringing this fix to the develop branch.

That said, it should work in parallel. It could be a problem with your compilation. If you provide me with the *exact* commands you used, I can check if anything is off.
CatarinaGarbacz is offline   Reply With Quote

Old   March 24, 2024, 20:41
Default
  #23
Member
 
AN
Join Date: Jan 2024
Posts: 37
Rep Power: 2
KleinMoretti is on a distinguished road
Quote:
Originally Posted by CatarinaGarbacz View Post
Yes, this branch is much slower; it dates back a couple of years and is not up to date with recent performance developments. We are currently working on bringing this fix to the develop branch.

That said, it should work in parallel. It could be a problem with your compilation. If you provide me with the *exact* commands you used, I can check if anything is off.
I just use 'python meson.py build' and 'ninja -C build'.

I looked at the instructions on the SU2 official website; should I set -Dwith-mpi=enabled? Are there any other options I need to set?
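
A minimal sketch of what such a configure/build could look like, using only the options discussed in this thread (the build-folder name and install prefix below are placeholders):

Code:
# configure a fresh build directory with MPI support enabled
python meson.py build_mpi -Dwith-mpi=enabled --prefix=/path/to/SU2_install
# compile and install
./ninja -C build_mpi install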
KleinMoretti is offline   Reply With Quote

Old   March 24, 2024, 22:30
Default
  #24
Member
 
AN
Join Date: Jan 2024
Posts: 37
Rep Power: 2
KleinMoretti is on a distinguished road
Quote:
Originally Posted by CatarinaGarbacz View Post
Yes, this branch is much slower; it dates back a couple of years and is not up to date with recent performance developments. We are currently working on bringing this fix to the develop branch.

That said, it should work in parallel. It could be a problem with your compilation. If you provide me with the *exact* commands you used, I can check if anything is off.
When I use 'python meson.py buildmpi -Dwith-mpi=enabled', I get some errors.
Attached Images
File Type: jpg 325.jpg (122.5 KB, 5 views)

Last edited by KleinMoretti; March 25, 2024 at 01:32.
KleinMoretti is offline   Reply With Quote

Old   March 24, 2024, 22:45
Default
  #25
Member
 
AN
Join Date: Jan 2024
Posts: 37
Rep Power: 2
KleinMoretti is on a distinguished road
Quote:
Originally Posted by CatarinaGarbacz View Post
Yes, this branch is much slower; it dates back a couple of years and is not up to date with recent performance developments. We are currently working on bringing this fix to the develop branch.

That said, it should work in parallel. It could be a problem with your compilation. If you provide me with the *exact* commands you used, I can check if anything is off.
Build started at 2024-03-25T11:40:24.394888
Main binary: C:\Users\35014\AppData\Local\Programs\Python\Python38\python.exe
Build Options: -Dwith-mpi=enabled
Python system: Windows
The Meson build system
Version: 0.54.999
Source dir: C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback
Build dir: C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback\buildmpi
Build type: native build
Using 'PKG_CONFIG_PATH' from environment with value: 'C:\\msys64\\mingw64\\lib\\pkgconfig'
Using 'PKG_CONFIG_PATH' from environment with value: 'C:\\msys64\\mingw64\\lib\\pkgconfig'
Project name: SU2
Project version: 7.0.8 "Blackbird"
None of 'CC' are defined in the environment, not changing global flags.
None of 'CFLAGS' are defined in the environment, not changing global flags.
None of 'LDFLAGS' are defined in the environment, not changing global flags.
None of 'CPPFLAGS' are defined in the environment, not changing global flags.
None of 'CC_LD' are defined in the environment, not changing global flags.
Sanity testing C compiler: gcc
Is cross compiler: False.
None of 'CC_LD' are defined in the environment, not changing global flags.
Sanity check compiler command line: gcc C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback\buildmpi\meson-private\sanitycheckc.c -o C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback\buildmpi\meson-private\sanitycheckc.exe -pipe
Sanity check compile stdout:

-----
Sanity check compile stderr:

-----
Running test binary command: C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback\buildmpi\meson-private\sanitycheckc.exe
C compiler for the build machine: gcc (gcc 8.1.0 "gcc (x86_64-win32-seh-rev0, Built by MinGW-W64 project) 8.1.0")
C linker for the build machine: gcc ld.bfd 2.30
None of 'AR' are defined in the environment, not changing global flags.
None of 'CXX' are defined in the environment, not changing global flags.
None of 'CXXFLAGS' are defined in the environment, not changing global flags.
None of 'LDFLAGS' are defined in the environment, not changing global flags.
None of 'CPPFLAGS' are defined in the environment, not changing global flags.
None of 'CXX_LD' are defined in the environment, not changing global flags.
Sanity testing C++ compiler: c++
Is cross compiler: False.
None of 'CXX_LD' are defined in the environment, not changing global flags.
Sanity check compiler command line: c++ C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback\buildmpi\meson-private\sanitycheckcpp.cc -o C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback\buildmpi\meson-private\sanitycheckcpp.exe -pipe
Sanity check compile stdout:

-----
Sanity check compile stderr:

-----
Running test binary command: C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback\buildmpi\meson-private\sanitycheckcpp.exe
C++ compiler for the build machine: c++ (gcc 8.1.0 "c++ (x86_64-win32-seh-rev0, Built by MinGW-W64 project) 8.1.0")
C++ linker for the build machine: c++ ld.bfd 2.30
None of 'CC' are defined in the environment, not changing global flags.
None of 'CFLAGS' are defined in the environment, not changing global flags.
None of 'LDFLAGS' are defined in the environment, not changing global flags.
None of 'CPPFLAGS' are defined in the environment, not changing global flags.
None of 'CC_LD' are defined in the environment, not changing global flags.
Sanity testing C compiler: gcc
Is cross compiler: False.
None of 'CC_LD' are defined in the environment, not changing global flags.
Sanity check compiler command line: gcc C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback\buildmpi\meson-private\sanitycheckc.c -o C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback\buildmpi\meson-private\sanitycheckc.exe -pipe
Sanity check compile stdout:

-----
Sanity check compile stderr:

-----
Running test binary command: C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback\buildmpi\meson-private\sanitycheckc.exe
C compiler for the host machine: gcc (gcc 8.1.0 "gcc (x86_64-win32-seh-rev0, Built by MinGW-W64 project) 8.1.0")
C linker for the host machine: gcc ld.bfd 2.30
None of 'AR' are defined in the environment, not changing global flags.
None of 'CXX' are defined in the environment, not changing global flags.
None of 'CXXFLAGS' are defined in the environment, not changing global flags.
None of 'LDFLAGS' are defined in the environment, not changing global flags.
None of 'CPPFLAGS' are defined in the environment, not changing global flags.
None of 'CXX_LD' are defined in the environment, not changing global flags.
Sanity testing C++ compiler: c++
Is cross compiler: False.
None of 'CXX_LD' are defined in the environment, not changing global flags.
Sanity check compiler command line: c++ C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback\buildmpi\meson-private\sanitycheckcpp.cc -o C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback\buildmpi\meson-private\sanitycheckcpp.exe -pipe
Sanity check compile stdout:

-----
Sanity check compile stderr:

-----
Running test binary command: C:\Users\35014\Desktop\SU2\SU2\SU2-NEMO_AUSMPLUSM_Rollback\buildmpi\meson-private\sanitycheckcpp.exe
C++ compiler for the host machine: c++ (gcc 8.1.0 "c++ (x86_64-win32-seh-rev0, Built by MinGW-W64 project) 8.1.0")
C++ linker for the host machine: c++ ld.bfd 2.30
Build machine cpu family: x86_64
Build machine cpu: x86_64
Host machine cpu family: x86_64
Host machine cpu: x86_64
Target machine cpu family: x86_64
Target machine cpu: x86_64
Program python3 found: YES (C:\Users\35014\AppData\Local\Programs\Python\Python38\python.exe)
Pkg-config binary for MachineChoice.HOST is not cached.
None of 'PKG_CONFIG' are defined in the environment, not changing global flags.
Pkg-config binary missing from cross or native file, or env var undefined.
Trying a default Pkg-config fallback at pkg-config
Trying pkg-config binary pkg-config for machine MachineChoice.HOST at ['C:\\msys64\\mingw64\\bin\\pkg-config.EXE']
Found pkg-config: C:\msys64\mingw64\bin\pkg-config.EXE (0.29.2)
Determining dependency 'ompi-c' with pkg-config executable 'C:\\msys64\\mingw64\\bin\\pkg-config.EXE'
PKG_CONFIG_PATH: C:\msys64\mingw64\lib\pkgconfig
Called `C:\msys64\mingw64\bin\pkg-config.EXE --modversion ompi-c` -> 1

mpicc binary missing from cross or native file, or env var undefined.
Trying a default mpicc fallback at mpicc
mpicc found: NO
Run-time dependency MPI for c found: YES
Pkg-config binary for MachineChoice.HOST is cached.
Determining dependency 'ompi-cxx' with pkg-config executable 'C:\\msys64\\mingw64\\bin\\pkg-config.EXE'
PKG_CONFIG_PATH: C:\msys64\mingw64\lib\pkgconfig
Called `C:\msys64\mingw64\bin\pkg-config.EXE --modversion ompi-cxx` -> 1

mpic++ binary missing from cross or native file, or env var undefined.
Trying a default mpic++ fallback at mpic++
Trying a default mpic++ fallback at mpicxx
Trying a default mpic++ fallback at mpiCC
mpic++ found: NO
Run-time dependency MPI for cpp found: NO (tried pkgconfig and config-tool)

meson.build:38:2: ERROR: Dependency "mpi" not found, tried pkgconfig and config-tool




This is my meson.log.
KleinMoretti is offline   Reply With Quote

Old   March 25, 2024, 02:42
Default
  #26
Member
 
AN
Join Date: Jan 2024
Posts: 37
Rep Power: 2
KleinMoretti is on a distinguished road
It looks like there's something wrong with MS-MPI, but I don't know how to solve it.
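
Since the log points at an MSYS2/MinGW-w64 toolchain, a rough way to check what MPI support that environment actually has is sketched below; the MSYS2 package name is my assumption, not something confirmed in this thread:

Code:
# check whether an MPI compiler wrapper is visible at all (the log shows mpicc/mpic++ were not found)
which mpicc mpicxx mpic++
# list any MPI-related pkg-config entries on the PKG_CONFIG_PATH shown in the log
pkg-config --list-all | grep -i mpi
# if nothing is installed, the MS-MPI package for the MinGW toolchain can be added (assumed package name)
pacman -S mingw-w64-x86_64-msmpi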
KleinMoretti is offline   Reply With Quote

Old   March 25, 2024, 08:33
Default
  #27
New Member
 
Catarina Garbacz
Join Date: Jun 2020
Posts: 23
Rep Power: 5
CatarinaGarbacz is on a distinguished road
So the MPI features are not being found, as per the log file:

HTML Code:
mpicc binary missing from cross or native file, or env var undefined.
Trying a default mpicc fallback at mpicc
mpicc found: NO
Run-time dependency MPI for c found: YES
Pkg-config binary for MachineChoice.HOST is cached.
Determining dependency 'ompi-cxx' with pkg-config executable 'C:\\msys64\\mingw64\\bin\\pkg-config.EXE'
PKG_CONFIG_PATH: C:\msys64\mingw64\lib\pkgconfig
Called `C:\msys64\mingw64\bin\pkg-config.EXE --modversion ompi-cxx` -> 1
mpic++ binary missing from cross or native file, or env var undefined.
Trying a default mpic++ fallback at mpic++
Trying a default mpic++ fallback at mpicxx
Trying a default mpic++ fallback at mpiCC
mpic++ found: NO
Run-time dependency MPI for cpp found: NO (tried pkgconfig and config-tool)
meson.build:38:2: ERROR: Dependency "mpi" not found, tried pkgconfig and config-tool
You need to find the paths to mpicc and mpicxx by running, in the terminal:

which mpicc
which mpicxx

then add those paths to your bashrc file and source it:

export MPICC=<path_to_mpicc>
export MPICXX=<path_to_mpicxx>
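
As a small follow-up sketch (assuming meson picks these variables up at configure time, as suggested above), it helps to re-source the file and reconfigure in a clean build directory so the cached, failed MPI detection is discarded:

Code:
# reload the environment so MPICC/MPICXX are visible
source ~/.bashrc
# confirm the wrappers now resolve
which mpicc mpicxx
# reconfigure from scratch in a fresh build directory
rm -rf buildmpi
python meson.py buildmpi -Dwith-mpi=enabled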
CatarinaGarbacz is offline   Reply With Quote

Old   March 26, 2024, 03:38
Default
  #28
Member
 
AN
Join Date: Jan 2024
Posts: 37
Rep Power: 2
KleinMoretti is on a distinguished road
Quote:
Originally Posted by CatarinaGarbacz View Post
So the MPI features are not being found, as per the log file:

HTML Code:
mpicc binary missing from cross or native file, or env var undefined.
Trying a default mpicc fallback at mpicc
mpicc found: NO
Run-time dependency MPI for c found: YES
Pkg-config binary for MachineChoice.HOST is cached.
Determining dependency 'ompi-cxx' with pkg-config executable 'C:\\msys64\\mingw64\\bin\\pkg-config.EXE'
PKG_CONFIG_PATH: C:\msys64\mingw64\lib\pkgconfig
Called `C:\msys64\mingw64\bin\pkg-config.EXE --modversion ompi-cxx` -> 1
mpic++ binary missing from cross or native file, or env var undefined.
Trying a default mpic++ fallback at mpic++
Trying a default mpic++ fallback at mpicxx
Trying a default mpic++ fallback at mpiCC
mpic++ found: NO
Run-time dependency MPI for cpp found: NO (tried pkgconfig and config-tool)
meson.build:38:2: ERROR: Dependency "mpi" not found, tried pkgconfig and config-tool
You need to find the paths to mpicc and mpicxx by running, in the terminal:

which mpicc
which mpicxx

then add those paths to your bashrc file and source it:

export MPICC=<path_to_mpicc>
export MPICXX=<path_to_mpicxx>
Thank you, but that still doesn't work...
KleinMoretti is offline   Reply With Quote

Old   March 26, 2024, 03:48
Default
  #29
Member
 
AN
Join Date: Jan 2024
Posts: 37
Rep Power: 2
KleinMoretti is on a distinguished road
Quote:
Originally Posted by CatarinaGarbacz View Post
Yes, this branch is much slower; it dates back a couple of years and is not up to date with recent performance developments. We are currently working on bringing this fix to the develop branch.

That said, it should work in parallel. It could be a problem with your compilation. If you provide me with the *exact* commands you used, I can check if anything is off.
Hi, the computation speed of this branch is really too slow. You said before that this branch only modified the AUSMPW scheme in the spatial discretization, right? All the other code related to the wall heat flux is the same or similar, right?

I want to port the AUSMPW scheme from this branch into the 8.0.0 code and then compile it, but I am afraid this branch differs from 8.0.0 by more than the AUSMPW scheme, so I want to make sure about this.
KleinMoretti is offline   Reply With Quote

Old   March 26, 2024, 07:22
Default
  #30
New Member
 
Catarina Garbacz
Join Date: Jun 2020
Posts: 23
Rep Power: 5
CatarinaGarbacz is on a distinguished road
1- It's AUSMPLUSM, as per the name of the branch, not AUSMPW.

2- If you're only able to run it in serial then yes, of course it will be very slow, but this is not a problem in the code, it is a problem with the compilation you did. The only way I can help with that is perhaps to have an online meeting where I can try to compile the code with you.

3- The models are the same, yes, but there are structural differences in the way they are implemented, so you cannot simply "copy-paste". You'd have to dedicate a bit of time to understanding those differences and how to do this correctly.
KleinMoretti likes this.
CatarinaGarbacz is offline   Reply With Quote

Old   March 26, 2024, 08:17
Default
  #31
Member
 
AN
Join Date: Jan 2024
Posts: 37
Rep Power: 2
KleinMoretti is on a distinguished road
Quote:
Originally Posted by CatarinaGarbacz View Post
1- It's AUSMPLUSM, as per the name of the branch, not AUSMPW.

2- If you're only able to run it in serial then yes, of course it will be very slow, but this is not a problem in the code, it is a problem with the compilation you did. The only way I can help with that is perhaps to have an online meeting where I can try to compile the code with you.

3- The models are the same, yes, but there are structural differences in the way they are implemented, so you cannot simply "copy-paste". You'd have to dedicate a bit of time to understanding those differences and how to do this correctly.
Thank you for your kind answer! My idea is to re-implement, in 8.0.0, the scheme that can accurately calculate the wall heat flux. Could you please provide some references?
KleinMoretti is offline   Reply With Quote

Old   March 26, 2024, 08:24
Default
  #32
Member
 
AN
Join Date: Jan 2024
Posts: 37
Rep Power: 2
KleinMoretti is on a distinguished road
I have successfully compiled the MPI build on Linux, and with mpiexec -np 4 it is a little faster, but the improvement is not very significant: the speed has increased by about 15%. Is this normal?
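
For a rough check of whether the parallel build is actually scaling, the same short run (for example with ITER set to a small value) can be timed at a few core counts; the config-file name below is only a placeholder:

Code:
# time a fixed, short run at different core counts and compare wall-clock times
time SU2_CFD heg_cylinder.cfg
time mpiexec -np 2 SU2_CFD heg_cylinder.cfg
time mpiexec -np 4 SU2_CFD heg_cylinder.cfg

If the wall time barely changes between 1 and 4 ranks, the bottleneck is probably not the number of cores.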
KleinMoretti is offline   Reply With Quote

Old   March 26, 2024, 08:55
Default
  #33
New Member
 
Catarina Garbacz
Join Date: Jun 2020
Posts: 23
Rep Power: 5
CatarinaGarbacz is on a distinguished road
Quote:
Originally Posted by KleinMoretti View Post
I have successfully compiled the MPI build on Linux, and with mpiexec -np 4 it is a little faster, but the improvement is not very significant: the speed has increased by about 15%. Is this normal?

It depends on how many cores you're using. In any case, it is known that previous versions are slower; the only thing you can do is use more cores.
CatarinaGarbacz is offline   Reply With Quote

Old   March 26, 2024, 09:09
Default
  #34
New Member
 
Catarina Garbacz
Join Date: Jun 2020
Posts: 23
Rep Power: 5
CatarinaGarbacz is on a distinguished road
Reference for AUSMPLUSM:

"Impact of Anisotropic Mesh Adaptation on the Aerothermodynamics of Atmospheric Reentry" - Fábio Morgado, Catarina Garbacz and Marco Fossati
CatarinaGarbacz is offline   Reply With Quote

Old   March 28, 2024, 02:30
Default
  #35
New Member
 
Liming Yang
Join Date: Sep 2023
Posts: 25
Rep Power: 2
CFDWhite is on a distinguished road
Quote:
Originally Posted by KleinMoretti View Post
I have successfully compiled the MPI build on Linux, and with mpiexec -np 4 it is a little faster, but the improvement is not very significant: the speed has increased by about 15%. Is this normal?
Hello, what additional things did you do to compile successfully? My compilation process didn't report any errors, but it didn't seem to work with MPI.
Attached Images
File Type: jpg mpierror.jpg (79.4 KB, 5 views)
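
On Linux, one quick check when the build completes without errors but MPI does not seem active is to confirm that the SU2_CFD executable was actually linked against an MPI library; a general sketch, not specific to this branch:

Code:
# list the shared libraries the executable links to and look for an MPI library
ldd $(which SU2_CFD) | grep -i mpi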
CFDWhite is offline   Reply With Quote

Old   March 28, 2024, 07:18
Default
  #36
New Member
 
Liming Yang
Join Date: Sep 2023
Posts: 25
Rep Power: 2
CFDWhite is on a distinguished road
Quote:
Originally Posted by CatarinaGarbacz View Post
Reference for AUSMPLUSM:


"Impact of Anisotropic Mesh Adaptation on the Aerothermodynamics of Atmospheric Reentry" - Fábio Morgado, Catarina Garbacz and Marco Fossati
One more question: do you remember the shock position from your original calculation? Have you compared relevant physical quantities such as the shock position and species mass fractions? Thank you in advance for your answers.
CFDWhite is offline   Reply With Quote

Old   March 28, 2024, 08:41
Default
  #37
New Member
 
Catarina Garbacz
Join Date: Jun 2020
Posts: 23
Rep Power: 5
CatarinaGarbacz is on a distinguished road
About compilation, I do the following steps:

./meson.py <build_folder> --prefix=<path_to_su2> -Denable-mpp=true -Dwith-mpi=enabled




./ninja -C <build_folder> install


I add the following lines to the bashrc file:
export SU2_RUN=<path_to_su2>/bin
export SU2_HOME=<path_to_su2>

export PATH=$PATH:$SU2_RUN
export PYTHONPATH=$PYTHONPATH:$SU2_RUN
export MPP_DATA_DIRECTORY=$SU2_HOME/subprojects/Mutationpp/data
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$SU2_HOME/build/subprojects/Mutationpp



The point is: your compilation seems to be correct, but it seems that MPI is not being found on your machine. You need to check the meson.log again and see if MPICC and MPICXX are found. If these are NOT found, MPI will not work... so that's the one thing you need to address, it seems. In my case, to address that, I also added to the bashrc file:

export MPICC=/usr/bin/mpicc
export MPICXX=/usr/bin/mpicxx
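
To check the detection results mentioned above without scrolling through the whole file, the configure log written inside the build folder can be searched directly:

Code:
# search meson's configure log for the MPI detection results
grep -i "mpi" <build_folder>/meson-logs/meson-log.txt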



About the questions on the paper: sorry, but this was done some time ago and I'm not the first author, so I'd say the only information we have available at the moment is what you can find in the paper.



CatarinaGarbacz is offline   Reply With Quote

Old   April 16, 2024, 06:02
Default
  #38
Member
 
AN
Join Date: Jan 2024
Posts: 37
Rep Power: 2
KleinMoretti is on a distinguished road
Quote:
Originally Posted by CatarinaGarbacz View Post
It depends on how many cores you're using. In any case, it is known that previous versions are slower; the only thing you can do is use more cores.
Hi, Catarina!

Thanks for your help some days ago.

I carefully read the rollback code and found a lot of differences, including the AUSMPLUSM function that calculates the convective flux residual, and the implementation of the farfield, symmetry plane, supersonic outlet and isothermal non-catalytic wall boundary conditions. I think I've got almost everything right, but there are still a few things I'm not very sure about.

One of my puzzles is h_k (or h) in the AUSMPLUSM scheme, which is called the pressure sensor. I set "ITER=1" and tried using "cout<<h_k;" to print h_k in rollback and h_k in 8.0.0 on the same grid. To my surprise, there was a noticeable difference.
KleinMoretti is offline   Reply With Quote

Old   April 16, 2024, 06:07
Default
  #39
Member
 
AN
Join Date: Jan 2024
Posts: 37
Rep Power: 2
KleinMoretti is on a distinguished road
Quote:
Originally Posted by KleinMoretti View Post
Hi, Catarina!

Thanks for your help some days ago.

I carefully read the rollback code and found a lot of differences, including the AUSMPLUSM function that calculates the convective flux residual, and the implementation of the farfield, symmetry plane, supersonic outlet and isothermal non-catalytic wall boundary conditions. I think I've got almost everything right, but there are still a few things I'm not very sure about.

One of my puzzles is h_k (or h) in the AUSMPLUSM scheme, which is called the pressure sensor. I set "ITER=1" and tried using "cout<<h_k;" to print h_k in rollback and h_k in 8.0.0 on the same grid. To my surprise, there was a noticeable difference.
Simulation Run using the Single-zone Driver
3.25229e-316:this is h_k in rollback
3.25229e-316:this is h_k in rollback
3.25229e-316:this is h_k in rollback
(the same line is repeated for the rest of the output)




Simulation Run using the Single-zone Driver
1:this is h in 8.0.0
1:this is h in 8.0.0
1:this is h in 8.0.0
(the same line is repeated for the rest of the output)
KleinMoretti is offline   Reply With Quote

Old   April 16, 2024, 06:17
Default
  #40
Member
 
AN
Join Date: Jan 2024
Posts: 37
Rep Power: 2
KleinMoretti is on a distinguished road
I used "cout" on the 5x5 grid of the thermal bath example. h is 1 in 8.0.0, which agrees with my expectation: the initial field is given by the free-stream condition, so the pressure is the same at every point on the grid and h should be 1. But why is it close to 0 in rollback?
KleinMoretti is offline   Reply With Quote
