April 12, 2015, 19:18
externalWallHeatFluxTemperature
#1
New Member
Nicholas Wimer
Join Date: Jul 2013
Posts: 3
Rep Power: 12
Hello,
I am trying to incorporate a forced convection heat transfer wall boundary condition in my OpenFOAM simulation using buoyantPimpleFoam. The simulation is of a bottom heated air domain with a moving top wall which is cooled by chilled water with an h = 500W/m^2-K. I have implemented the boundary condition as: roller { type externalWallHeatFluxTemperature; kappa fluidThermo; Ta 300.0; h -500.0; value uniform 300.0; kappaName none; } I believe this is set up correctly as it runs with an h = -10, but OpenFOAM break with this higher h value. I have runtime adjustable timestep with Co = 0.3. The error is: Courant Number mean: -3.90052 max: 5168.52 deltaT = 4.78629e-17 --> FOAM Warning : From function Time:perator++() in file db/Time/Time.C at line 1029 Increased the timePrecision from 9 to 10 to distinguish between timeNames at time 0.00159593 Time = 0.001595932725 diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0 DILUPBiCG: Solving for Ux, Initial residual = 1.61392e-05, Final residual = 4.86299e-11, No Iterations 1 DILUPBiCG: Solving for Uy, Initial residual = 2.36344e-05, Final residual = 5.92983e-12, No Iterations 1 DILUPBiCG: Solving for Uz, Initial residual = 1.87123e-05, Final residual = 2.90491e-11, No Iterations 1 [5] [6] [7] [7] [7] --> FOAM FATAL ERROR: [8] [8] [9] [9] [9] --> FOAM FATAL ERROR: [9] Maximum number of iterations exceeded [9] [9] From function [10] [11] [11] [11] --> FOAM FATAL ERROR: [11] Maximum number of iterations exceeded [11] [11] From function thermo<Thermo, Type>::T(scalar f, scalar T0, scalar (thermo<Thermo, Type>::*F)(const scalar) const, scalar (thermo<Thermo, Type>::*dFdT)(const scalar) const, scalar (thermo<Thermo, Type>::*limit)(const scalar) const) const [11] in file /curc/tools/x_86_64/rh6/openfoam/2.2.1/OpenFOAM-2.2.1/src/thermophysicalModels/specie/lnInclude/thermoI.H at line [0] Thank you, Nick |
April 12, 2015, 20:01
#2
New Member
Nicholas Wimer
Join Date: Jul 2013
Posts: 3
Rep Power: 12
Here is a fuller error message from a run where the Courant number does not explode; any help would be greatly appreciated!
Code:
Courant Number mean: 0.000315859 max: 0.0189003
deltaT = 2.29968e-06
Time = 1.34995e-05

diagonal:  Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG:  Solving for Ux, Initial residual = 0.361192, Final residual = 1.16526e-09, No Iterations 2
DILUPBiCG:  Solving for Uy, Initial residual = 0.222113, Final residual = 5.77092e-10, No Iterations 2
DILUPBiCG:  Solving for Uz, Initial residual = 0.36592, Final residual = 1.04334e-09, No Iterations 2
DILUPBiCG:  Solving for h, Initial residual = 0.995624, Final residual = 3.51201e-09, No Iterations 2
DICPCG:  Solving for G, Initial residual = 0.982306, Final residual = 0.0962459, No Iterations 168
DICPCG:  Solving for p_rgh, Initial residual = 0.996869, Final residual = 0.0023765, No Iterations 2
diagonal:  Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 0.00075017, global = 4.02773e-05, cumulative = 3.93826e-05
DICPCG:  Solving for p_rgh, Initial residual = 0.00344259, Final residual = 6.32211e-09, No Iterations 14
diagonal:  Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 3.77057e-09, global = -1.147e-09, cumulative = 3.93814e-05
[39] #0  Foam::error::printStack(Foam::Ostream&)
--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process. Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption. The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          node1010 (PID 12768)
  MPI_COMM_WORLD rank: 39

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
addr2line failed
[39] #1  Foam::sigFpe::sigHandler(int)
addr2line failed
[39] #2
addr2line failed
[39] #3  Foam::sqrt(Foam::Field<double>&, Foam::UList<double> const&)
addr2line failed
[39] #4  Foam::sqrt(Foam::tmp<Foam::Field<double> > const&)
addr2line failed
[39] #5  Foam::compressible::LESModels::vanDriestDelta::calcDelta()
addr2line failed
[39] #6  Foam::compressible::LESModels::oneEqEddy::correct(Foam::tmp<Foam::GeometricField<Foam::Tensor<double>, Foam::fvPatchField, Foam::volMesh> > const&)
addr2line failed
[39] #7  Foam::compressible::LESModel::correct()
addr2line failed
[39] #8
[39]
[39] #9  __libc_start_main
addr2line failed
[39] #10
[39]
[node1010:12768] *** Process received signal ***
[node1010:12768] Signal: Floating point exception (8)
[node1010:12768] Signal code:  (-6)
[node1010:12768] Failing at address: 0x5b974000031e0
[node1010:12768] [ 0] /lib64/libc.so.6(+0x32920) [0x2b509bf3c920]
[node1010:12768] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x2b509bf3c8a5]
[node1010:12768] [ 2] /lib64/libc.so.6(+0x32920) [0x2b509bf3c920]
[node1010:12768] [ 3] /curc/tools/x_86_64/rh6/openfoam/2.2.1/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam4sqrtERNS_5FieldIdEERKNS_5UListIdEE+0x30) [0x2b509b171310]
[node1010:12768] [ 4] /curc/tools/x_86_64/rh6/openfoam/2.2.1/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam4sqrtERKNS_3tmpINS_5FieldIdEEEE+0x8a) [0x2b509b17bc3a]
[node1010:12768] [ 5] /curc/tools/x_86_64/rh6/openfoam/2.2.1/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libcompressibleLESModels.so(_ZN4Foam12compressible9LESModels14vanDriestDelta9calcDeltaEv+0x516) [0x2b509aa38776]
[node1010:12768] [ 6] /curc/tools/x_86_64/rh6/openfoam/2.2.1/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libcompressibleLESModels.so(_ZN4Foam12compressible9LESModels9oneEqEddy7correctERKNS_3tmpINS_14GeometricFieldINS_6TensorIdEENS_12fvPatchFieldENS_7volMeshEEEEE+0x49) [0x2b509a9f3029]
[node1010:12768] [ 7] /curc/tools/x_86_64/rh6/openfoam/2.2.1/OpenFOAM-2.2.1/platforms/linux64GccDPOpt/lib/libcompressibleLESModels.so(_ZN4Foam12compressible8LESModel7correctEv+0x2f) [0x2b509a99f6ff]
[node1010:12768] [ 8] buoyantPimpleFoam() [0x488ffb]
[node1010:12768] [ 9] /lib64/libc.so.6(__libc_start_main+0xfd) [0x2b509bf28cdd]
[node1010:12768] [10] buoyantPimpleFoam() [0x432215]
[node1010:12768] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 39 with PID 12768 on node node1010 exited on signal 8 (Floating point exception).
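Reading the backtrace: the sigFpe handler fires inside Foam::sqrt, called from vanDriestDelta::calcDelta(), i.e. a square root is being taken of a field value that has gone negative after the solution became unphysical. A minimal Python analogue of that failure mode (the field values below are made up for illustration):

```python
import math

def field_sqrt(field):
    """Element-wise sqrt over a field, raising on negative entries.
    math.sqrt on a negative value raises, loosely mirroring the
    floating-point-exception trap that OpenFOAM's sigFpe installs
    around the C-level sqrt call."""
    bad = [x for x in field if x < 0.0]
    if bad:
        raise ValueError(f"sqrt of negative field entries: {bad}")
    return [math.sqrt(x) for x in field]

# A field that has drifted slightly negative after a bad step (made up):
field = [1.0, 0.25, -3.2e-5]
try:
    field_sqrt(field)
except ValueError as e:
    print("caught:", e)
```

Even a tiny negative entry like the one above is enough to trigger the exception, which is why these crashes often follow one bad time step rather than a gradual drift.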
April 13, 2015, 09:38
#3
New Member
Join Date: Mar 2015
Location: Earth yet
Posts: 25
Rep Power: 11
Did the absolute temperature go negative at a previous step? It usually gives this kind of output when that happens; I've seen it so many times!
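That would also explain the "Maximum number of iterations exceeded" from thermo<Thermo, Type>::T in the first post: it is the iteration-capped Newton solve that inverts h(T) for temperature, and a corrupted energy field can drive it toward T <= 0. A toy Python version of such a solve (constant-cp gas, all constants and the non-physical guard are my own assumptions, not OpenFOAM's actual code):

```python
def T_from_h(h, T0, cp=1005.0, max_iter=100, tol=1e-4):
    """Invert h(T) = cp * T for T by Newton's method, loosely mimicking
    an iteration-capped temperature solve. cp, tol, and the T <= 0 guard
    are illustrative assumptions."""
    T = T0
    for _ in range(max_iter):
        f = cp * T - h           # residual of h(T) = h_target
        if abs(f / cp) < tol:
            return T
        T -= f / cp              # Newton step with dh/dT = cp
        if T <= 0.0:
            raise RuntimeError("temperature driven non-physical (T <= 0)")
    raise RuntimeError("Maximum number of iterations exceeded")
```

With a sane enthalpy this converges immediately; feed it an enthalpy corresponding to a negative temperature and it fails, which is the toy version of the crash above.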