CFD Online Discussion Forums

CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   rhoPimpleFoam floating point error (https://www.cfd-online.com/Forums/openfoam-solving/101345-rhopimplefoam-floating-point-error.html)

dancfd May 1, 2012 23:50

rhoPimpleFoam floating point error
 
1 Attachment(s)
Hello all,

I am trying to run a NACA 0012 airfoil case in OF 2.0.1. The case works with pimpleFoam, and I am trying to make it work with rhoPimpleFoam; this is where I run into trouble. After some iterations, I get a floating point error:

Code:

[3] #0  Foam::error::printStack(Foam::Ostream&) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[3] #1  Foam::sigFpe::sigHandler(int) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[3] #2  in "/lib/x86_64-linux-gnu/libc.so.6"
[3] #3  Foam::hPsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::calculate() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so"
[3] #4  Foam::hPsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::correct() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so"
[3] #5 
[3]  in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoPimpleFoam"
[3] #6  __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
[3] #7 
[3]  in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoPimpleFoam"
[dan-XPS-8300:00764] *** Process received signal ***
[dan-XPS-8300:00764] Signal: Floating point exception (8)
[dan-XPS-8300:00764] Signal code:  (-6)
[dan-XPS-8300:00764] Failing at address: 0x3e8000002fc
[dan-XPS-8300:00764] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x36420) [0x7fbeeb902420]
[dan-XPS-8300:00764] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35) [0x7fbeeb9023a5]
[dan-XPS-8300:00764] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x36420) [0x7fbeeb902420]
[dan-XPS-8300:00764] [ 3] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10hPsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE9calculateEv+0x342) [0x7fbeef093c42]
[dan-XPS-8300:00764] [ 4] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10hPsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE7correctEv+0x32) [0x7fbeef0990c2]
[dan-XPS-8300:00764] [ 5] rhoPimpleFoam() [0x41e185]
[dan-XPS-8300:00764] [ 6] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xed) [0x7fbeeb8ed30d]
[dan-XPS-8300:00764] [ 7] rhoPimpleFoam() [0x419b69]
[dan-XPS-8300:00764] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 3 with PID 764 on node dan-XPS-8300 exited on signal 8 (Floating point exception).

I found a reference that suggested I should add the following to OpenFOAM's bashrc:

Code:

unset FOAM_SIGFPE
But when I do that I get a new error: nan values that first appear in k and omega, then spread to the remaining fields on the next iteration:

Code:

Courant Number mean: 0.0240962 max: 0.900198
deltaT = 1.6889e-05
Time = 0.133221

diagonal:  Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
PIMPLE: iteration 1
smoothSolver:  Solving for Ux, Initial residual = 0.000366418, Final residual = 2.27145e-09, No Iterations 2
smoothSolver:  Solving for Uy, Initial residual = 0.000487366, Final residual = 1.37609e-08, No Iterations 2
smoothSolver:  Solving for h, Initial residual = 0.000464378, Final residual = 2.25201e-09, No Iterations 2
GAMG:  Solving for p, Initial residual = 0.00780075, Final residual = 1.05014e-08, No Iterations 1
diagonal:  Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 0.00378365, global = 0.00357667, cumulative = 4.21379
rho max/min : 2.50066 0.494607
GAMG:  Solving for p, Initial residual = 0.00388573, Final residual = 5.73108e-09, No Iterations 1
diagonal:  Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 0.00378365, global = 0.00357667, cumulative = 4.21737
rho max/min : 2.50066 0.494607
PIMPLE: iteration 2
smoothSolver:  Solving for Ux, Initial residual = 0.000173541, Final residual = 3.69469e-11, No Iterations 4
smoothSolver:  Solving for Uy, Initial residual = 0.000244685, Final residual = 6.60247e-11, No Iterations 5
smoothSolver:  Solving for h, Initial residual = 0.000945961, Final residual = 8.78996e-11, No Iterations 4
GAMG:  Solving for p, Initial residual = 0.0278345, Final residual = 7.14718e-09, No Iterations 1
diagonal:  Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 0.00378365, global = 0.00357667, cumulative = 4.22095
rho max/min : 2.50033 0.497303
GAMG:  Solving for p, Initial residual = 0.0141121, Final residual = 4.07318e-09, No Iterations 1
diagonal:  Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 0.00378365, global = 0.00357667, cumulative = 4.22452
rho max/min : 2.50033 0.497303
smoothSolver:  Solving for omega, Initial residual = 1.71392e-07, Final residual = 3.26293e-11, No Iterations 3
smoothSolver:  Solving for k, Initial residual = 4.32051e-06, Final residual = 7.16007e-11, No Iterations 4
PIMPLE: not converged within 2 iterations
ExecutionTime = 1261.98 s  ClockTime = 1270 s

Courant Number mean: 0.0240905 max: 0.90035
deltaT = 1.68805e-05
Time = 0.133238

diagonal:  Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
PIMPLE: iteration 1
smoothSolver:  Solving for Ux, Initial residual = 0.000366052, Final residual = 2.27079e-09, No Iterations 2
smoothSolver:  Solving for Uy, Initial residual = 0.000487299, Final residual = 1.35382e-08, No Iterations 2
smoothSolver:  Solving for h, Initial residual = 0.000767231, Final residual = 2.28888e-09, No Iterations 2
GAMG:  Solving for p, Initial residual = 0.0288855, Final residual = 1.03388e-08, No Iterations 1
diagonal:  Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 0.00479274, global = 0.00458523, cumulative = 4.22911
rho max/min : 2.50067 0.494011
GAMG:  Solving for p, Initial residual = 0.0142369, Final residual = 5.55694e-09, No Iterations 1
diagonal:  Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 0.00479274, global = 0.00458523, cumulative = 4.23369
rho max/min : 2.50067 0.494011
PIMPLE: iteration 2
smoothSolver:  Solving for Ux, Initial residual = 0.000170074, Final residual = 3.71863e-11, No Iterations 4
smoothSolver:  Solving for Uy, Initial residual = 0.000244105, Final residual = 6.48768e-11, No Iterations 5
smoothSolver:  Solving for h, Initial residual = 0.00239322, Final residual = 8.97771e-11, No Iterations 4
GAMG:  Solving for p, Initial residual = 0.145301, Final residual = 7.51346e-09, No Iterations 1
diagonal:  Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 0.00479275, global = 0.00458523, cumulative = 4.23828
rho max/min : 2.50034 0.497625
GAMG:  Solving for p, Initial residual = 0.0783762, Final residual = 4.55685e-09, No Iterations 1
diagonal:  Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 0.00479275, global = 0.00458523, cumulative = 4.24286
rho max/min : 2.50034 0.497625
smoothSolver:  Solving for omega, Initial residual = nan, Final residual = nan, No Iterations 1000
smoothSolver:  Solving for k, Initial residual = nan, Final residual = nan, No Iterations 1000
[0]
[0]
[0] --> FOAM FATAL IO ERROR:
[0] wrong token type - expected Scalar, found on line 3 the word 'nan'
[0]
[0] [1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] wrong token type - expected Scalar, found on line 3 the word 'nan'
[1]
[1] file: /home/dan/OpenFOAM/dan-2.0.1/run/cases/comp_trans_finetune/transBackRho/processor1/system/data::solverPerformance::k at line 3.
[1]
[1]    From function operator>>(Istream&, Scalar&)[2]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] wrong token type - expected Scalar, found on line 3 the word 'nan'
[2]
[2] file: /home/dan/OpenFOAM/dan-2.0.1/run/cases/comp_trans_finetune/transBackRho/processor2/system/data::solverPerformance::k at line 3.
[2]
[2]    From function operator>>(Istream&, Scalar&)
[2]    in file lnInclude/Scalar.C at line 91.
[2]
FOAM parallel run exiting
[2]
[3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] wrong token type - expected Scalar, found on line 3 the word 'nan'
[3]
[3] file: /home/dan/OpenFOAM/dan-2.0.1/run/cases/comp_trans_finetune/transBackRho/processor3/system/data::solverPerformance::k at line 3.
[3]
[3]    From function operator>>(Istream&, Scalar&)
[3]    in file lnInclude/Scalar.C at line 91.
[3]
FOAM parallel run exiting
[3]
file: /home/dan/OpenFOAM/dan-2.0.1/run/cases/comp_trans_finetune/transBackRho/processor0/system/data::solverPerformance::k at line 3.
[0]
[0]    From function operator>>(Istream&, Scalar&)
[0]    in file lnInclude/Scalar.C at line 91.
[0]
FOAM parallel run exiting
[0]

[1]    in file lnInclude/Scalar.C at line 91.
[1]
FOAM parallel run exiting
[1]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 2 with PID 30679 on
node dan-XPS-8300 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[dan-XPS-8300:30676] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[dan-XPS-8300:30676] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Can someone suggest how I can fix this? My fvSchemes and fvSolution are attached in case they are of assistance.

Thanks in advance,
Dan

olivierG May 2, 2012 06:12

hello,
Try setting relTol to zero for p, and lower your maxCo in controlDict (0.25 at most).

regards,
olivier
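
In dictionary form, that suggestion amounts to something like the following (a generic sketch using standard OpenFOAM 2.x entries, not Dan's actual files):

Code:

// system/fvSolution -- solve p to an absolute tolerance every iteration
p
{
    solver          GAMG;
    smoother        GaussSeidel;
    tolerance       1e-08;
    relTol          0;          // no relative tolerance: converge fully each solve
}

// system/controlDict -- cap the Courant number
adjustTimeStep  yes;
maxCo           0.25;

With adjustTimeStep enabled, the solver shrinks deltaT automatically to respect maxCo.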

dancfd May 2, 2012 20:28

Hello Olivier,

Thanks for the response. I will try that, though I am concerned about using maxCo 0.25 for practical reasons. With an implicit code, and all of the time schemes in OF are implicit, I should be able to get away with a much higher Co; my advisor suggested 20-50 is not unreasonable. Why does OF seem to require such low Courant numbers to remain stable?

Thanks,
Dan

olivierG May 3, 2012 03:09

hello,
Can you share your BCs (the k & omega BCs)? Something may be wrong there.

regards,
olivier
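
For comparison, k and omega files for an external airfoil case in OF 2.x typically look something like this (an illustrative sketch only; patch names and values are placeholders, not Dan's attached setup):

Code:

// 0/k (illustrative values only)
dimensions      [0 2 -2 0 0 0 0];
internalField   uniform 1e-03;
boundaryField
{
    freestream { type inletOutlet;     inletValue uniform 1e-03; value uniform 1e-03; }
    airfoil    { type kqRWallFunction; value uniform 1e-03; }
}

// 0/omega (illustrative values only)
dimensions      [0 0 -1 0 0 0 0];
internalField   uniform 100;
boundaryField
{
    freestream { type inletOutlet;       inletValue uniform 100; value uniform 100; }
    airfoil    { type omegaWallFunction; value uniform 100; }
}

A common cause of nan in k/omega is a freestream omega that is orders of magnitude too low, or a missing wall function on the airfoil patch.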

dancfd May 3, 2012 20:05

1 Attachment(s)
Hello Olivier,

Thanks for your interest. Please find attached my 0 directory.

Regards,

Dan

sihaqqi December 12, 2013 22:08

Daniel

I am having the same error with simpleFoam. Can you please advise how you resolved this nan issue?

regards

dancfd January 5, 2014 20:57

Hello,

Unfortunately, I never did resolve the issue; I moved to rhoCentralFoam instead.

Regards,

Daniel
