#0 Foam::error::printStack(Foam::Ostream&) at ??:? - parallel run

March 28, 2023, 15:17   #1
harsha002 (harshavardhan)
New Member
Join Date: Nov 2017
Posts: 20
Hi
I am running a spray injection simulation studying evaporative cooling in a domain 1.9 m long and 0.585 m wide. The simulation is transient and runs in parallel. I run just the flow for the first 5 seconds and start injecting droplets after 5 seconds. The simulation runs fine for those 5 seconds and then throws an error as soon as the spray is initialized. At the inlet, p is zeroGradient and U is surfaceNormalFixedValue; at the outlet, p is fixedValue (100000) and U is pressureInletOutletVelocity.

I am new to OpenFOAM and am getting the hang of it to an extent, but I could not understand the error below, which I keep running into in this simulation. Any guidance would be of great help.
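For reference, the boundary conditions and the delayed injection are set up roughly like the sketch below. The patch names, the inlet speed and the injector type here are placeholders rather than my exact dictionaries:

// 0/p, boundaryField
inlet
{
    type            zeroGradient;
}
outlet
{
    type            fixedValue;
    value           uniform 100000;
}

// 0/U, boundaryField
inlet
{
    type            surfaceNormalFixedValue;
    refValue        uniform -2;        // placeholder speed; negative points into the domain
}
outlet
{
    type            pressureInletOutletVelocity;
    value           uniform (0 0 0);
}

// constant/<cloud>Properties: droplets start after 5 s of pure flow
injectionModels
{
    injector1
    {
        type            coneNozzleInjection;   // placeholder injector type
        SOI             5;                     // start of injection, in seconds
        ...                                    // remaining injector settings
    }
}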





Loading OpenFOAM/2112
Loading requirement: openmpi/4.1.2
[25] #0 Foam::error::printStack(Foam::Ostream&) at ??:?
[25] #1 Foam::sigFpe::sigHandler(int) at ??:?
[25] #2 ? in /lib64/libpthread.so.0
[25] #3 Foam::divide(Foam::Field<double>&, Foam::UList<double> const&, Foam::UList<double> const&) at ??:?
[25] #4 void Foam::divide<Foam::fvPatchField, Foam::volMesh>(Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&, Foam::GeometricField<double, Foam::f$
[25] #5 Foam::tmp<Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> > Foam::operator/<Foam::fvPatchField, Foam::volMesh>(Foam::tmp<Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> > const&, Foam::GeometricF$
[25] #6 Foam::kOmegaSSTBase<Foam::eddyViscosity<Foam::RASModel<Foam::EddyDiffusivity<Foam::ThermalDiffusivity<Foam::CompressibleTurbulenceModel<Foam::fluidThermo> > > > > >::F1(Foam::GeometricField<double, Foam::fvPatchField, Foam::vol$
[25] #7 Foam::kOmegaSSTBase<Foam::eddyViscosity<Foam::RASModel<Foam::EddyDiffusivity<Foam::ThermalDiffusivity<Foam::CompressibleTurbulenceModel<Foam::fluidThermo> > > > > >::correct() at ??:?
[25] #8 ? at ??:?
[25] #9 __libc_start_main in /lib64/libc.so.6
[25] #10 ? at ??:?
[gadi-cpu-clx-0861:2926199:0:2926199] Caught signal 8 (Floating point exception: tkill(2) or tgkill(2))
==== backtrace (tid:2926199) ====
0 0x0000000000012cf0 __funlockfile() :0
1 0x0000000000012b8f gsignal() :0
2 0x0000000000012cf0 __funlockfile() :0
3 0x000000000089469a Foam::divide() ???:0
4 0x0000000000474e1f Foam::divide<Foam::fvPatchField, Foam::volMesh>() ???:0
5 0x0000000000196c8b Foam::operator/<Foam::fvPatchField, Foam::volMesh>() ???:0
6 0x00000000001bd22c Foam::kOmegaSSTBase<Foam::eddyViscosity<Foam::RASModel<Foam::EddyDiffusivity<Foam::ThermalDiffusivity<Foam::CompressibleTurbulenceModel<Foam::fluidThermo> > > > > >::F1() ???:0
7 0x0000000000203552 Foam::kOmegaSSTBase<Foam::eddyViscosity<Foam::RASModel<Foam::EddyDiffusivity<Foam::ThermalDiffusivity<Foam::CompressibleTurbulenceModel<Foam::fluidThermo> > > > > >::correct() ???:0
8 0x000000000044c8a7 main() ???:0
9 0x000000000003ad85 __libc_start_main() ???:0
10 0x000000000044de9e _start() ???:0
=================================
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 25 with PID 0 on node gadi-cpu-clx-0861 exited on signal 8 (Floating point exception).
--------------------------------------------------------------------------
[gadi-cpu-clx-0861:2926551] *** An error occurred in MPI_Recv
[gadi-cpu-clx-0861:2926551] *** reported by process [2610561025,2]
[gadi-cpu-clx-0861:2926551] *** on communicator MPI_COMM_WORLD
[gadi-cpu-clx-0861:2926551] *** MPI_ERR_TRUNCATE: message truncated
[gadi-cpu-clx-0861:2926551] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[gadi-cpu-clx-0861:2926551] *** and potentially your MPI job)
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0]
[0]
[0] --> FOAM FATAL IO ERROR: (openfoam-2112)
[0] Bad token - could not get fileName
[0]
[0] file: stream at line 0.
[0]
[0] From Foam::Istream& Foam::operator>>(Foam::Istream&, Foam::fileName&)
[0] in file primitives/strings/fileName/fileNameIO.C at line 69.
[0]

March 28, 2023, 15:19   #2
harsha002 (harshavardhan)
New Member
Join Date: Nov 2017
Posts: 20
I understand that there is a division by zero, but I am not sure why it is happening.
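Looking at the backtrace, the divide seems to happen inside the kOmegaSST F1 blending function. Writing that blending function in its usual textbook form (this is Menter's formula as commonly quoted, not copied from the OpenFOAM source), it contains several divisions by omega and by the wall distance y:

\[
F_1 = \tanh\left(\mathrm{arg}_1^{\,4}\right), \qquad
\mathrm{arg}_1 = \min\left[\max\left(\frac{\sqrt{k}}{\beta^{*}\,\omega\,y},\ \frac{500\,\nu}{y^{2}\,\omega}\right),\ \frac{4\,\rho\,\sigma_{\omega 2}\,k}{CD_{k\omega}\,y^{2}}\right]
\]
\[
CD_{k\omega} = \max\left(\frac{2\,\rho\,\sigma_{\omega 2}}{\omega}\,\nabla k \cdot \nabla\omega,\ \text{a small positive number}\right)
\]

So my current guess is that omega (or the wall distance) drops to zero somewhere, for instance in the cells where the parcels are first introduced, and one of these terms blows up and triggers the floating point exception at the moment the spray starts.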

