Foam fatal error with rhoPimpleDyMFoam in parallel

July 7, 2017, 03:42   #1
decibelle (Member; Join Date: May 2017; Posts: 38)
Hello,

I am working on a compressible, subsonic, external-aerodynamics case with rhoPimpleDyMFoam. When I run the simulation in parallel, after a few iterations I get this fatal error:

Code:
[14]
[14] --> FOAM FATAL ERROR:
[14] Maximum number of iterations exceeded
[14]
[14]     From function Foam::scalar Foam::species::thermo<Thermo, Type>::T(Foam::scalar, Foam::scalar, Foam::scalar, Foam::scalar (Foam::species::thermo<Thermo, Type>::*)(Foam::scalar, Foam::scalar)const, Foam::scalar (Foam::species::thermo<Thermo, Type>::*)(Foam::scalar, Foam::scalar)const, Foam::scalar (Foam::species::thermo<Thermo, Type>::*)(Foam::scalar)const) const [with Thermo = Foam::hConstThermo<Foam::perfectGas<Foam::specie> >; Type = Foam::sensibleInternalEnergy; Foam::scalar = double; Foam::species::thermo<Thermo, Type> = Foam::species::thermo<Foam::hConstThermo<Foam::perfectGas<Foam::specie> >, Foam::sensibleInternalEnergy>]
[14]     in file /Software/OpenFOAM/OpenFOAM-v1606+/src/thermophysicalModels/specie/lnInclude/thermoI.H at line 66.
[14]
FOAM parallel run aborting
[14]
--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process.  Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption.  The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          cfdnode09 (PID 31528)
  MPI_COMM_WORLD rank: 14

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
[14] #0  Foam::error::printStack(Foam::Ostream&) in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
[14] #1  Foam::error::abort() in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
[14] #2  Foam::species::thermo<Foam::hConstThermo<Foam::perfectGas<Foam::specie> >, Foam::sensibleInternalEnergy>::TEs(double, double, double) const in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libfluidThermophysicalModels.so"
[14] #3  Foam::hePsiThermo<Foam::psiThermo, Foam::pureMixture<Foam::constTransport<Foam::species::thermo<Foam::hConstThermo<Foam::perfectGas<Foam::specie> >, Foam::sensibleInternalEnergy> > > >::calculate() in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libfluidThermophysicalModels.so"
[14] #4  Foam::hePsiThermo<Foam::psiThermo, Foam::pureMixture<Foam::constTransport<Foam::species::thermo<Foam::hConstThermo<Foam::perfectGas<Foam::specie> >, Foam::sensibleInternalEnergy> > > >::correct() in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libfluidThermophysicalModels.so"
[14] #5  ? in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/bin/rhoPimpleDyMFoam"
[14] #6  __libc_start_main in "/lib64/libc.so.6"
[14] #7  ? in "/Software/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/bin/rhoPimpleDyMFoam"
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 14 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
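If I read the trace correctly, the abort comes from the Newton iteration in thermoI.H that inverts e(T) to recover the temperature from the solved internal energy: when the energy field picks up unphysical values (for example NaN or inf after the pressure solution goes wrong), the convergence test can never be satisfied and the solver stops with "Maximum number of iterations exceeded". Below is a minimal, self-contained sketch of that kind of inversion, just to illustrate the failure mode; it is not the actual OpenFOAM source, and Cv, the tolerance, and the iteration limit are illustrative values:

Code:
#include <cmath>
#include <cstdio>
#include <stdexcept>

// Illustrative constant-Cv perfect gas, in the spirit of
// hConstThermo + sensibleInternalEnergy (numbers are made up).
constexpr double Cv = 718.0;             // J/(kg K)
double e(double T)   { return Cv*T; }    // internal energy e(T)
double dedT(double)  { return Cv; }      // de/dT

// Newton inversion of e(T) = eTarget, patterned loosely after
// Foam::species::thermo::T() (a sketch, not the real code).
double TfromE(double eTarget, double T0)
{
    const int    maxIter = 100;
    const double tol     = 1e-4;
    double T = T0;
    for (int iter = 0; iter < maxIter; ++iter)
    {
        const double Tnew = T - (e(T) - eTarget)/dedT(T);
        if (std::fabs(Tnew - T) <= tol*T0) return Tnew;  // converged
        T = Tnew;
    }
    // A NaN/inf energy makes the test above fail forever, so we land here:
    throw std::runtime_error("Maximum number of iterations exceeded");
}

int main()
{
    std::printf("%g K\n", TfromE(Cv*300.0, 350.0));      // converges to 300 K
    try { TfromE(NAN, 350.0); }                          // unphysical energy
    catch (const std::exception& ex) { std::printf("%s\n", ex.what()); }
}
If that is what is happening here, the thermo error is only the symptom and the real problem is upstream, in the pressure or energy solution.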
I have checked my boundary conditions and my relaxation factors, but I cannot find any mistake.

This is my fvSolution file:
Code:
solvers
{
    "rho.*"
    {
        solver          diagonal;
    }

    "p.*"
    {
        solver          GAMG;
        preconditioner  DILU;
        tolerance       1e-7;
        relTol          0;
        smoother        GaussSeidel;
    }

    "(U|e).*"
    {
        $p;
        tolerance       1e-6;
        smoother  DILU;
    }

    "(k|epsilon).*"
    {
        $p;
        tolerance       1e-6;
    }
    cellDisplacement
    {
        solver          GAMG;
        tolerance       1e-5;
        relTol          0;
        smoother        GaussSeidel;
        cacheAgglomeration true;
        nCellsInCoarsestLevel 10;
        agglomerator    faceAreaPair;
        mergeLevels     1;
    }

}
PIMPLE
{
    nOuterCorrectors 1;
    nCorrectors      2;
    nNonOrthogonalCorrectors 0;
}

relaxationFactors
{
    fields
    {
        p               0.3;
    }
    equations
    {
        "(U|k|epsilon)"   0.7;
        "(U|k|epsilon)Final" 1.0;
    }
}
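One workaround I have seen suggested for this error is to bound the temperature through fvOptions while the startup transient settles, for example (a sketch only: the bounds are illustrative, and the exact dictionary layout, flat keywords versus a limitTemperatureCoeffs subdictionary, should be checked against v1606+):

Code:
// constant/fvOptions (or system/fvOptions, depending on version)
limitT
{
    type            limitTemperature;
    active          yes;
    selectionMode   all;
    min             200;    // illustrative lower bound [K]
    max             600;    // illustrative upper bound [K]
}
Of course this only masks the symptom; if the fields keep diverging, the real cause still has to be found.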
And these are my boundary conditions.

For U:
Code:
dimensions      [0 1 -1 0 0 0 0];

internalField   uniform (0 238 0);

boundaryField
{
    inlet
    {
        type            freestream;
        pInf            100000;
        TInf            300;
        UInf            (0 238 0);
        gamma           1.4;
        freestreamValue           uniform (0 238 0);
    }

    outlet
    {
        type            inletOutlet;
        inletValue      uniform (0 238 0);
        value           uniform (0 238 0);
    }

    capsule
    {
        type            zeroGradient;
    }

    far_field
    {
        type            noSlip;
    }
}
For p:
Code:
dimensions      [1 -1 -2 0 0 0 0];

internalField   uniform 100000;

boundaryField
{
    inlet
    {
        type            zeroGradient;
    }

    outlet
    {
        type            inletOutlet;
        inletValue      uniform 100000;
        value           uniform 100000;
    }

    capsule
    {
        type            zeroGradient;
    }

    far_field
    {
        type            zeroGradient;
    }
}
For T:
Code:
dimensions      [0 0 0 1 0 0 0];

internalField   uniform 300;

boundaryField
{
    inlet
    {
        type            inletOutlet;
        inletValue      uniform 300;
        value           uniform 300;
    }

    outlet
    {
        type            inletOutlet;
        inletValue      uniform 300;
        value           uniform 300;
    }

    capsule
    {
        type            zeroGradient;
    }

    far_field
    {
        type            zeroGradient;
    }
}
For nut:
Code:
dimensions      [0 2 -1 0 0 0 0];

internalField   uniform 0;

boundaryField
{
    inlet
    {
        type            calculated;
        value           uniform 0;
    }
    outlet
    {
        type            calculated;
        value           uniform 0;
    }
    capsule
    {
        type            nutkWallFunction;
        Cmu             0.09;
        kappa           0.41;
        E               9.8;
        value           uniform 0;
    }
    far_field
    {
        type            zeroGradient;
    }
}
For alphat:
Code:
dimensions      [1 -1 -1 0 0 0 0];

internalField   uniform 0;

boundaryField
{
    inlet
    {
        type            calculated;
        value           uniform 0;
    }
    outlet
    {
        type            calculated;
        value           uniform 0;
    }
    capsule
    {
        type            compressible::alphatWallFunction;
        Prt             0.85;
        value           uniform 0;
    }
    far_field
    {
        type            zeroGradient;
    }
}
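One more thing that is often suggested for transient compressible runs with mesh motion: keep the Courant number under control from controlDict, for example (a sketch; the values are illustrative):

Code:
// system/controlDict (excerpt)
adjustTimeStep  yes;
maxCo           0.5;     // target convective Courant number
maxDeltaT       1e-4;    // hard cap on the time step [s]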
Can someone help me understand where the mistake is?

Thanks in advance.

July 7, 2017, 05:15   #2
sheaker (Oskar; Senior Member; Join Date: Nov 2015; Location: Poland; Posts: 184)
I found a topic with a similar problem. I cannot analyse it more deeply for lack of free time, but I hope you will find something useful here:
http://archive.is/58Isy

I wish you success!
sheaker

July 7, 2017, 06:01   #3
decibelle (Member; Join Date: May 2017; Posts: 38)
Thanks for your reply, but I did not find the solution in that thread.

Tags: rhopimpledymfoam

