CFD Online Discussion Forums — OpenFOAM Community Contributions
[swak4Foam] swak4foam-0.3.0 make parallel run crash on foam-extend-3.0
(https://www.cfd-online.com/Forums/openfoam-community-contributions/132416-swak4foam-0-3-0-make-parallel-run-crash-foam-extend-3-0-a.html)

Aleksey_R March 31, 2014 12:09

swak4foam-0.3.0 make parallel run crash on foam-extend-3.0
 
Hello, dear foamers!

I've suddenly encountered the following problem: after installing the latest foam-extend-3.0 and swak4foam-0.3.0, my parallel runs crash. The same case runs fine in serial, and it also runs fine in parallel on OF-1.6-ext with swak4foam 0.2.0. Terminal output of the crash (simpleFoam):

Code:

/*---------------------------------------------------------------------------*\
| =========                |                                                |
| \\      /  F ield        | foam-extend: Open Source CFD                    |
|  \\    /  O peration    | Version:  3.0                                  |
|  \\  /    A nd          | Web:        http://www.extend-project.de      |
|    \\/    M anipulation  |                                                |
\*---------------------------------------------------------------------------*/
Build    : 3.0-8c304619f538
Exec    : simpleFoam -parallel
Date    : Mar 31 2014
Time    : 19:52:42
Host    : chaos2
PID      : 16636
CtrlDict : /home/aleksey/foam/foam-extend-3.0/etc/controlDict
Case    : /home/aleksey/foam/aleksey-3.0/run/trial
nProcs  : 4
Slaves :
3
(
chaos2.16637
chaos2.16638
chaos2.16639
)

Pstream initialized with:
    floatTransfer    : 0
    nProcsSimpleSum  : 0
    commsType        : blocking
SigFpe  : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading field p

Reading field U

Reading/calculating face flux field phi

Selecting incompressible transport model Newtonian
Selecting RAS turbulence model laminar

Starting time loop

Time = 0.1

DILUPBiCG:  Solving for Ux, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG:  Solving for Uy, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG:  Solving for Uz, Initial residual = 1, Final residual = 1.95781538458e-10, No Iterations 11
DICPCG:  Solving for p, Initial residual = 1, Final residual = 9.9567414437e-10, No Iterations 311
DICPCG:  Solving for p, Initial residual = 0.351569807727, Final residual = 9.39034466264e-10, No Iterations 268
DICPCG:  Solving for p, Initial residual = 0.0602889779533, Final residual = 9.35013783622e-10, No Iterations 257
DICPCG:  Solving for p, Initial residual = 0.0177328464787, Final residual = 9.218992822e-10, No Iterations 252
DICPCG:  Solving for p, Initial residual = 0.0056772888986, Final residual = 9.11894751162e-10, No Iterations 246
time step continuity errors : sum local = 4.36977402214e-07, global = 3.01209409715e-10, cumulative = 3.01209409715e-10
[chaos2:16636] *** An error occurred in MPI_Recv
[chaos2:16636] *** on communicator MPI_COMM_WORLD
[chaos2:16636] *** MPI_ERR_TRUNCATE: message truncated
[chaos2:16636] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 16637 on
node chaos2 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[chaos2:16635] 3 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[chaos2:16635] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

I'm using ThirdParty OpenMPI 1.6.5 and ThirdParty gcc 4.6.3. OS: OpenSUSE 13.1, 64-bit.

Dear colleagues, I'd like to ask whether your parallel runs also crash with the new foam-extend and swak4foam.


Best regards,
Aleksey.

gschaider March 31, 2014 19:49

Quote:

Originally Posted by Aleksey_R (Post 483053)

That is really hard to tell with the information you provide:
- What kind of case is this (a standard tutorial, etc.)? Are there any functionObjects or similar additions?
- Does this case use swak? If yes: try removing it (no libs, no functions; see the sketch below) and see if the problem persists.
- Have you tried running a tutorial case (for instance pitzDaily) in parallel on that installation?
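
For the second point, taking swak out of the case usually just means commenting out the swak libraries and any swak function objects in the case's system/controlDict. A minimal sketch only; the library names below are what a typical swak case loads and may differ from yours:

Code:

// in <case>/system/controlDict
libs
(
    // "libgroovyBC.so"             // swak disabled for the test
    // "libswakFunctionObjects.so"  // swak disabled for the test
);

functions
{
    // any swak function objects removed for the test
}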

Aleksey_R April 2, 2014 12:04

Dear Bernhard,

thank you very much for your reply!

Here's the case (a simple 3D tube with a Poiseuille inlet condition):
http://files.mail.ru/E0756960CFBB4DFC82322AC2935C2325

It runs fine in the following cases:
- using groovyBC for U + serial run
- using funkySetBoundaryField + fixedValue conditions for U + serial run
- using funkySetBoundaryField + fixedValue conditions for U + parallel run

It crashes in the following case:
- using groovyBC for U + parallel run

As you can easily see, there are no functionObjects in the presented case.

These results suggest that it is specifically the use of groovyBC that makes the parallel run crash.
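
For context, a groovyBC Poiseuille inlet for U of this kind looks roughly like the sketch below; the patch name, pipe radius R and peak velocity Umax here are illustrative assumptions, not the values from the attached case:

Code:

    inlet
    {
        type            groovyBC;
        // parabolic (Poiseuille) profile over a circular inlet of radius R,
        // with the pipe axis assumed along z
        variables       "R=0.01;Umax=1.0;r=sqrt(pow(pos().x,2)+pow(pos().y,2));";
        valueExpression "vector(0,0,Umax*(1-pow(r/R,2)))";
        value           uniform (0 0 0);
    }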

If you need any additional data, please, let me know.


Best regards,
Aleksey.


Aleksey_R April 5, 2014 09:27

Dear colleagues,

could you please check whether the presented case runs in parallel on your machines (using groovyBC)? I need to know whether this is a bug or whether I simply installed OF and SWAK incorrectly.

Best regards,
Aleksey.

gschaider April 11, 2014 06:57

Quote:

Originally Posted by Aleksey_R (Post 484005)

I can reproduce that. It seems to be a duplicate of http://sourceforge.net/apps/mantisbt...iew.php?id=123

One remark: could you please use a file-sharing service that provides an English translation? Some people feel extremely uncomfortable clicking on things they can't read, especially given the bad reputation these Russian file-sharing sites have for distributing viruses. It doesn't have to be an "endorsed by the NSA" service. Remove the points/faces/etc. files from polyMesh and the case should be small enough to add as an attachment here on the board.

I'll have a look. But as mentioned in the Mantis report: this seems to happen one level below swak.

Aleksey_R April 17, 2014 14:53

1 Attachment(s)
Dear Bernhard,

I'm terribly sorry for the inconvenience. I attach the example run case to this post.

Indeed, changing etc/controlDict helped! Here's a link to the workaround:

http://sourceforge.net/apps/mantisbt...iew.php?id=123

Thank you very much for the advice!

Best regards,
Aleksey.

mjzhao March 12, 2015 05:12

Quote:

Originally Posted by Aleksey_R (Post 486778)

Hi
I have the same problem. I don't understand what "changing of etc/controlDict helped" means, and I can't find any helpful information at the link "http://sourceforge.net/apps/mantisbt...iew.php?id=123".

Aleksey_R April 3, 2015 09:35

Quote:

Originally Posted by mjzhao (Post 535936)

Hello!
It means you need to change the following fragment in $WM_PROJECT_DIR/etc/controlDict from:
Code:

//    commsType      nonBlocking; //scheduled; //blocking;
    commsType      blocking; //scheduled;

to:
Code:

    commsType      nonBlocking; //scheduled; //blocking;
//    commsType      blocking; //scheduled;
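
For orientation, these lines typically sit inside the OptimisationSwitches sub-dictionary of $WM_PROJECT_DIR/etc/controlDict. After the change, the relevant part should look roughly like the sketch below; the surrounding entries are taken from a typical installation and may differ in your copy:

Code:

OptimisationSwitches
{
    fileModificationSkew 10;

    // switch communications from blocking to nonBlocking
    commsType       nonBlocking; //scheduled; //blocking;
    floatTransfer   0;
    nProcsSimpleSum 0;
}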


mjzhao April 3, 2015 09:49

Quote:

Originally Posted by Aleksey_R (Post 539834)

Thank you ~

Scram_1 March 18, 2019 15:57

Hi,
I'm facing a similar problem. With groovyBC, my parallel computation doesn't work.
And I have nonBlocking in my etc/controlDict as well.

Code:

    //- Modification checking:
    //  - timeStamp        : use modification time on file
    //  - inotify          : use inotify framework
    //  - timeStampMaster  : do time stamp (and file reading) only on master.
    //  - inotifyMaster    : do inotify (and file reading) only on master.
    fileModificationChecking timeStampMaster;

    commsType      nonBlocking; //scheduled; //blocking;
    floatTransfer  0;
    nProcsSimpleSum 0;

    // Optional max size (bytes) for unstructured data exchanges. In some
    // phases of OpenFOAM it can send over very large data chunks
    // (e.g. in parallel load balancing) and some Pstream implementations have
    // problems with this. Setting this variable > 0 indicates that the
    // data exchange needs to be done in multiple passes, each of maxCommsSize.
    // This is not switched on by default since it requires an additional
    // global reduction, even if multi-pass is not needed)
    maxCommsSize    0;

I'm using OF v1706. Is there anything else that needs to be changed to get groovyBC working in parallel?

gschaider March 21, 2019 18:58

Quote:

Originally Posted by Scram_1 (Post 728139)


This is a rather old thread. By "doesn't work", do you mean the same behaviour as the original poster? Could you be more specific about the swak version you use? Is this your own case, or have you tried one of the examples?

Scram_1 March 21, 2019 19:46

Hi Bernhard!
Thanks for replying!
I'm using swak 0.4.2. I'm running a case where I have air as the bulk fluid and I'm injecting water droplets. So essentially I have two fluids in my domain once the water evaporates and becomes "part" of the bulk fluid. I have a Neumann BC at one of the patches.

Here is my groovyBC:
Code:

    BOTTOM
    {
        type                groovyBC;
        variables           "Dmuc=4.3e-6;Dmem=1.4e-6;tmem=1e-5;kow=200;liq{BOTTOM}=H2O;";
        valueExpression     "liq";
        gradientExpression  "-(Dmem*kow/(Dmuc*tmem))*liq*1e-3";
        fractionExpression  "0";
        value               uniform 0;
        evaluateDuringConstruction 1;
        timelines       ( );
        lookuptables    ( );
        lookuptables2D  ( );
    }

This is the error I'm getting when I have 1-air as one of the variables in groovyBC.
Code:

--> FOAM FATAL ERROR:
 Parser Error for driver PatchValueExpressionDriver at "1.3-5" :"field air not existing or of wrong type"
"1-air"
    ^^^
----| 

Context of the error:


- Driver constructed from scratch
  Evaluating expression "1-air"


    From function parsingValue
    in file lnInclude/CommonValueExpressionDriverI.H at line 1246.

FOAM exiting

When I change it to H2O, I get this error:
Code:

--> FOAM FATAL ERROR:
hanging pointer at index 1 (size 5), cannot dereference

    From function const T& Foam::UPtrList<T>::operator[](Foam::label) const [with T = Foam::fvPatchField<double>; Foam::label = int]
    in file /usr/local/apps/OpenFOAM/gcc482-v1706/OpenFOAM-v1706/src/OpenFOAM/lnInclude/UPtrListI.H at line 107.

FOAM aborting

#0  Foam::error::printStack(Foam::Ostream&) at ??:?
#1  Foam::error::abort() at ??:?
#2  Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>::Patch const& Foam::fvPatch::patchField<Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>, double>(Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&) const at ??:?
#3  Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>::Patch const& Foam::fvPatch::lookupPatchField<Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>, double>(Foam::word const&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const*, double const*) const at ??:?
#4  Foam::tmp<Foam::Field<double> > Foam::PatchValueExpressionDriver::getField<double>(Foam::word const&) at ??:?
#5  parserPatch::PatchValueExpressionParser::parse() at ??:?
#6  Foam::PatchValueExpressionDriver::parseInternal(int) at ??:?
#7  Foam::CommonValueExpressionDriver::parse(Foam::exprString const&, Foam::word const&) at ??:?
#8  Foam::CommonValueExpressionDriver::evaluateVariableRemote(Foam::exprString const&, Foam::word const&, Foam::exprString const&) at ??:?
#9  Foam::CommonValueExpressionDriver::addVariables(Foam::exprString const&, bool) at ??:?
#10  Foam::CommonValueExpressionDriver::addVariables(Foam::List<Foam::exprString> const&, bool) at ??:?
#11  Foam::CommonValueExpressionDriver::clearVariables() at ??:?
#12  Foam::groovyBCFvPatchField<double>::updateCoeffs() at ??:?
#13  Foam::mixedFvPatchField<double>::evaluate(Foam::UPstream::commsTypes) at ??:?
#14  Foam::groovyBCFvPatchField<double>::groovyBCFvPatchField(Foam::fvPatch const&, Foam::DimensionedField<double, Foam::volMesh> const&, Foam::dictionary const&) at ??:?
#15  Foam::fvPatchField<double>::adddictionaryConstructorToTable<Foam::groovyBCFvPatchField<double> >::New(Foam::fvPatch const&, Foam::DimensionedField<double, Foam::volMesh> const&, Foam::dictionary const&) at ??:?
#16  Foam::fvPatchField<double>::New(Foam::fvPatch const&, Foam::DimensionedField<double, Foam::volMesh> const&, Foam::dictionary const&) at ??:?
#17  Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>::Boundary::readField(Foam::DimensionedField<double, Foam::volMesh> const&, Foam::dictionary const&) at ??:?
#18  Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>::readFields(Foam::dictionary const&) at ??:?
#19  Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>::readFields() at ??:?
#20  Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>::GeometricField(Foam::IOobject const&, Foam::fvMesh const&, bool) at ??:?
#21  ? at ??:?
#22  ? at ??:?
#23  __libc_start_main in "/lib64/libc.so.6"
#24  ? at ??:?

I started tinkering with the groovyBC options and realized that setting evaluateDuringConstruction to 0 solved the hanging-pointer error. So now my variable is liq{BOTTOM}=H2O. The case is chugging along now, but I don't know if the results will be accurate; I'll only know once the simulation ends and I post-process the results.

gschaider March 22, 2019 07:57

Quote:

Originally Posted by Scram_1 (Post 728518)


evaluateDuringConstruction was introduced for exactly that reason. Because fields are constructed one after another, it can't be guaranteed that a peer field already exists when the boundary condition is evaluated for the first time (during construction). Usually it is sufficient if boundary conditions are evaluated whenever the field is solved for. It is possible that the "other" field has a wrong value during the first time step because it has not been evaluated yet; for such cases, set a reasonable value for the "value" entry of the other field.
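
Applied to the BOTTOM patch posted above, that advice amounts to something like the following sketch; the fallback value of 0 is an assumption and should be replaced by whatever is physically reasonable for the field until H2O has been solved for once:

Code:

    BOTTOM
    {
        type            groovyBC;
        variables       "Dmuc=4.3e-6;Dmem=1.4e-6;tmem=1e-5;kow=200;liq{BOTTOM}=H2O;";
        valueExpression "liq";
        gradientExpression "-(Dmem*kow/(Dmuc*tmem))*liq*1e-3";
        fractionExpression "0";
        // do not evaluate while the fields are still being constructed,
        // so the remote H2O field does not have to exist yet
        evaluateDuringConstruction 0;
        // fallback used until the boundary condition is first evaluated
        value           uniform 0;
    }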

