CFD Online Discussion Forums

CFD Online Discussion Forums (http://www.cfd-online.com/Forums/)
-   OpenFOAM Bugs (http://www.cfd-online.com/Forums/openfoam-bugs/)
-   -   Bug in groovyBC in parallel computation (http://www.cfd-online.com/Forums/openfoam-bugs/97200-bug-groovybc-parallel-computation.html)

Aleksey_R February 11, 2012 08:15

Bug in groovyBC in parallel computation
 
1 Attachment(s)
Greetings, dear Foamers.

I found a bug which seems to be connected with groovyBC. If the "max" and "min" functions are used in groovyBC expressions, parallel runs of simpleFoam and pisoFoam fail.

This bug is specific to OF-1.6-ext. I tried the same case in OF-2.1.0 (with some changes specific to the 2.1.0 version) and everything worked OK.

I attach an example case. simpleFoam runs it fine on a single core but fails when run in parallel.
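For illustration, the failing expressions are of this general shape (a sketch only; the patch name, field, and numbers are made up and not taken from the attached case, assuming swak4Foam's reduction form of max over a field):

Code:

inlet
{
    type            groovyBC;
    // a reduction like max(...) over a parallel-decomposed field is
    // the kind of expression that triggers the failure in 1.6-ext
    valueExpression "-normal()*max(mag(U))";
    value           uniform (0 0 0);
}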

Best regards, Aleksey.

wyldckat February 11, 2012 09:51

Greetings Aleksey,

Bernhard might not fix this if it's not reported on the dedicated bug tracker for swak4Foam: http://sourceforge.net/apps/mantisbt...?project_id=10 ;)

Make sure that you pick swak4Foam as the project you report to, since it is embedded in Extend's main project.

On another note... I think I saw swak4Foam being integrated directly into 1.6-ext's git repo... have you seen it, or are you using that version?

Best regards,
Bruno

Aleksey_R February 11, 2012 10:30

Thank you for your reply, Bruno.

I've submitted bug #123:
https://sourceforge.net/apps/mantisb...iew.php?id=123

I downloaded swak4Foam with:
svn checkout https://openfoam-extend.svn.sourcefo...ies/swak4Foam/

I downloaded OF-1.6-ext with:
git clone git://openfoam-extend.git.sourceforge.net/gitroot/openfoam-extend/OpenFOAM-1.6-ext
and it contained neither swak4Foam nor groovyBC.


Best regards, Aleksey.

wyldckat February 11, 2012 10:41

Hi Aleksey,
Quote:

Originally Posted by Aleksey_R (Post 343928)
OF-1.6-ext I downloaded by means of:
git clone git://openfoam-extend.git.sourceforge.net/gitroot/openfoam-extend/OpenFOAM-1.6-ext
and there were neither swak4foam nor groovyBC.

Ah, now I remember, it's on a branch, apparently pending approval. Try this:
Code:

git merge origin/bgschaid/feature/swak4Foam
Then:
Code:

wmake all src/swak4Foam
wmake all applications/utilities/swak4Foam

Best regards,
Bruno

Aleksey_R February 11, 2012 11:20

I've checked the attached case using the branched swak4Foam. Same behaviour: simpleFoam works OK on a single core and fails in a parallel run.

Best regards, Aleksey.

philippose February 11, 2012 15:52

Hi and a Good Evening :-)!

The issue you are seeing is to do with the OpenFOAM-1.6-ext version of OpenFOAM.

I posted this as a bug report quite a while ago under the groovyBC topic, but after discussing it with Bernhard, it turned out to be an issue with the "-ext" version of OpenFOAM... specifically, to do with "pTraits".

However, can you give me some more details regarding the crash you are having with simpleFoam? I use groovyBC with simpleFoam for parallel simulations quite often. Though I get a warning that the "min" or "max" functions return a zero and that the average will be taken (can't remember the exact warning), the simulations work out fine, and the results are also good.


By the way..... I am using the latest version of OpenFOAM-1.6-ext from the Git repository.

Have a nice day!

Philippose

Aleksey_R February 12, 2012 08:26

2 Attachment(s)
Thank you for your reply.

Honestly, the logs are not very informative. I attach the log and the terminal output.

By the way, does the case attached in this thread run on your system?

Best regards, Aleksey.

philippose February 12, 2012 14:41

Hi again,

I just tried the case you posted in this thread with OpenFOAM-1.6-ext, and you are right... it does not work.

Initially I thought it might be because the decomposition also cut up the patches into different domains... but I don't think that is the case. At least, the decomposition I got from "decomposePar" had all the patches intact in one of the two parts.

I hope this post catches Bernhard's attention..... He may have something more to say about the issue. I think he is already aware of it.

Sorry I could not help further....

Have a nice day ahead :-)!

Philippose

gschaider February 13, 2012 20:01

Quote:

Originally Posted by wyldckat (Post 343930)
Hi Aleksey,

Ah, now I remember, it's on a branch, apparently pending approval. Try this:
Code:

git merge origin/bgschaid/feature/swak4Foam
Then:
Code:

wmake all src/swak4Foam
wmake all applications/utilities/swak4Foam

Best regards,
Bruno

There are good reasons why this branch is currently not merged into the default branch (I just put it there for discussion). The swak sources there don't differ from regular swak, so this wouldn't fix the problem discussed.

gschaider February 13, 2012 20:04

Quote:

Originally Posted by philippose (Post 344030)
Hi again,

I just tried the case you posted in this thread with OpenFOAM-1.6-ext, and you are right... it does not work.

Initially I thought it might be because the decomposition also cut up the patches into different domains... but I don't think that is the case. At least, the decomposition I got from "decomposePar" had all the patches intact in one of the two parts.

I hope this post catches Bernhard's attention..... He may have something more to say about the issue. I think he is already aware of it.

Sorry I could not help further....

Have a nice day ahead :-)!

Philippose

I had a quick look at it and can reproduce the problem. Further communication on this will go through the bug report on Mantis (see the link above).

toolpost September 13, 2012 02:40

Hi !

I am getting the same error with the simpleFoam solver of OpenFOAM-1.6-ext with the parabolicVelocity boundary condition. My case is a simple laminar pipe flow with a constriction, and I am trying to run it on an 8-core CPU. The case was tested on single and multiple cores with the surfaceNormalFixedValue BC and works. But with the parabolicVelocity boundary condition, the normal run without decomposing works fine, while decomposing it and running in parallel ends with errors similar to those mentioned above.

Time = 1

DILUPBiCG: Solving for Ux, Initial residual = 1, Final residual = 0.0105602, No Iterations 2
DILUPBiCG: Solving for Uy, Initial residual = 1, Final residual = 0.0903589, No Iterations 1
DILUPBiCG: Solving for Uz, Initial residual = 1, Final residual = 0.0875971, No Iterations 1
[entropy:2277] *** An error occurred in MPI_Recv
[entropy:2277] *** on communicator MPI_COMM_WORLD
[entropy:2277] *** MPI_ERR_TRUNCATE: message truncated
[entropy:2277] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 2277 on
node entropy exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[entropy:02276] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[entropy:02276] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

I have swak4Foam installed, but it is not used in this particular case. So I think this error is not related to groovyBC.
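For reference, the inlet entry in 0/U looks roughly like this (a sketch with placeholder values; parameter names as in the commonly used 1.6-ext parabolicVelocity BC):

Code:

inlet
{
    type            parabolicVelocity;
    maxValue        1;          // peak velocity of the profile
    n               (0 0 1);    // flow direction
    y               (0 1 0);    // coordinate across the patch
    value           uniform (0 0 0);
}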

Any advice would be greatly appreciated... and thanks a lot for your time.

Regards
Jabir

gschaider September 13, 2012 04:36

Quote:

Originally Posted by toolpost (Post 381526)
Hi !

I am getting the same error with the simpleFoam solver of OpenFOAM-1.6-ext with the parabolicVelocity boundary condition. My case is a simple laminar pipe flow with a constriction, and I am trying to run it on an 8-core CPU. The case was tested on single and multiple cores with the surfaceNormalFixedValue BC and works. But with the parabolicVelocity boundary condition, the normal run without decomposing works fine, while decomposing it and running in parallel ends with errors similar to those mentioned above.

Time = 1

DILUPBiCG: Solving for Ux, Initial residual = 1, Final residual = 0.0105602, No Iterations 2
DILUPBiCG: Solving for Uy, Initial residual = 1, Final residual = 0.0903589, No Iterations 1
DILUPBiCG: Solving for Uz, Initial residual = 1, Final residual = 0.0875971, No Iterations 1
[entropy:2277] *** An error occurred in MPI_Recv
[entropy:2277] *** on communicator MPI_COMM_WORLD
[entropy:2277] *** MPI_ERR_TRUNCATE: message truncated
[entropy:2277] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 2277 on
node entropy exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[entropy:02276] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[entropy:02276] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

I have swak4Foam installed, but it is not used in this particular case. So I think this error is not related to groovyBC.

Any advice would be greatly appreciated... and thanks a lot for your time.

Regards
Jabir

To be sure, remove the swak stuff from the libs entry (then I can plead "I'm not to blame").

I don't think it can be the parabolic BC, as the velocity has already been solved successfully. The problem seems to occur during the pressure solution. Are you using any AMG-type solver for that? Just for testing, replace it with one of the CG solvers.

Other than that, in my experience that kind of error usually occurs because of an inconsistently compiled OF version (for instance, after an update where you chose to recompile only some libraries). It sounds like snake oil, but sometimes a complete recompilation helps.

toolpost September 13, 2012 07:15

Thanks for your quick response..

Quote:

To be sure remove the swak-stuff from the libs entry (then I can plead "I'm not to blame")
Previously, I hadn't included the libs in my controlDict file, so I thought there was no connection between this error and swak4Foam, right? Anyway, I have now completely removed swak4Foam and the same error is still there. So I think that is good news for you. :)

Quote:

Are you using any AMG-type solver for that. Just for testing replace it with one of the CG-solvers.
No. I am not using AMG.

Code:

solvers
{
    p
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance        1e-06;
        relTol          0.01;
    };
    U
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance        1e-05;
        relTol          0.1;
    };
}

So, what should I do next? Do you think a complete re-compilation/installation will solve this issue?

Thanks a lot for your time.

Regards,
Jabir

gschaider September 13, 2012 10:56

Quote:

Originally Posted by toolpost (Post 381589)
Thanks for your quick response..

Previously, I hadn't included the libs in my controlDict file, so I thought there was no connection between this error and swak4Foam, right? Anyway, I have now completely removed swak4Foam and the same error is still there. So I think that is good news for you. :)

No. I am not using AMG.

Code:

solvers
{
    p
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance        1e-06;
        relTol          0.01;
    };
    U
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance        1e-05;
        relTol          0.1;
    };
}

So, what should I do next? Do you think a complete re-compilation/installation will solve this issue?

Thanks a lot for your time.

Regards,
Jabir

I'm afraid recompilation will not help.

You mentioned the parabolic BC before. That is the last thing I can think of. Replace it with a normal fixedValue. If the simulation then runs OK, then the min/max in parallel is probably the problem.
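For the test, the inlet entry in 0/U would just become something like this (the velocity value is a placeholder):

Code:

inlet
{
    type            fixedValue;
    value           uniform (0 0 1);
}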

toolpost September 13, 2012 23:01

Good morning!

Quote:

Originally Posted by gschaider (Post 381649)

You mentioned the parabolic-BC before. That is the last thing I can think of. Replace that with a normal fixedValue. If the simulation then runs OK then probably the min/max in parallel is the problem

I already tested the case with "surfaceNormalFixedValue" and it works fine; I mentioned that in my first post. The "parabolicVelocity" case runs on a single core but fails in parallel.

Thanks for your time and consideration.

Regards,
Jabir

wyldckat September 15, 2012 09:05

Greetings to all!

@Jabir: Can you provide an example based on one of the tutorial cases? This way we can more easily try and replicate the problem you're getting.

Best regards,
Bruno

toolpost September 15, 2012 11:45

1 Attachment(s)
Good evening!

Quote:

Originally Posted by wyldckat (Post 381914)
Can you provide an example based on one of the tutorial cases?

Yes. I tried the pitzDaily case with a laminar flow assumption, and the same errors appeared again. The case files are attached. Please have a look at the 0/U file. Both inlet BCs are fine in a single-core run, but in a parallel run the parabolicVelocity one crashes as described above.

Thanks for your help.

Regards,
Jabir

wyldckat September 16, 2012 04:22

Hi Jabir,

OK, only now did I come to the conclusion that the problem is with the "parabolicVelocity" BC itself, not groovyBC :rolleyes:
Therefore, I think this should be reported at http://sourceforge.net/apps/mantisbt...?project_id=11, namely project "OpenFOAM-ext release", along with the test case you provided!

As for a quick fix, have a look at the tutorial "incompressible/simpleFoam/pitzDailyExptInlet": in the folder "constant/boundaryData/inlet/" you can see that the values at the "inlet" are defined for each point on the patch.
It may be a bit annoying having to define them manually or with the help of another utility, but for now this would be the quickest solution.
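If I remember that tutorial correctly, the 0/U inlet entry then looks something like the sketch below (check the tutorial itself; the setAverage keyword is from memory), with the per-point values listed under constant/boundaryData/inlet/:

Code:

inlet
{
    type            timeVaryingMappedFixedValue;
    setAverage      off;
}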

Best regards,
Bruno

toolpost September 17, 2012 23:40

Good morning Bruno!

Thanks a lot for your help and suggestions. So the problem comes from the parabolicVelocity BC, and not from groovyBC. I shall post a bug report at the link you provided.

Quote:

As for a quick fix, have a look at the tutorial "incompressible/simpleFoam/pitzDailyExptInlet", where you can find in the folder "constant/boundaryData/inlet/" that the values at the "inlet" are being defined for each point on the patch.
It may be a bit annoying having to define them manually or with the help of another utility, but for now this would be the somewhat-quickest solution.
I have checked it. The procedure of specifying individual velocities at each point looks tedious, but as you said it would be the quickest solution. I am going to give it a try.

Thanks again for your time and help.

Regards,
Jabir

gschaider September 18, 2012 07:50

Quote:

Originally Posted by toolpost (Post 382250)
Good morning Bruno!

Thanks a lot for your help and suggestions. So, the problem comes from parabolicVelocity bc, and not from groovyBC. I shall post a bug report in the link you provided.

I have checked it. The procedure of specifying individual velocities at each points looks tedious, but as you said it would be the quickest solution. I am going to give it a try.

Thanks again for your time and help.

Regards,
Jabir

Something that might make it less tedious is funkySetBoundaryFields. It allows you to set boundary fields statically with an expression (not only the value, but everything that accepts a uniform/nonuniform value, like an inletValue). Basically, the idea would be to use it to set the parabolic inlet on a fixedValue BC in the serial case and then decompose it (this would avoid the unfortunate min/max bug).

There is an example-dictionary that comes with swak4Foam
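From memory, a funkySetBoundaryFields dictionary entry for a parabolic inlet would look roughly like this; treat the structure and keywords as assumptions and check them against the example dictionary mentioned above (all names and numbers are placeholders):

Code:

velocityInlet
{
    field U;
    expressions
    (
        {
            target      value;
            patchName   inlet;
            // parabolic profile: peak 1 m/s, patch half-width 0.5,
            // centred at y = 0
            expression  "vector(0,0,1)*(1-pow(pos().y/0.5,2))";
        }
    );
}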

