CFD Online Discussion Forums

CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   InterTrackFoam parallel issue (https://www.cfd-online.com/Forums/openfoam-solving/150762-intertrackfoam-parallel-issue.html)

zbli March 28, 2015 23:14

InterTrackFoam parallel issue
 
Hey guys,

I've been working on this for several weeks, but have nothing to show for it.
I've already browsed the related post: InterTrackFoam any information
The problem is that I cannot split the domain while leaving the freesurface patch whole on the master processor.
I see lots of interTrackFoam users have worked this out, so I hope you guys can help me a little. This problem is really a pain in the neck!
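For reference, this is the kind of decomposeParDict I've been experimenting with (the patch name and the `preservePatches` entry are my guesses for the wavehump tutorial, so treat this as a sketch, not a known-good setup):

```
// system/decomposeParDict -- sketch; patch and method names are assumptions
numberOfSubdomains  8;

// try to keep the free-surface patch faces together, if the version supports it
preservePatches     ( freeSurface );

method              metis;
```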

Thanks

zbli March 29, 2015 00:46

Quote:

Originally Posted by zbli (Post 538780)
Hey guys,

I've been working on this for several weeks, but have nothing to show for it.
I've already browsed the related post: InterTrackFoam any information
The problem is that I cannot split the domain while leaving the freesurface patch whole on the master processor.
I see lots of interTrackFoam users have worked this out, so I hope you guys can help me a little. This problem is really a pain in the neck!

Thanks

This is what I ran into when I typed in "mpirun -np 8 interTrackFoam -parallel":

nProcs : 8
Slaves :
7
(
w002.5303
w002.5304
w002.5305
w002.5306
w002.5307
w002.5308
w002.5309
)

Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : blocking
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create dynamic mesh for time = 0

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: laplace
Selecting motion diffusivity: uniform

Reading field p

Reading field U

Reading/calculating face flux field phi

Found free surface patch. ID: 3
[0] [1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] cannot open file
[1]
[1] file: /home/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump/processor1/0/fluidIndicator at line 0.
[1]
[1] From function regIOobject::readStream()
[1] in file db/regIOobject/regIOobjectRead.C at line 61.
[1]
FOAM parallel run exiting
[2]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] cannot open file
[2]
[2] file: /home/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump/processor2/0/fluidIndicator at line 0.
[2]
[2] From function regIOobject::readStream()
[2] in file db/regIOobject/regIOobjectRead.C at line 61.
[2]
FOAM parallel run exiting
[2]
[3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] cannot open file
[3]
[3] file: /home/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump/processor3/0/fluidIndicator at line 0.
[3]
[3] From function regIOobject::readStream()
[3] in file db/regIOobject/regIOobjectRead.C at line 61.
[3]
FOAM parallel run exiting
[3]
[4]
[4]
[4] --> FOAM FATAL IO ERROR:
[4] cannot open file
[4]
[4] file: /home/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump/processor4/0/fluidIndicator at line 0.
[4]
[4] From function regIOobject::readStream()
[4] in file db/regIOobject/regIOobjectRead.C at line 61.
[4]
FOAM parallel run exiting
[4]
[5]
[5]
[5] --> FOAM FATAL IO ERROR:
[5] cannot open file
[5]
[5] file: /home/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump/processor5/0/fluidIndicator at line 0.
[5]
[5] From function regIOobject::readStream()
[5] in file db/regIOobject/regIOobjectRead.C at line 61.
[5]
FOAM parallel run exiting
[5]
[1]
[6]
[6]
[6] --> FOAM FATAL IO ERROR:
[6] cannot open file
[6]
[6] file: /home/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump/processor6/0/fluidIndicator at line 0.
[6]
[6] From function regIOobject::readStream()
[6] in file db/regIOobject/regIOobjectRead.C at line 61.
[6]
FOAM parallel run exiting
[6]

[0]
[0] --> FOAM FATAL IO ERROR:
[0] cannot open file
[0]
[0] file: /home/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump/processor0/0/fluidIndicator at line 0.
[0]
[0] From function regIOobject::readStream()
[0] in file db/regIOobject/regIOobjectRead.C at line 61.
[0]
FOAM parallel run exiting
[0]
[7]
[7]
[7] --> FOAM FATAL IO ERROR:
[7] cannot open file
[7]
[7] file: /home/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump/processor7/0/fluidIndicator at line 0.
[7]
[7] From function regIOobject::readStream()
[7] in file db/regIOobject/regIOobjectRead.C at line 61.
[7]
FOAM parallel run exiting
[7]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 6 with PID 5308 on
node w002 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[w002:05301] 7 more processes have sent help message help-mpi-api.txt / mpi-abort
[w002:05301] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

zbli March 30, 2015 02:40

Dear all

I've tackled the manual decomposition problem with funkySetFields. I have to say this utility is quite handy and powerful. You just write an expression that assigns every cell to a processor, run the funkySetFields command to dump the result into a data file, change it a little, and you're done.
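In case it helps someone, this is roughly what I mean (the field name and the expression are made up for illustration; adapt them to your geometry):

```
# write a processor-assignment field from an expression (example expression)
funkySetFields -time 0 -field procAssignment \
    -expression "pos().x < 0.05 ? 0 : 1" -create

# then point system/decomposeParDict at the converted data:
#   method        manual;
#   manualCoeffs  { dataFile "cellDecomposition"; }
```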

Unfortunately, interTrackFoam still won't run in parallel. Here is the error message:
Code:

// ***********************************************************************//
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | foam-extend: Open Source CFD                    |
|  \\    /   O peration     | Version:     3.0                                |
|   \\  /    A nd           | Web:         http://www.extend-project.de       |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build    : 3.0-7a7445ead09d
Exec    : interTrackFoam -parallel
Date    : Mar 30 2015
Time    : 14:25:58
Host    : w002
PID      : 26609
CtrlDict : /home/##/OpenFOAM/foam-extend-3.0/etc/controlDict
Case    : /home/##/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump
nProcs  : 2
Slaves :
1
(
w002.26610
)

Pstream initialized with:
    floatTransfer    : 0
    nProcsSimpleSum  : 0
    commsType        : blocking
SigFpe  : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create dynamic mesh for time = 0

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: laplace
Selecting motion diffusivity: uniform

Reading field p

Reading field U

Reading/calculating face flux field phi

Found free surface patch. ID: 3
[1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] cannot open file
[1]
[1] file: /home/##/OpenFOAM/foam-extend-3.0/tutorials/surfaceTracking/interTrackFoam/wavehump/processor1/0/fluidIndicator at line 0.
[1]
[1]    From function regIOobject::readStream()
[1]    in file db/regIOobject/regIOobjectRead.C at line 61.
[1]
FOAM parallel run exiting
[1]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 26610 on
node w002 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------

So... still the 'fluidIndicator'. I barely even know him.

zbli March 30, 2015 04:17

[resolved]
I'm so glad I finally solved this problem. Now the case is running just fine.
This post helped me out: I configured an alpha file as described there and renamed it to 'fluidIndicator'.
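For anyone landing here later, the file I ended up with in each processor*/0 folder looks roughly like this (the boundary types are from my case, so treat it as a sketch rather than the canonical file):

```
// 0/fluidIndicator -- sketch of the renamed alpha-style field
FoamFile
{
    version     2.0;
    format      ascii;
    class       volScalarField;
    object      fluidIndicator;
}

dimensions      [0 0 0 0 0 0 0];

internalField   uniform 1;    // 1 in the denser phase, 0 elsewhere

boundaryField
{
    ".*"                      // catch-all; replace with your patch names
    {
        type    zeroGradient;
    }
}
```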

Chia May 28, 2015 05:55

Hi Jason,

sorry for the late answer, but maybe this will help in the future.
There is a utility in the surfaceTracking folder called setFluidIndicator; you just need to compile it and then run setFluidIndicator before decomposePar in your case folder.

The fluidIndicator field is set to 1 for the denser phase and 0 elsewhere. The solver treats fluidIndicator differently in serial and parallel runs: in serial, it is generated automatically if it is not present; in parallel, it must be present in each processor*/0 folder.
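In script form, the sequence would be roughly as follows (the utility's location within the surfaceTracking sources is a guess, so adjust the path to your foam-extend install):

```
# compile the utility once (path within the sources may vary)
cd $WM_PROJECT_DIR/applications/utilities/surfaceTracking/setFluidIndicator
wmake

# then, in the case directory:
setFluidIndicator        # writes 0/fluidIndicator
decomposePar             # the field is now decomposed with the rest
mpirun -np 8 interTrackFoam -parallel
```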

Best,

Chiara

jasonchen March 9, 2016 09:57

Hello there,

I'm trying to run interTrackFoam in parallel. Could you please detail how to manually decompose using funkySetFields? Thank you.

"I've tackled the manual decomposition problem with funkySetFields. I have to say this utility is quite handy and powerful. You just write an expression that assigns every cell to a processor, run the funkySetFields command to dump the result into a data file, change it a little, and you're done."

