
Is it possible to use fsiFoam in parallel?


January 2, 2016, 16:24   #1
Wojciech Gołąbek (Woj3x)
New Member
Join Date: Dec 2013
Posts: 29
Hello,
Is it possible to use fsiFoam in parallel?
I tried to do this with the base case beamInCrossFlow, but it failed.

What modifications did I make to the case?
I modified the Allrun file like this:
Code:
#!/bin/sh
# Source tutorial run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions

# Get application name
application=`getApplication`

# Create the solid-region mesh and convert the face/cell sets to zones
runApplication -l log.blockMesh.solid blockMesh -region solid
runApplication -l log.setSet.solid setSet -case ../solid -batch ../solid/setBatch
runApplication -l log.setToZones.solid setsToZones -case ../solid -noFlipMap

# Create the fluid-region mesh and zones, then decompose the case
runApplication blockMesh
runApplication setSet -batch setBatch
runApplication setsToZones -noFlipMap
runApplication decomposeParFsi

cd ..
./makeLinks fluid solid
cd fluid

# Build the setInletVelocity library
wmake libso ../setInletVelocity

runParallel $application 2

# ----------------------------------------------------------------- end-of-file
In decomposeParDict I changed:
numberOfSubdomains 2
n (2 1 1);
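For context, the relevant part of the dictionary now looks roughly like this (a minimal sketch assuming the simple decomposition method; the delta entry is just the usual default, not something I changed):
Code:
numberOfSubdomains 2;

method          simple;

simpleCoeffs
{
    n               (2 1 1);
    delta           0.001;
}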

Then I ran the case in the standard way:
Code:
sed -i s/tcsh/sh/g *Links 
./removeSerialLinks fluid solid 
./makeSerialLinks fluid solid 
cd fluid 
./Allclean 
./Allrun
I received the following information in log.fsiFoam:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | foam-extend: Open Source CFD                    |
|  \\    /   O peration     | Version:     3.1                                |
|   \\  /    A nd           | Web:         http://www.extend-project.de       |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build    : 3.1-1dd681f6e943
Exec     : fsiFoam -parallel
Date     : Jan 02 2016
Time     : 20:55:20
Host     : FOX-MS-7816
PID      : 9059
CtrlDict : /home/wojciech/foam/foam-extend-3.1/etc/controlDict
Case     : /home/wojciech/FluidStructureInteraction/pararelTest/beamInCrossFlow/fluid
nProcs   : 2
Slaves : 
1
(
FOX-MS-7816.9060
)

Pstream initialized with:
    floatTransfer     : 0
    nProcsSimpleSum   : 0
    commsType         : blocking
SigFpe   : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create dynamic mesh for time = 0

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: velocityLaplacian
Selecting motion diffusion: quadratic
Selecting motion diffusion: inverseDistance
 Reading stress mesh
[0] [1] 
[1] 
[1] --> FOAM FATAL ERROR: 
[0] 
[0] --> FOAM FATAL ERROR: 
[0] Cannot find file "points" in directory "constant/solid/polyMesh"
[0] 
[0]     From function Time::findInstance(const fileName&, const word&, const IOobject::readOption)
[0]     in file db/Time/findInstance.C at line 148
[1] Cannot find file "points" in directory "constant/solid/polyMesh"
[1] 
[1]     From function Time::findInstance(const fileName&, const word&, const IOobject::readOption)
[1]     in file db/Time/findInstance.C at line 148.
[1] 
FOAM parallel run exiting
[1] 
.
[0] 
FOAM parallel run exiting
[0] 
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 9060 on
node FOX-MS-7816 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[FOX-MS-7816:09058] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[FOX-MS-7816:09058] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
I also found this information in log.makeLinks:
Code:
./Allrun: 55: ./Allrun: makeLinks: not found
How can I fix this?
I'm still a new Linux user, so I suppose I made a mistake somewhere. Thank you in advance for your help.

Last edited by Woj3x; January 3, 2016 at 07:31.

June 8, 2016, 05:36   #2
Vaze (mvee)
Senior Member
Join Date: Jun 2009
Posts: 147
Did you find any solution?

I am facing a similar problem.

June 8, 2016, 06:13   #3
Wojciech Gołąbek (Woj3x)
New Member
Join Date: Dec 2013
Posts: 29
Unfortunately, I didn't find any information on how to use fsiFoam in parallel.

It is probably necessary to modify the source code or wait for a new version.