Is it possible to use fsiFoam in parallel?

#1 | January 2, 2016, 15:24
Wojciech Gołąbek (Woj3x), New Member, Join Date: Dec 2013, Posts: 29
Hello,
Is it possible to use fsiFoam in parallel?
I tried it in the base case beamInCrossFlow, but it failed.

What modifications did I make to the case?
I modified the Allrun file like this:
Code:
#!/bin/sh
# Source tutorial run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions

# Get application name
application=`getApplication`

runApplication -l log.blockMesh.solid blockMesh -region solid
runApplication -l log.setSet.solid setSet -case ../solid -batch ../solid/setBatch
runApplication -l log.setToZones.solid setsToZones -case ../solid -noFlipMap

runApplication blockMesh
runApplication setSet -batch setBatch
runApplication setsToZones -noFlipMap
runApplication decomposeParFsi

cd ..
./makeLinks fluid solid
cd fluid

# Build hronTurekReport function object
wmake libso ../setInletVelocity

runParallel $application 2

# ----------------------------------------------------------------- end-of-file
In decomposeParDict I changed:
numberOfSubdomains 2
n (2 1 1);
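For reference, a minimal decomposeParDict consistent with those two settings would look something like the sketch below (assuming the simple decomposition method; all entries other than the two values above are typical defaults, not taken from my case):
Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains 2;

method          simple;

simpleCoeffs
{
    n               (2 1 1);
    delta           0.001;
}

distributed     no;

roots           ();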

Then I ran the case in the standard way:
Code:
sed -i s/tcsh/sh/g *Links 
./removeSerialLinks fluid solid 
./makeSerialLinks fluid solid 
cd fluid 
./Allclean 
./Allrun
I received the following information in log.fsiFoam:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | foam-extend: Open Source CFD                    |
|  \\    /   O peration     | Version:     3.1                                |
|   \\  /    A nd           | Web:         http://www.extend-project.de       |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build    : 3.1-1dd681f6e943
Exec     : fsiFoam -parallel
Date     : Jan 02 2016
Time     : 20:55:20
Host     : FOX-MS-7816
PID      : 9059
CtrlDict : /home/wojciech/foam/foam-extend-3.1/etc/controlDict
Case     : /home/wojciech/FluidStructureInteraction/pararelTest/beamInCrossFlow/fluid
nProcs   : 2
Slaves : 
1
(
FOX-MS-7816.9060
)

Pstream initialized with:
    floatTransfer     : 0
    nProcsSimpleSum   : 0
    commsType         : blocking
SigFpe   : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create dynamic mesh for time = 0

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: velocityLaplacian
Selecting motion diffusion: quadratic
Selecting motion diffusion: inverseDistance
 Reading stress mesh
[0] [1] 
[1] 
[1] --> FOAM FATAL ERROR: 
[0] 
[0] --> FOAM FATAL ERROR: 
[0] Cannot find file "points" in directory "constant/solid/polyMesh"
[0] 
[0]     From function Time::findInstance(const fileName&, const word&, const IOobject::readOption)
[0]     in file db/Time/findInstance.C at line 148
[1] Cannot find file "points" in directory "constant/solid/polyMesh"
[1] 
[1]     From function Time::findInstance(const fileName&, const word&, const IOobject::readOption)
[1]     in file db/Time/findInstance.C at line 148.
[1] 
FOAM parallel run exiting
[1] 
.
[0] 
FOAM parallel run exiting
[0] 
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 9060 on
node FOX-MS-7816 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[FOX-MS-7816:09058] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[FOX-MS-7816:09058] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
I also found this information in log.makeLinks:
Code:
./Allrun: 55: ./Allrun: makeLinks: not found
How can I fix this?
I'm still new to Linux, so I suppose I made a mistake somewhere. Thank you in advance for your help.

Last edited by Woj3x; January 3, 2016 at 06:31.

#2 | June 8, 2016, 05:36
Vaze (mvee), Senior Member, Join Date: Jun 2009, Posts: 172
Did you find any solution?

I am also facing a similar problem.

#3 | June 8, 2016, 06:13
Wojciech Gołąbek (Woj3x), New Member, Join Date: Dec 2013, Posts: 29
Unfortunately, I didn't find any information on how to use fsiFoam in parallel.

Probably it is necessary to modify the source code or wait for a new version.

#4 | February 20, 2018, 18:23
Wei Meng (Tomko), New Member, Join Date: May 2017, Posts: 12
Hi Wojciech, I am also facing the same problem... Just wondering, do you have a solution now?

#5 | January 13, 2019, 14:54
Stephen Waite, New Member, Join Date: May 2013, Location: Auckland, New Zealand, Posts: 29
For anyone still having this problem, there is an example of a parallel run in the tutorials of foam-extend 4.0, fsiFoam/beamInCrossFlow.

You need to decompose both the solid and fluid domains:

Code:
runApplication -l log.blockMesh.solid blockMesh -case ../solid
runApplication -l log.setSet.solid setSet -case ../solid -batch ../solid/setBatch
runApplication -l log.setToZones.solid setsToZones -case ../solid -noFlipMap

runApplication -l log.decomposePar.solid decomposePar -case ../solid -cellDist

runApplication blockMesh
runApplication setSet -batch setBatch
runApplication setsToZones -noFlipMap

runApplication decomposePar -cellDist
Both the solid and fluid case folders need decomposeParDict files in their system folders.

Also make sure that you use

Code:
./makeLinks fluid solid
not

Code:
./makeSerialLinks fluid solid
which most of the tutorial cases use because they are single-core simulations. If you use the serial links script for a parallel run, you will get the error:

Code:
Cannot find file "points" in directory "constant/solid/polyMesh".
This occurs because the polyMesh folder of each solid processor* directory has not been linked into the corresponding fluid processor* constant directory.
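As a quick sanity check (a hypothetical snippet, not part of the tutorial scripts; it assumes the link layout created by makeLinks), you can verify from inside fluid/ that every processor directory can see the decomposed solid mesh before launching fsiFoam:

Code:
# hypothetical check, run from inside fluid/ after decomposing and linking
for proc in processor*
do
    if [ -e $proc/constant/solid/polyMesh/points ]
    then
        echo "$proc: solid mesh link OK"
    else
        echo "$proc: missing constant/solid link"
    fi
done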

#6 | September 3, 2019, 10:37
Please need urgent help
Pradeepkumar Jagdale (pjagdale1), New Member, Join Date: Jun 2019, Location: Kharagpur, Posts: 7
I am trying to run fsiFoam in parallel. I have done:
1) decomposePar for fluid and solid (two subdomains each, scotch method)
2) ./makeLinks fluid solid
3) updated ./Allrun to ./AllrunPar using the tutorial fsiFoam/beamInCrossFlow.

But the case doesn't run.

log.fsiFoam:

Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | foam-extend: Open Source CFD                    |
|  \\    /   O peration     | Version:     4.0                                |
|   \\  /    A nd           | Web:         http://www.foam-extend.org         |
|    \\/     M anipulation  | For copyright notice see file Copyright         |
\*---------------------------------------------------------------------------*/
Build    : 4.0-268bb07d15d8
Exec     : fsiFoam -parallel
Date     : Sep 03 2019
Time     : 19:57:10
Host     : pradeep-HP-Pavilion-15-Notebook-PC
PID      : 24061
CtrlDict : "/home/pradeep/foam/parallelTest/HronTurekFsi3FE40/fluid/system/controlDict"
Case     : /home/pradeep/foam/parallelTest/HronTurekFsi3FE40/fluid
nProcs   : 2
Slaves : 
1
(
pradeep-HP-Pavilion-15-Notebook-PC.24062
)

Pstream initialized with:
    nProcsSimpleSum   : 0
    commsType         : nonBlocking
SigFpe   : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create dynamic mesh for time = 0

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: refVelocityLaplacian
Selecting motion diffusion: quadratic
Selecting motion diffusion: inverseDistance
 Reading stress mesh
Creating traction displacement boundary conditions
Creating fixed displacement boundary condition
Selecting rheology model linearElastic
Creating constitutive model
Selecting coupling scheme Aitken

Starting time loop

Creating pointHistory function object.
[0] History point ID: 133
[0] History point coordinates: (0.6 0.201111 0.01)
[0] Reference point coordinates: (0.6 0.2 0.025334)
Creating hronTurekReport function object.
Time = 0.001 (dt = 0.001)

Create extended GGI zone-to-zone interpolator
Checking fluid-to-solid face interpolator
[pradeep-HP-Pavilion-15-Notebook-PC:24061] *** Process received signal ***
[pradeep-HP-Pavilion-15-Notebook-PC:24061] Signal: Segmentation fault (11)
[pradeep-HP-Pavilion-15-Notebook-PC:24061] Signal code:  (-6)
[pradeep-HP-Pavilion-15-Notebook-PC:24061] Failing at address: 0x3e800005dfd
[pradeep-HP-Pavilion-15-Notebook-PC:24061] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x354b0)[0x7fe5631bf4b0]
[pradeep-HP-Pavilion-15-Notebook-PC:24061] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x38)[0x7fe5631bf428]
[pradeep-HP-Pavilion-15-Notebook-PC:24061] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x354b0)[0x7fe5631bf4b0]
[pradeep-HP-Pavilion-15-Notebook-PC:24061] [ 3] /home/pradeep/foam/foam-extend-4.0/lib/linux64GccDPOpt/libfiniteVolume.so(_ZN4Foam7Pstream6gatherINS_5FieldINS_6VectorIdEEEENS_5sumOpIS5_EEEEvRKNS_4ListINS0_11commsStructEEERT_RKT0_ii+0x226)[0x7fe567afaa26]
[pradeep-HP-Pavilion-15-Notebook-PC:24061] [ 4] /home/pradeep/foam/pradeep-4.0/lib/linux64GccDPOpt/libfluidSolidInteraction.so(_ZN4Foam6reduceINS_5FieldINS_6VectorIdEEEENS_5sumOpIS4_EEEEvRT_RKT0_ii+0xc7)[0x7fe5651a6747]
[pradeep-HP-Pavilion-15-Notebook-PC:24061] [ 5] /home/pradeep/foam/pradeep-4.0/lib/linux64GccDPOpt/libfluidSolidInteraction.so(_ZNK4Foam19fluidSolidInterface19calcGgiInterpolatorEv+0x5f9)[0x7fe5654aed49]
[pradeep-HP-Pavilion-15-Notebook-PC:24061] [ 6] /home/pradeep/foam/pradeep-4.0/lib/linux64GccDPOpt/libfluidSolidInteraction.so(_ZNK4Foam19fluidSolidInterface15ggiInterpolatorEv+0x19)[0x7fe5654af539]
[pradeep-HP-Pavilion-15-Notebook-PC:24061] [ 7] fsiFoam[0x402d22]
[pradeep-HP-Pavilion-15-Notebook-PC:24061] [ 8] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf0)[0x7fe5631aa830]
[pradeep-HP-Pavilion-15-Notebook-PC:24061] [ 9] fsiFoam[0x4031d9]
[pradeep-HP-Pavilion-15-Notebook-PC:24061] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 24061 on node pradeep-HP-Pavilion-15-Notebook-PC exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

#7 | September 5, 2019, 00:40
Aashay Tinaikar (ARTisticCFD), New Member, Join Date: May 2019, Location: Boston, Posts: 19
Hello guys,
Thanks for posting to this forum. Yes, it can be run in parallel.

0) After the blockMesh and setsToZones commands, run decomposePar separately inside fluid/ and solid/.


1) Make a separate .sh file in fluid/, e.g. linkParallel.sh.

2) Paste this inside it:

Code:
#!/bin/sh
# usage (from inside fluid/): ./linkParallel.sh fluid solid
# list every processor directory explicitly, or simply use: for proc in processor*
for proc in processor0 processor1 processor2 .... processorN
do
echo $proc
cd $proc
cd 0
ln -s ../../../$2/$proc/0 solid
cd ../constant
ln -s ../../../$2/$proc/constant solid
cd ../..
done

3) If your makeSerialLinks.sh is one directory higher, do:

>> cd ..
>> ./makeSerialLinks.sh fluid solid
>> cd fluid/

4) Then execute the following:
>> ./linkParallel.sh fluid solid

This should create the links for individual processor domains.

5) Execute the parallel run:
>> mpirun -np <num_sub_domains> fsiFoam -parallel
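If you drive the case from an Allrun-style script instead, the same step can be written with the tutorial run functions, as in the Allrun from the first post (a sketch; the subdomain count of 2 is just the value used earlier in this thread):

Code:
# equivalent call using the foam-extend tutorial RunFunctions
. $WM_PROJECT_DIR/bin/tools/RunFunctions
runParallel fsiFoam 2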

I tested this myself. Hope this works for you too. Please let me know!

Cheers :-)

Last edited by ARTisticCFD; September 5, 2019 at 00:41. Reason: Random additional spaces between the lines

#8 | September 5, 2019, 00:58
Please need urgent help
Pradeepkumar Jagdale (pjagdale1), New Member, Join Date: Jun 2019, Location: Kharagpur, Posts: 7
Quote:
Originally Posted by ARTisticCFD View Post
(post #7 quoted in full)
Hi sir,

Can you please share your decomposeParDict files for solid and fluid (urgent requirement)?

Also, regarding the file "makeLinks": there is a syntax error in the code (the loop is written in csh syntax). I have corrected it.

makeLinks, previously at line 20:

Code:
foreach proc(processor*)
cd $proc
cd 0
ln -s ../../../$2/$proc/0 solid
cd ../constant
ln -s ../../../$2/$proc/constant solid
cd ../..
end
Just modify it to:

Code:
for proc in processor*
do
cd $proc
cd 0
ln -s ../../../$2/$proc/0 solid
cd ../constant
ln -s ../../../$2/$proc/constant solid
cd ../..
done
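For illustration (paths are hypothetical and assume the script is called as ./makeLinks fluid solid, with the loop executing inside the fluid case directory, as the ../../../ relative paths imply), each iteration creates links such as:

Code:
# inside fluid/processor0/0/:
#   solid -> ../../../solid/processor0/0
# inside fluid/processor0/constant/:
#   solid -> ../../../solid/processor0/constant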
Thank you.
