
interDyMFoam parallel bug?

March 30, 2009, 07:32   #1
Nikolaos Spyrou (nikos_fb16), New Member
Hello Mattijs,

here is an uploaded test case (see the attached file at the end of this post):

And here is a link to refresh what this thread is about:

http://www.cfd-online.com/Forums/ope...-parallel.html

In the test case I have included a README file that summarises the cases in which the error occurs. I found that the error in parallel runs appears and disappears depending on the domain decomposition.
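
For context, the decomposition is controlled by system/decomposeParDict. A minimal sketch of the kind of change that makes the error come and go (the numbers here are placeholders, not the settings from the attached case):

Code:
  // system/decomposeParDict -- split the domain for 2 processors
  numberOfSubdomains 2;

  method          simple;

  simpleCoeffs
  {
      n               (2 1 1);   // split along x; try (1 2 1) to split along y instead
      delta           0.001;
  }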

The test case is a Rayleigh-Taylor instability.

Thank you

Nikos
Attached Files
File Type: gz testingInterDyMFoam.gz (4.1 KB, 45 views)

March 31, 2009, 07:07   #2
Mattijs Janssens (mattijs), Senior Member
Hi Nikos,

Thanks for reporting. In the 1.5 series the pressure reference cell still has to be on the master processor. In interDyMFoam the reference is set by a location, and in your case that location lies on the second processor. It runs fine if you move the location onto the master processor. I've pushed a change to 1.5.x interDyMFoam so that it will at least tell you when the reference cell is not on the master.
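
For readers hitting the same message, a hedged sketch of what such a location-based reference entry in system/fvSolution might look like. The exact keyword differs between releases (Mattijs calls it pRefProbe below; other versions use pdRefPoint or pRefPoint), and the sub-dictionary name also varies (PISO in the 1.5-series solvers, PIMPLE later), so check the solver source or a tutorial case for your version. The coordinates are placeholders and should be a point inside the part of the mesh assigned to processor0:

Code:
  // system/fvSolution -- solver control sub-dictionary read by interDyMFoam
  PISO
  {
      nCorrectors              3;
      nNonOrthogonalCorrectors 0;

      // Location-based pressure reference; the keyword name is version-dependent.
      // Pick a point inside the master processor's (processor0) part of the
      // decomposed mesh.
      pdRefPoint      (0 0 0);
      pdRefValue      0;
  }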

March 31, 2009, 07:21   #3
Sebastian Gatzka (sega), Senior Member
Hi Mattijs,

Thanks for your response. But how do you determine whether a cell is on the master processor?

Since the decomposition in this case is done in the x-direction (splitting with a zy-plane), can one assume that cell number 0 lies in the first part of the split domain and hence on the first (master) processor?

March 31, 2009, 07:56   #4
Nikolaos Spyrou (nikos_fb16), New Member
Hi Mattijs,

thanks for the answer and for the change in 1.5.x; I'm compiling now.
What I did is the following:

After the domain decomposition I looked at the cell labels in "processor0/constant/polyMesh/cellProcAddressing" and chose one of them as the pdRefCell entry in my fvSolution file.

The error still occurs. Was my reasoning too simplistic?
(I have already tried several pdRefCell values at random, but the error always appears, which seems at odds with a roughly fifty-fifty chance of hitting a proper cell.)
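
For anyone trying the same inspection, a minimal sketch of that lookup (assuming the case has already been decomposed with decomposePar). cellProcAddressing maps each processor-local cell index to its global index in the undecomposed mesh, so the labels listed for processor0 are the global cell IDs that ended up on the master:

Code:
  # show the start of processor0's cell map (global cell labels on the master)
  head -n 40 processor0/constant/polyMesh/cellProcAddressing
As the next reply points out, though, this does not help for interDyMFoam, because that solver takes a reference location rather than a cell index.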

March 31, 2009, 08:19   #5
Mattijs Janssens (mattijs), Senior Member
interDyMFoam does not use pRefCell; it uses pRefProbe, which is a location.

March 31, 2009, 08:24   #6
Nikolaos Spyrou (nikos_fb16), New Member

Ok, thanks a lot.

March 4, 2011, 14:49   #7
Mark Beal (msbealo), New Member
Hi guys,

Is there a solution to this interDyMFoam parallel-processor problem?

I had one case that worked fine, but when I replaced the STL file with the actual geometry I was interested in (a yacht hull), interDyMFoam no longer wanted to run in parallel.

I'm using OF 1.7.1 with Ubuntu 10.10.

Kind Regards,

Mark

January 19, 2018, 05:06   #8
Benjamin (Benji), New Member
EDIT: It seems to work when I overwrite the baffles and start the simulation at t=0. I'm fine with this now (though I still don't understand why it worked with interFoam).

Hey everyone,

Sorry to dig this thread up, but I'm stuck with a similar problem.
I'm running a dynamic case (rotating AMI). The simulation works fine with interFoam in parallel and with interDyMFoam in serial, but NOT with interDyMFoam in parallel.

It says it cannot find the points file in this directory: "/home/benji/OpenFOAM/benji-5.0/run/SuperPipe_2/processor1/constant/polyMesh/points". Since I start at 0.002, the directory where the files actually are is "/home/benji/OpenFOAM/benji-5.0/run/SuperPipe_2/processor1/0.002/polyMesh/points". Does anyone know why? As I said, interFoam in parallel works. Does interDyMFoam in parallel require a different way of decomposing, or different path names?

Ben

This is the error:
Code:
  benji@ubuntu:~/OpenFOAM/benji-5.0/run/SuperPipe_2$ mpirun -np 4 interDyMFoam -parallel > log &
  [1] 14745
  benji@ubuntu:~/OpenFOAM/benji-5.0/run/SuperPipe_2$ [1]
  [1]
  [1] --> FOAM FATAL ERROR:
  [1] cannot find file "/home/benji/OpenFOAM/benji-5.0/run/SuperPipe_2/processor1/constant/polyMesh/points"
  [1]
  [1]     From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
  [1]     in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 505.
  [1]
  FOAM parallel run exiting
  [1]
  [0] --------------------------------------------------------------------------
  MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
  with errorcode 1.
   
  NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
  You may or may not see output from other processes, depending on
  exactly when Open MPI kills them.
  --------------------------------------------------------------------------
  [2]
  [2]
  [2] --> FOAM FATAL ERROR:
  [2] cannot find file "/home/benji/OpenFOAM/benji-5.0/run/SuperPipe_2/processor2/constant/polyMesh/points"
  [2]
  [2]     From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
  [2]     in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 505.
  [2]
  FOAM parallel run exiting
  [2]
  [3]
  [3]
  [3] --> FOAM FATAL ERROR:
  [3] cannot find file "/home/benji/OpenFOAM/benji-5.0/run/SuperPipe_2/processor3/constant/polyMesh/points"
  [3]
  [3]     From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
  [3]     in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 505.
  [3]
  FOAM parallel run exiting
  [3]
   
  [0]
  [0] --> FOAM FATAL ERROR:
  [0] cannot find file "/home/benji/OpenFOAM/benji-5.0/run/SuperPipe_2/processor0/constant/polyMesh/points"
  [0]
  [0]     From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
  [0]     in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 505.
  [0]
  FOAM parallel run exiting
  [0]
  --------------------------------------------------------------------------
  mpirun has exited due to process rank 3 with PID 14749 on
  node ubuntu exiting improperly. There are two reasons this could occur:
   
  1. this process did not call "init" before exiting, but others in
  the job did. This can cause a job to hang indefinitely while it waits
  for all processes to call "init". By rule, if one process calls "init",
  then ALL processes must call "init" prior to termination.
   
  2. this process called "init", but exited without calling "finalize".
  By rule, all processes that call "init" MUST call "finalize" prior to
  exiting or it will be considered an "abnormal termination"
   
  This may have caused other processes in the application to be
  terminated by signals sent by mpirun (as reported here).
  --------------------------------------------------------------------------
  [ubuntu:14745] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
  [ubuntu:14745] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
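
A hedged sketch of one possible workaround for the missing constant/polyMesh files (this is an assumption based on the error above, not the fix described in the EDIT): if the decomposed mesh only exists under each processor's 0.002 directory, copying it back under constant/ lets the solver find it at start-up. The paths are taken from the error log; adjust the time-directory name to your case.

Code:
  # run from the case directory, e.g. ~/OpenFOAM/benji-5.0/run/SuperPipe_2
  # copy the per-processor mesh written at t = 0.002 back under constant/
  for proc in processor*
  do
      mkdir -p "$proc/constant/polyMesh"
      cp -r "$proc/0.002/polyMesh/." "$proc/constant/polyMesh/"
  done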