Home > Forums > OpenFOAM Running, Solving & CFD

damBreak case parallel run problem


July 25, 2015, 18:52   #1
damBreak case parallel run problem [solved]
behzad Ghasemi (New Member)
Join Date: Sep 2013 | Location: Iran | Posts: 15
Hi dear Foamers,

I have a problem with parallel processing in OpenFOAM. I have seen some related threads, but I think my problem is a little different, so I am posting a new one.

Every time I try to run the damBreak case I get the error below:
Code:
(OF:2.4.0-Opt) behzad@behzad:~/Documents/damBreak$ mpirun -np 4 interFoam -parallel > log &
[1] 14702
(OF:2.4.0-Opt) behzad@behzad:~/Documents/damBreak$ [0] 
[0] 
[0] --> FOAM FATAL ERROR: 
[0] interFoam: cannot open case directory "/home/behzad/Documents/damBreak/processor0"
[0] 
[0] 
FOAM parallel run exiting
[0] 
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 14703 on
node behzad exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------

[1]+  Exit 1                  mpirun -np 4 interFoam -parallel > log
It is not just this case; every other case I try fails in the same way.
I am running on a Dell XPS L502X laptop: Core i7-2630QM CPU, 12 GB of RAM, 10 GB of swap space, Ubuntu 12.04.2 LTS, Linux 3.16.0-30-generic (x86_64).
My Open MPI version is 1.6.5.

I have tested this on several versions of OpenFOAM (2.4, 2.3, and 3.1-ext) and got similar errors every time.
I have searched the forum and found a couple of old threads, but they did not solve the problem. Please give me some concrete steps; I am not a Linux expert.
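For reference, the sequence I follow is the standard damBreak tutorial one, sketched below. The `step` wrapper is not an OpenFOAM tool; it is only an assumption added here so each command is skipped with a note when OpenFOAM is not sourced in the current shell:

```shell
# Sketch of the usual damBreak parallel sequence (OpenFOAM tutorial
# defaults). step() is a hypothetical helper: it runs a command only
# when the tool is on the PATH, so the snippet is safe to paste into
# a shell where OpenFOAM is not sourced.
step() {
    if command -v "$1" >/dev/null 2>&1; then
        "$@"
    else
        echo "skipped ($1 not in PATH): $*"
    fi
}

step blockMesh                      # build the mesh
step setFields                      # initialise the alpha (water) field
step decomposePar                   # split the case into processor* dirs

# Run the solver only if it exists; plain mpirun may be installed even
# without OpenFOAM, so guard on interFoam instead of mpirun.
if command -v interFoam >/dev/null 2>&1; then
    mpirun -np 4 interFoam -parallel > log 2>&1
else
    echo "skipped (interFoam not in PATH): mpirun -np 4 interFoam -parallel"
fi
```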

This is the log file:

https://www.dropbox.com/s/9p4l4qxepkqdz7c/log?dl=0

Regards

Last edited by behzad-cfd; August 3, 2015 at 04:29.

July 26, 2015, 15:05   #2
behzad Ghasemi (New Member)
Any ideas?
Why is nobody answering my question?!
It's very disappointing ...

July 26, 2015, 16:12   #3
Bruno Santos (Super Moderator)
Join Date: Mar 2009 | Location: Lisbon, Portugal | Posts: 9,558
Quick question: Did you run decomposePar before running mpirun?

July 27, 2015, 03:48   #4
behzad Ghasemi (New Member)
Quote:
Originally Posted by wyldckat View Post
Quick question: Did you run decomposePar before running mpirun?
Thanks for your reply, Bruno. I thought my thread was invisible! :-)
Yes, I ran decomposePar before I posted this thread, but nothing changed.

August 2, 2015, 09:11   #5
Bruno Santos (Super Moderator)
Hi behzad,

I've finally managed to take a quick look at your problem... and this is disconcerting. I was expecting that you had provided the case on Dropbox, but instead you provided the log file.

From what I can figure out, based on the little information you've provided, it seems that you didn't notice the complaints that decomposePar gave you, because interFoam is complaining that:
Quote:
Code:
[0] --> FOAM FATAL ERROR: 
[0] interFoam: cannot open case directory "/home/behzad/Documents/damBreak/processor0"
That means the folder either does not exist or has the wrong permissions for you to access it.

Please provide the following details (which are outlined in this thread: How to give enough info to get help):
  1. Provide the log file for the decomposePar operation. You can create a detailed log file by running:
    Code:
    decomposePar > log.decomposePar 2>&1
    The resulting file "log.decomposePar" has the complete output text, including any error messages.
  2. Provide the list of contents of the case folder, by running:
    Code:
    ls -l

If you're not familiar with how to use the command line in a Linux system, please study one or two tutorials about it. This page might help you get started: http://openfoamwiki.net/index.php/In...with_the_Shell
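The second check can also be automated. Below is a minimal sketch; `check_decomposed` is a hypothetical helper (not an OpenFOAM utility) that verifies the processor directories exist and are readable:

```shell
# Hypothetical helper (not part of OpenFOAM): verify that a case
# directory contains readable processorN sub-directories, i.e. that
# decomposePar has actually run before `mpirun ... -parallel`.
check_decomposed() {
    case_dir="$1"
    if [ -d "$case_dir/processor0" ] && [ -r "$case_dir/processor0" ]; then
        echo "ok: found $(ls -d "$case_dir"/processor* | wc -l) processor directories"
    else
        echo "missing: run decomposePar (or fix permissions) in $case_dir"
    fi
}

# Demo on a scratch directory standing in for the real case folder.
demo_case=$(mktemp -d)
mkdir "$demo_case/processor0" "$demo_case/processor1"
check_decomposed "$demo_case"
```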

Best regards,
Bruno

August 2, 2015, 17:18   #6
behzad Ghasemi (New Member)
Hi Bruno,

Thank you for accepting my request and for your kind answer. I searched the OpenFOAM bug tracker a few days ago and found a report exactly like my problem, which you had answered (bug ID 0000301): http://www.openfoam.org/bugs/

The problem there was an illegal machine name. I had reinstalled my Linux for other reasons and changed my machine name as well, but I had not tested a parallel run again until you asked me to create the log files. I repeated exactly the same steps that had failed before, and this time everything worked without any problem.

So I think the cause was an inappropriate machine name, and the problem is solved.

Thank you again, Bruno.

Best regards,
Behzad

Last edited by behzad-cfd; August 3, 2015 at 04:34.


Tags
dambreak, mpirun error, openfoam, parallel error

