
Error while running in parallel using OpenMPI on a local machine with 6 processors


May 20, 2012, 05:02   #1
suryawanshi_nitin, New Member
Join Date: Mar 2009 | Location: Pune, India | Posts: 27

When I run my case in parallel with the following command:
mpirun -np 6 pisoFoam -parallel > log &

I get the following error:

neptune@ubuntu:~/tutorials/incompressible/icoFoam/cavity$ mpirun -np 6 pisoFoam -parallel > log &
[1] 12387
neptune@ubuntu:~/tutorials/incompressible/icoFoam/cavity$ --------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] number of processor directories = 2 is not equal to the number of processors = 6
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 12388 on
node ubuntu exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
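A quick sanity check (a generic sketch added here, not something from the original post) is to count the processor directories in the case and compare that number with the -np value and with what decomposeParDict requested:

Code:
    ls -d processor* | wc -l                          # how many processorN directories exist
    grep numberOfSubdomains system/decomposeParDict   # how many subdomains were requested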


After this, I ran the parallel test as described in one of Bruno's links:

  1. Run mpirun with a test for launching MPI-less applications. For example, run each one of these, one at a time:
     Code:
         mpirun -np 2 bash -c "ls -l"
         mpirun -np 2 bash -c "export"
     The first one will show you the contents of the folder for each remotely launched bash shell. The second one will show you the environment variables for each remote shell.
     If neither of these works, then your MPI installation isn't working.
  2. Build the test application designed for these kinds of tests:
     Code:
         cd $WM_PROJECT_DIR
         wmake applications/test/parallel
     Now go to your case that has the decomposePar already done.
     Then run the following scenarios:

Up to step 2 everything works fine, but when I run parallelTest I get the following error:

neptune@ubuntu:~/tutorials/incompressible/icoFoam/cavity$ parallelTest
parallelTest: command not found

Thanks in advance... please help me with this.

May 20, 2012, 08:28   #2
adhiraj (Adhiraj), Senior Member
Join Date: Sep 2010 | Location: Maryland, United States | Posts: 102

Why does it complain that you have 2 processor directories and are trying to run with 6 processors?

May 20, 2012, 10:45   #3
wyldckat (Bruno Santos), Super Moderator
Join Date: Mar 2009 | Location: Lisbon, Portugal | Posts: 9,635

Greetings to both!

@suryawanshi_nitin: Adhiraj is right, the decomposition apparently didn't go as you expected. Check your "system/decomposeParDict".
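As a minimal sketch (the decomposition method below is only an example, not taken from your case), "system/decomposeParDict" for a 6-processor run should request 6 subdomains:

Code:
    // system/decomposeParDict (sketch; the usual FoamFile header is omitted)
    numberOfSubdomains 6;       // must match the -np value given to mpirun

    method          scotch;     // example method; others such as simple or
                                // hierarchical need their own coefficient entries

After editing it, remove the old processor* directories and run decomposePar again so that 6 processor directories are generated.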

As for parallelTest, as of 2.0.0 it has been renamed to Test-parallel.
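For example (a sketch, assuming the standard test-source location and a case that has already been decomposed for 6 processors), it can be built and run like this:

Code:
    cd $WM_PROJECT_DIR
    wmake applications/test/parallel
    cd /path/to/your/case        # hypothetical case path
    mpirun -np 6 Test-parallel -parallel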

Best regards,
Bruno

May 20, 2012, 13:37   #4
suryawanshi_nitin, New Member
Join Date: Mar 2009 | Location: Pune, India | Posts: 27

Thanks for your valuable replies.
Yes, you are right. In my actual case I am working with 6 processors, but it was giving an issue, so I thought of checking with a simple case first. Below is the error message from the actual case. I had already solved this case up to 1.7 s in the OF 2.0.1 Debian pack, with the domain decomposed for 6 processors. Now I am using the same data to continue the run in an OF 2.1.0 source-pack installation, but when running it beyond 1.7 s I get the following error...
(And Test-parallel is working now.)

neptune@ubuntu:~/nitin/s$ mpirun -np 6 pisoFoam -parallel > log &
[1] 2865
neptune@ubuntu:~/nitin/s$ [5]
[5]
[5] --> FOAM FATAL IO ERROR:
[5] essential value entry not provided
[5]
[5] file: /home/neptune/nitin/s/processor5/1.77/phi::boundaryField::symmetryBottom from line 59453 to line 59453.
[5]
[5] From function fvsPatchField<Type>::fvsPatchField
(
const fvPatch& p,
const DimensionedField<Type, surfaceMesh>& iF,
const dictionary& dict
)
[5]
[5] in file lnInclude/fvsPatchField.C at line 110.
[5]
FOAM parallel run exiting
[5]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 5 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 5 with PID 2871 on
node ubuntu exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------

Kindly awaiting your replies. Thanks in advance.

Nitin

May 21, 2012, 08:31   #5
SirWombat (Jan), Member
Join Date: Dec 2009 | Location: Berlin | Posts: 50

Quote:
Originally Posted by suryawanshi_nitin
[5] --> FOAM FATAL IO ERROR:
[5] essential value entry not provided
[5]
[5] file: /home/neptune/nitin/s/processor5/1.77/phi::boundaryField::symmetryBottom from line 59453 to line 59453.
What OpenFOAM is trying to tell you: your "symmetryBottom" patch is missing a value entry in your boundary setup. Be sure to provide all the required entries!
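In other words, every patch entry in that boundaryField needs its required keywords; for a value-carrying patch type, something along these lines is expected (the patch type and value below are placeholders, not taken from the actual case):

Code:
    // inside processor5/1.77/phi -> boundaryField (sketch with placeholder entries)
    symmetryBottom
    {
        type            calculated;
        value           uniform 0;   // the "essential value entry" the error refers to
    }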

May 21, 2012, 14:13   #6
suryawanshi_nitin, New Member
Join Date: Mar 2009 | Location: Pune, India | Posts: 27

Thank you for your reply.
It is working well now, but I started the case from the start time, i.e. 0.0 s in the controlDict file, as a completely new simulation. What I understand from this is that if we have solution data from the OF 2.0.1 Debian pack and want to continue that case in an OF 2.1.0 source-pack installation, then OF 2.1.0 is unable to understand/handle the old data from the old version, especially for a parallel case. That is my interpretation. Thank you for your valuable time; if anyone has a clearer view of this, they are most welcome to share it.
This is the way to learn fast and with more clarity.
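A possible workaround (just an assumption, not something verified in this thread) would be to reconstruct the case with the old installation and re-decompose it with the new one, for example:

Code:
    # with the OF 2.0.1 environment sourced, in the case directory:
    reconstructPar -latestTime

    # then, with the OF 2.1.0 environment sourced:
    rm -rf processor*
    decomposePar

After that, the run could be continued from the reconstructed latest time with mpirun as before.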


regards
Nitin Suryawanshi.

