Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Running, Solving & CFD

problem with "running in parallel" >> error

April 15, 2014, 15:54 | #1
Tommy Sp (vitorspadeto), Member, joined Jan 2014
Hello! I was trying to run icoFoam in parallel, but the following error occurred. Does anyone know what I should do to fix it? Thank you.

Note: I have a Core i7 (8 cores).



Code:
s@s-Aspire-V371:~/Dropbox//icoFoam/cavity$
mpirun -np 8 icoFoam -parallel > log &
[1] 17678
s@s-Aspire-V3-571:~/Dropbox/Openfoam_PG/tutorials/incompressible/icoFoam/cavity$ [0]
[0]
[0] --> FOAM FATAL ERROR:
[0] Cannot read "/home/s/Dropbox/Openfoam_PG/tutorials/incompressible/icoFoam/cavity/system/decomposeParDict"
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 17679 on
node s-Aspire-V3-571 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------

April 15, 2014, 16:48 | #2
Bruno Santos (wyldckat), Retired Super Moderator, Lisbon, Portugal
Greetings Vitor,

What message do you get if you run the following command?
Code:
ls -l /home/s/Dropbox/Openfoam_PG/tutorials/incompressible/icoFoam/cavity/system/decomposeParDict
Best regards,
Bruno

April 15, 2014, 16:59 | #3
Tommy Sp (vitorspadeto), Member
Code:
 ls -l /home/s/Dropbox/Openfoam_PG/tutorials/incompressible/icoFoam/cavity/system/decomposeParDictclear
ls: não é possível acessar /home/s/Dropbox/Openfoam_PG/tutorials/incompressible/icoFoam/cavity/system/decomposeParDictclear: Arquivo ou diretório não encontrado
Translating to English:

Quote:
ls -l /home/s/Dropbox/Openfoam_PG/tutorials/incompressible/icoFoam/cavity/system/decomposeParDictclear
ls: cannot access /home/s/Dropbox/Openfoam_PG/tutorials/incompressible/icoFoam/cavity/system/decomposeParDictclear: No such file or directory

April 15, 2014, 17:00 | #4
Tommy Sp (vitorspadeto), Member
I just typed blockMesh and then ran icoFoam in parallel as shown above.

April 16, 2014, 04:46 | #5
Alexey Matveichev (alexeym), Senior Member, Nancy, France
Hi,

Before running a case in parallel, you have to decompose it with the decomposePar utility. This utility reads system/decomposeParDict for the number of subdomains and the decomposition method. You can find an example of the file in, for example, $FOAM_TUTORIALS/multiphase/interFoam/ras/damBreak/system.
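For reference, a minimal system/decomposeParDict for 8 subdomains could look like the sketch below. The (4 2 1) split with the simple method is just one way to divide the cavity mesh into 8 pieces; adjust the counts (or switch to the scotch method, which needs no coefficients) to suit your own mesh:

Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains 8;

method          simple;

simpleCoeffs
{
    n               (4 2 1);   // pieces in x, y, z; product must equal numberOfSubdomains
    delta           0.001;
}
After decomposePar runs, you should see processor0 through processor7 directories in the case; once the parallel run finishes, reconstructPar merges the results back into a single dataset.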

And the sequence of commands for your case should be

Code:
$ blockMesh
$ decomposePar
$ mpiexec -np 8 icoFoam -parallel > log 2>&1 &

April 16, 2014, 09:49 | #6
Tommy Sp (vitorspadeto), Member
Thanks bro... it worked!
