June 14, 2015, 03:38 |
parallel Simulation fail
#1
New Member
daniel
Join Date: Jun 2015
Posts: 22
Rep Power: 10
Hello
I am currently trying to run a parallel simulation on two quad-core nodes. Both have OpenFOAM installed and both are reachable on the network (one is the DHCP server). I set up the case on the main node with decomposePar for 8 CPUs. I then run

mpirun -hostfile Machines -np 8 icoFoam -parallel

It asks me for the second node's password and then it runs (sort of). The terminal prints the following:

/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.4.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.4.0-dcea1e13ff76
Exec   : icoFoam -parallel
Date   : Jun 14 2015
Time   : 06:48:14
Host   : "Node1"
PID    : 5762

That's all I get. All 8 CPUs sit at 100% and stay that way; it never finishes, even for a simulation that should only take a few seconds. Anyone got any ideas why, or how to fix this?
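For reference, here is a minimal sketch of the kind of hostfile and launch sequence described above. The hostnames ("node1", "node2") are placeholders, and the `slots=` syntax is the standard Open MPI hostfile convention, not necessarily the exact file used here:

```shell
# "Machines" hostfile: one line per node; "slots" tells Open MPI how
# many ranks each node may run. Hostnames are placeholders.
cat > Machines <<'EOF'
node1 slots=4
node2 slots=4
EOF

# Decompose the case for 8 processors, then launch 8 ranks across both
# nodes (run from the case directory; guarded so the sketch only runs
# where OpenFOAM and Open MPI are actually installed):
if command -v decomposePar >/dev/null && command -v mpirun >/dev/null; then
    decomposePar
    mpirun --hostfile Machines -np 8 icoFoam -parallel
fi
```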
June 14, 2015, 06:40 |
#2
New Member
daniel
Join Date: Jun 2015
Posts: 22
Rep Power: 10
Update
I just attempted the simulation again while monitoring both nodes. When I start the simulation, both nodes have 4 instances of icoFoam running, each core at 100%. But again it just runs like this indefinitely, even for a really small simulation.
June 26, 2015, 15:25 |
Update 2
#3
New Member
daniel
Join Date: Jun 2015
Posts: 22
Rep Power: 10
Well, I managed to solve the previous error; the problem was the SSH key.
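For anyone hitting the same hang: the fix is setting up passwordless SSH so mpirun can start the remote ranks without prompting. A rough sketch, where the key name and the "pi@node2" target are placeholders for your own setup:

```shell
# Generate a passphrase-less key pair (the key name is just an example;
# remove any leftover example key first so ssh-keygen does not prompt):
mkdir -p "$HOME/.ssh"
rm -f "$HOME/.ssh/id_rsa_cluster" "$HOME/.ssh/id_rsa_cluster.pub"
ssh-keygen -t rsa -N "" -q -f "$HOME/.ssh/id_rsa_cluster"

# Install the public key on the second node so SSH (and therefore
# mpirun) stops asking for a password. "pi@node2" is a placeholder for
# the real user@host, so the line is shown but not run here:
#   ssh-copy-id -i "$HOME/.ssh/id_rsa_cluster.pub" pi@node2
```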
Now I have found a new error when I run the same case. Open MPI seems to initialise and opens 4 instances of icoFoam on each node, but then it crashes and gives the following error:

/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.4.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.4.0-dcea1e13ff76
Exec   : icoFoam -parallel
Date   : Jun 26 2015
Time   : 18:44:41
Host   : "Master"
PID    : 9870
Case   : /home/pi/Desktop/cavity
nProcs : 8
Slaves : 7
(
"Master.9871"
"Master.9872"
"Master.9873"
"Slave1.8215"
"Slave1.8216"
"Slave1.8217"
"Slave1.8218"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

[4] --> FOAM FATAL ERROR:
[4] Cannot find file "points" in directory "polyMesh" in times 0 down to constant
[4]
[4]     From function Time::findInstance(const fileName&, const word&, const IOobject::readOption, const word&)
[4]     in file db/Time/findInstance.C at line 203.
[4] FOAM parallel run exiting

(ranks [5], [6] and [7] print the same FATAL ERROR)

--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 7 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
mpirun has exited due to process rank 4 with PID 8215 on
node 192.168.0.51 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[Master:09866] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[Master:09866] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Has anyone encountered this error before, or knows how to fix it?
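The 'Cannot find file "points"' error only on ranks 4-7 (the second node) usually means that node cannot see the decomposed case: every node needs the case directory, with its processor* subdirectories plus constant/ and system/, at the same absolute path, either on a shared filesystem such as NFS or copied across manually. A rough local illustration of the layout, with the copy step shown as a comment because the remote address is a placeholder:

```shell
# Layout every node must be able to read (paths here are examples):
CASE=./cavity
mkdir -p "$CASE/constant/polyMesh" "$CASE/system"
for i in 0 1 2 3 4 5 6 7; do
    mkdir -p "$CASE/processor$i"   # normally created by decomposePar
done

# On a real two-node run, mirror the case to the second node at the
# SAME absolute path, e.g. (placeholder address, matching the log):
#   rsync -a ~/Desktop/cavity pi@192.168.0.51:~/Desktop/
```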
December 16, 2015, 16:11 |
#4
Senior Member
nasir musa yakubu
Join Date: Mar 2014
Location: Birmingham
Posts: 109
Rep Power: 12
I have the same issue, as posted in this thread:
http://www.cfd-online.com/Forums/ope...tml#post577812
I hope someone has a solution to this problem. Thanks, Nas
December 17, 2015, 04:34 |
#5
New Member
daniel
Join Date: Jun 2015
Posts: 22
Rep Power: 10
Quote:
December 17, 2015, 08:11 |
#6
Senior Member
nasir musa yakubu
Join Date: Mar 2014
Location: Birmingham
Posts: 109
Rep Power: 12
Hello Daniel, thanks for your prompt reply. However, I am a Salome newbie. What do you mean by a file-sharing system? You didn't use the path of the case directory when exporting the mesh?
Thanks
December 17, 2015, 09:00 |
#7
New Member
daniel
Join Date: Jun 2015
Posts: 22
Rep Power: 10
I think we are talking about two different issues here. Are you running the simulation on one machine? If so, I don't have an answer for you; I am not familiar with Salome.