CFD Online Forums > OpenFOAM

Open MPI error


August 3, 2020, 08:33
GrivalszkiP (Grivalszki Péter), Budapest, Hungary. Joined: Mar 2019

I get this error message whenever I try to run anything in parallel:

vituki@VHDT08:/mnt/d/griva_modellek/wavebreak$ mpirun -np 8 snappyHexMesh -parallel
There are not enough slots available in the system to satisfy the 8
slots that were requested by the application:


Either request fewer slots for your application, or make more slots
available for use.

A "slot" is the Open MPI term for an allocatable unit where we can
launch a process.  The number of slots available are defined by the
environment in which Open MPI processes are run:

  1. Hostfile, via "slots=N" clauses (N defaults to number of
     processor cores if not provided)
  2. The --host command line parameter, via a ":N" suffix on the
     hostname (N defaults to 1 if not provided)
  3. Resource manager (e.g., SLURM, PBS/Torque, LSF, etc.)
  4. If none of a hostfile, the --host command line parameter, or an
     RM is present, Open MPI defaults to the number of processor cores

In all the above cases, if you want Open MPI to default to the number
of hardware threads instead of the number of processor cores, use the
--use-hwthread-cpus option.

Alternatively, you can use the --oversubscribe option to ignore the
number of available slots when deciding the number of processes to
launch.

I use OpenFOAM v2006 and Open MPI v4.0.3. My computer has 4 cores (8 threads), and decomposePar runs without problems. There weren't any problems like this before. Thank you in advance!

August 17, 2020, 02:30
hokhay. Joined: Nov 2014
Have you checked whether multi-threading (Hyper-Threading/SMT) is enabled?
Check how many CPUs you actually have with the command 'htop'.
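Besides htop, a quick non-interactive way to compare logical CPUs (hardware threads) against physical cores is the following sketch, assuming the standard Linux tools nproc and lscpu are available:

```shell
# Number of logical CPUs (hardware threads) the OS exposes
nproc

# Physical layout: if "Thread(s) per core" is 2, SMT/Hyper-Threading is on,
# so only half of the logical CPUs are physical cores
lscpu | grep -E '^(CPU\(s\)|Thread\(s\) per core|Core\(s\) per socket)'
```

By default Open MPI counts slots from physical cores, not hardware threads, so a machine where nproc reports 8 with 2 threads per core has only 4 default slots.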

August 18, 2020, 11:01
GrivalszkiP (Grivalszki Péter), Budapest, Hungary
Thank you, I have fixed it:

Newer versions of Open MPI reject this kind of wrong input: requesting 8 processes when only 4 physical cores are available. You have two options:
- decompose for 4 processes (one per physical core) and likely gain performance
- add the option --use-hwthread-cpus and run your simulation on 8 hardware threads, at lower per-process performance
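For a case like the one in this thread, the two options could look as follows. This is a sketch: the solver name and the decomposeParDict edit are assumptions based on the original post, and the commands require an OpenFOAM/Open MPI installation.

```shell
# Option 1: match the 4 physical cores.
# In system/decomposeParDict set:  numberOfSubdomains 4;
decomposePar -force
mpirun -np 4 snappyHexMesh -parallel

# Option 2: keep 8 subdomains and let Open MPI count hardware
# threads (not just physical cores) as slots
mpirun --use-hwthread-cpus -np 8 snappyHexMesh -parallel
```

Note that with option 2 the 8 processes share 4 physical cores, which is why per-process performance drops.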

April 27, 2021, 08:41
Turin Turambar. Joined: Oct 2016

I had the same error before. My computer has 8 cores, and I was not able to run a simulation in parallel on more than 4 of them. I used the "-oversubscribe" flag, which allows more processes on a node than processing elements (from the mpirun man page), and it solved my problem. Here is an example:


mpirun -oversubscribe -np 8 interFoam -parallel | tee log.interFoam

I hope it is also a convenient way of solving this issue.

Best regards


Tags: error, mpirun, openmpi, parallel, slots
