OpenFOAM parallel error on HPC facility

March 19, 2018, 13:48   #1
andrewmichael (Bota Amdrei), New Member

Hello everyone,

I am trying to submit an OpenFOAM job on an HPC facility. I have a problem when the parallel simulation starts; in the log file I receive this:

--> FOAM FATAL ERROR:
[3] UPstream::init(int& argc, char**& argv) : environment variable
MPI_BUFFER_SIZE not defined[5]

From what I found online, the problem is that OpenFOAM runs on an MPI platform and an MPI module has to be loaded to solve this. In my job file, I used this command to load OpenFOAM together with its MPI platform:
module load apps/openfoam/4.1/gcc-6.2-openmpi-2.0.1

My job file is this one:

#! /bin/bash
#$ -o Openfoam_ReconstructLog.txt
#$ -j y
#$ -m bea -M ambota1@sheffield.ac.uk
#$ -pe openmpi-ib 27
# (I also tried the "-pe mpi" parallel environment)
#$ -l h_rt=03:00:00
#$ -l rmem=16G

module load apps/openfoam/4.1/gcc-6.2-openmpi-2.0.1
fluentMeshToFoam Wing1to28_structured_refine.msh
decomposePar -force
mpirun -np 27 simpleFoam -parallel > Residuals &

March 19, 2018, 17:17   #2
ykanani (Yousef), Member

Quote:
Originally Posted by andrewmichael (post #1, quoted in full)

Hi,
You need to load the appropriate OpenMPI module as well. Have you done that? It seems you need OpenMPI version 2.0.1, which should be available on your system. Typically the OpenFOAM module itself takes care of this; you can check by loading the OpenFOAM module on the login node and then running "module list" to see the currently loaded modules.
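
For example, a minimal check could look like this (module names are site-specific, so the MPI module name below is only a guess; use "module avail" to find the real one on your cluster):

module load apps/openfoam/4.1/gcc-6.2-openmpi-2.0.1
module list                            # an OpenMPI 2.0.1 module should appear here
module avail 2>&1 | grep -i openmpi    # module avail prints to stderr, hence 2>&1
# if no MPI module was pulled in automatically, load one explicitly, e.g.:
# module load mpi/gcc/openmpi/2.0.1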

Each HPC system has its own configuration, modules, etc., so you can expect a more accurate answer, and a quicker resolution, by contacting the technical support team of the HPC.

Regards,

March 19, 2018, 17:22   #3
andrewmichael (Bota Amdrei), New Member

When I load the OpenFOAM module, the MPI platform is automatically loaded too. That's the problem: I tried to load a separate MPI module as you suggested, but it did not work.

Best wishes,
Andrei

March 21, 2018, 10:23   #4
ykanani (Yousef), Member

Quote:
Originally Posted by andrewmichael (post #3, quoted in full)

If MPI is loaded properly, the following environment variables are typically set:

MPI_ARCH_PATH
MPI_BUFFER_SIZE
FOAM_MPI

After loading the OpenFOAM module, you can view the value of each of these variables with the echo command, e.g. echo $MPI_BUFFER_SIZE. If it prints nothing, the variable has not been set properly by the modules you loaded.

It would help if you could post the output of the following commands after loading the module:

echo $MPI_BUFFER_SIZE
echo $FOAM_MPI
echo $MPI_ARCH_PATH
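
If $MPI_BUFFER_SIZE turns out to be empty, one workaround (a sketch, not a guaranteed fix) is to export it manually in the job script before mpirun. OpenFOAM's own etc/config.sh/settings normally enforces a minimum of 20000000, so that value should be safe:

export MPI_BUFFER_SIZE=20000000    # default minimum OpenFOAM's settings file would apply
mpirun -np 27 simpleFoam -parallel > Residuals

This only masks the symptom, though; the likely underlying issue is that the module did not source OpenFOAM's environment completely.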

Again, I strongly recommend contacting the technical support team of your HPC system, since they installed OpenFOAM themselves and know the appropriate procedure for using it.

Regards,
