CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   big difference between clockTime and executionTime (https://www.cfd-online.com/Forums/openfoam-solving/121576-big-difference-between-clocktime-executiontime.html)

katakgoreng February 19, 2014 07:20

Hi Bruno,

I managed to get OpenFOAM running across 2 nodes of the cluster using the method you proposed. :D

This time I don't copy the case folder to each node's temporary folder; instead, I point every OpenFOAM application at the shared case directory using the "-case" option.

PBS job script that I use:

Code:

#!/bin/bash
#
# --- SET THE PBS DIRECTIVES
#PBS -l walltime=2:00:00
#PBS -l select=2:ncpus=16:mpiprocs=16:mem=4000mb
#PBS -e ehk112_err               
#PBS -o ehk112_out               
#PBS -m ae
#PBS -M erwanhafizi@hotmail.com
#PBS -V
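
# Note: "select=2:ncpus=16:mpiprocs=16" asks PBS for 2 nodes with 16 cores
# and 16 MPI ranks each, i.e. the 32 processes that mpirun launches in the
# Allrun script below.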

echo "============================================="
echo "FOLDER LOCATION AND NAME"
echo "============================================="
CASEFOLDER="damBreak32"
CASELOCATION="$WORK/CASEFOLDER/"   # "CASEFOLDER" here is a literal directory name under $WORK, not the variable above
echo $CASEFOLDER
echo $CASELOCATION

echo "============================================="
echo "SOURCING SYSTEM BASHRC"
echo "============================================="
. $HOME/.bash_profile

echo "============================================="
echo "SOURCING OPENFOAM 2.2.x BASHRC"
echo "============================================="
. /home/ehk112/OpenFOAM/OpenFOAM-2.2.x/etc/bashrc

echo "============================================="
echo "CD TO CASE FOLDER"
echo "============================================="
cd $CASELOCATION/$CASEFOLDER

echo "============================================="
echo "RUNNING OPENFOAM BATCH SCRIPT"
echo "============================================="
./Allrun
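
For anyone adapting this: a script like the one above is submitted from the login node with qsub. The file name below is just illustrative (whatever you saved the script as):

Code:

qsub damBreak32.pbs    # submit the job to the PBS queue
qstat -u $USER         # check the job's status in the queue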

The OpenFOAM batch script (Allrun):
Code:

#!/bin/bash

# ===============================
# PREPARE CASES
# ===============================
rm -f log.*    # -f so this doesn't complain on the first run, when no logs exist yet
cp -rf 0/backup/* 0/

# ===============================
# MESHING
# ===============================
blockMesh -case /work/ehk112/CASEFOLDER/damBreak32 > log.blockMesh 2>&1

# ===============================
# SET FIELD
# ===============================
setFields -case /work/ehk112/CASEFOLDER/damBreak32 > log.setFields 2>&1

# ===============================
# DECOMPOSE DOMAIN
# ===============================
decomposePar -case /work/ehk112/CASEFOLDER/damBreak32 > log.decomposePar 2>&1

# ===============================
# RENUMBER MESH               
# ===============================
renumberMesh -overwrite -case /work/ehk112/CASEFOLDER/damBreak32 > log.renumberMesh 2>&1

# ===============================
# RUN APPLICATION
# ===============================
mpirun -hostfile $PBS_NODEFILE -np 32 interFoam -parallel -case /work/ehk112/CASEFOLDER/damBreak32 > log.interFoam 2>&1
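
A small optional check, not part of the original script: just before the mpirun line you can record the contents of $PBS_NODEFILE, so the logs show how many MPI slots each node contributed (with select=2:ncpus=16:mpiprocs=16 it should list two hostnames with 16 entries each):

Code:

# Optional sanity check: count the MPI slots PBS granted per node.
# Expect two hostnames, 16 entries each, for this 2x16 job.
sort $PBS_NODEFILE | uniq -c > log.nodefile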

The log file:

Code:

/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.x                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.2.x-0ee7dc546f1b
Exec   : interFoam -parallel -case /work/ehk112/CASEFOLDER/damBreak32
Date   : Feb 19 2014
Time   : 10:31:39
Host   : "cx1-9-17-2.cx1.hpc.ic.ac.uk"
PID    : 28546
Case   : /work/ehk112/CASEFOLDER/damBreak32
nProcs : 32
Slaves :
31
(
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28547"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28548"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28549"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28550"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28551"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28552"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28553"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28554"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28555"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28556"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28557"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28558"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28559"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28560"
"cx1-9-17-2.cx1.hpc.ic.ac.uk.28561"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30201"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30202"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30203"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30204"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30205"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30206"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30207"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30208"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30209"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30210"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30211"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30212"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30213"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30214"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30215"
"cx1-9-17-4.cx1.hpc.ic.ac.uk.30216"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0


PIMPLE: Operating solver in PISO mode

Reading field p_rgh

Reading field U

Reading/calculating face flux field phi

Reading transportProperties

Selecting incompressible transport model Newtonian
Selecting incompressible transport model Newtonian
Selecting turbulence model type laminar

Reading g
Calculating field g.h

No finite volume options present

time step continuity errors : sum local = 0, global = 0, cumulative = 0
DICPCG:  Solving for pcorr, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 0, global = 0, cumulative = 0
Courant Number mean: 0 max: 0

Starting time loop

Courant Number mean: 0 max: 0
Interface Courant Number mean: 0 max: 0
deltaT = 0.00119048
Time = 0.00119048

MULES: Solving for alpha1
Phase-1 volume fraction = 0.130194  Min(alpha1) = 0  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 0.130194  Min(alpha1) = 0  Max(alpha1) = 1
DICPCG:  Solving for p_rgh, Initial residual = 1, Final residual = 0.0039953, No Iterations 1
time step continuity errors : sum local = 0.00110536, global = 0, cumulative = 0
DICPCG:  Solving for p_rgh, Initial residual = 0.00192022, Final residual = 6.91678e-05, No Iterations 17
time step continuity errors : sum local = 3.98241e-05, global = -9.70428e-06, cumulative = -9.70428e-06
DICPCG:  Solving for p_rgh, Initial residual = 5.5586e-05, Final residual = 9.36665e-08, No Iterations 62
time step continuity errors : sum local = 6.72902e-08, global = 7.69607e-09, cumulative = -9.69659e-06
ExecutionTime = 0.18 s  ClockTime = 0 s

At last, it works!! :D
Thank you so much Bruno.

Kind regards,
katakgoreng

Saleh Abuhanieh February 15, 2019 03:05

Hi,

I faced a similar problem while running foam 4.0 on a cluster. I was using the OpenMPI that comes bundled with foam, and whatever number of processors I requested, foam was assigning all of them to the same node, even though SLURM was booking the required number of nodes!

The problem was solved when I switched to the system OpenMPI. If I recall correctly, the switch is the WM_MPLIB variable in the environment setup; a minimal sketch of the change (the install path is illustrative):
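
Code:

# Select the cluster's own MPI instead of the ThirdParty build bundled
# with foam. Set this in etc/prefs.sh (or edit etc/bashrc directly):
export WM_MPLIB=SYSTEMOPENMPI
# Re-source the environment so mpirun resolves to the system MPI:
source $HOME/foam/foam-4.0/etc/bashrc   # install path is illustrative
# After changing WM_MPLIB, the Pstream library may need to be recompiled.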

Hope this helps somebody.

Saleh

