CFD Online Discussion Forums


hwvfn June 13, 2020 23:42

ANSYS Fluent UDFs in Batch mode
Recently I have been running numerical simulations on our university's HPC cluster, and I am having trouble using ANSYS Fluent in batch mode across multiple nodes. Below is some information I found on a website; the batch file is submitted with Slurm.


Running on Multiple Nodes

Another thing to note is that due to the nature of UDF function calls, all nodes need access to the compiled UDF libraries. By default, ANSYS Fluent jobs are set to run in the ~/work directory which is local to the head node. Depending on the behavior of the function, the slave node processes may need to access a common file or library. In this case, the job should be run out of the NFS mounted directory, ~/work/shared. So simply prepend move and change directory commands on the "Software Settings" page before you launch Fluent (not in the journal file):

mv * shared

cd shared

fluent 3ddp -gu -ssh -cnf=$FLUENT_HOSTS -t$RESCALE_CORES_PER_SLOT -i example.jou

So again, if you are running on multiple cores on a single node, this step is unnecessary since the local file system can be accessed by all the cores on that node.
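The staging step described above is ordinary shell file manipulation; here is a minimal sketch of the mechanics (the directory and file names are illustrative stand-ins, not taken from any real cluster setup):

```shell
# Stage the job files into an NFS-mounted directory so that all nodes,
# not just the head node, can reach the journal file and the UDF library.
mkdir -p shared                   # stand-in for the NFS-mounted ~/work/shared
touch example.jou libudf.c        # stand-ins for the journal file and UDF source
mv example.jou libudf.c shared/   # "mv * shared" would move everything instead
(cd shared && ls)                 # Fluent should then be launched from inside shared/
```

Note that `mv` needs both a source (here the file names, or a glob like `*`) and a destination directory; with only one argument it fails, which is a common slip when adapting these commands.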


The tutorial may help solve my problem, but I am not familiar with these commands. I have tried “mv * share” and “cd share”, but they didn't work. Could you please point me to example code or a tutorial that solves this problem?

Here is my batch file:



#!/bin/bash
#SBATCH --job-name=Permafrost.sbatch
#SBATCH --nodes=2
#SBATCH --ntasks=8
#SBATCH --time=50:00:00
#SBATCH --mail-type=BEGIN
#SBATCH --mail-type=END
#SBATCH --mail-type=FAIL
#SBATCH --output=Forge-%j.out
#SBATCH --error=Forge-%j.err

# generate a node file
export PBS_NODEFILE=$(generate_pbs_nodefile)

mv * shared
cd shared

# run Fluent in parallel
module load ansys
fluent 2ddp -g \
    -pinfiniband \
    -ssh < /home/hwvfn/Permafrost_command.txt
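For multi-node runs like the script above, a hedged fragment that could be placed in the batch script before the fluent line, to confirm every allocated node can actually see the staged files (the `shared` path is a stand-in; this assumes a standard Slurm environment where `srun` and `SLURM_SUBMIT_DIR` are available):

```shell
# Fragment for the batch script: run one task per allocated node that
# lists the shared directory, so a missing NFS mount shows up in the
# job log before Fluent tries to load the compiled UDF library.
srun --ntasks-per-node=1 ls -ld "$SLURM_SUBMIT_DIR/shared"
```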

