Compiling UDF on a cluster
I am trying to compile a UDF on a cluster (multiple nodes). If I run Fluent interactively on the cluster, it compiles fine and I am able to link the UDF with the case file. But when I submit a job on the cluster, I get an error saying the UDF file cannot be found. My job file looks like this:
This job file compiles the C program but is not able to find and link the compiled UDF with the case file. Is there anything wrong in the way I am doing it?
I have never compiled a UDF during a job, so I'm guessing...
From which path do you launch your job? Do you change to the working directory in the script file? If not, the working directory defaults to your home directory, so Fluent probably compiles the UDF library in the home directory or somewhere else.
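A minimal sketch of what the job script could look like with that fix, assuming a PBS-style scheduler; the resource request, journal file name, and Fluent options are all hypothetical placeholders, not the original poster's actual settings:

```shell
#!/bin/bash
#PBS -l nodes=2:ppn=8      # hypothetical resource request
#PBS -N fluent_udf_job

# Change to the directory the job was submitted from, so Fluent
# finds the case file and builds libudf next to it instead of in $HOME.
cd "$PBS_O_WORKDIR"

# Run Fluent in batch mode; the journal file (run.jou, hypothetical)
# would load the case, build/load the UDF, and start the solve.
fluent 3ddp -g -t16 -i run.jou > fluent.log 2>&1
```

The key line is the `cd "$PBS_O_WORKDIR"` before launching Fluent; on SLURM the equivalent variable is `$SLURM_SUBMIT_DIR`.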
Try looking for the libudf directory generated during compilation, using
find $HOME -name "*libudf*"