Journal file for submitting a batch job
Hi,
I am trying to write a journal file to submit a Fluent job on the HPC cluster, but the job fails with errors after submission. I have written two files: the submission script "ANSYS.sub" and the journal file "fluent.in".

The "ANSYS.sub" file is:

#!/bin/bash -l
#PBS -N FLUENT3D
#PBS -l walltime=1:00:00
#PBS -l mem=8GB
#PBS -l ncpus=8
#PBS -j oe
#PBS -l ANSYS=1
#PBS -l ANSYS_HPC=8

module load ansys/19.0
cd ~/SIMULATION/PAPER3/DepthHeight/Nima
fluent 3ddp -g -t 8 -i fluent.in > fluent.out

The "fluent.in" file is:

/file/read-case-data Scen1.cas
/solve/iterate 1000
/file/write-data Scen1.dat
/exit yes

The simulation stops, and fluent.out shows the following error message:

===============Message from the Cortex Process=========
Fatal error in one of the compute processes.
=================================================

The job file also shows the following error:

pkg/suse12/software/ansys/19.0/v190/fluent/fluent19.0.0/bin/fluent: line 2641: 9992 Segmentation fault (core dumped) $NO_RUN $CORTEX_PRE $EXE -f $FLUENT_PROD $CX_FLAGS "$CX_FUNCTION"

It would be much appreciated if anyone could help me with this issue.

Cheers,
Nima
You just need to add the initialisation step to your journal file:

/file/read-case-data Scen1.cas
/solve/initialize/initialize-flow
/solve/iterate 1000
/file/write-data Scen1.dat
/exit yes
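As a minimal sketch (the file names Scen1.cas/Scen1.dat and the fluent launch line are taken from the thread; everything else is illustrative): you can generate the journal file from inside the submission script with a here-document, so the initialisation step can never be accidentally left out of fluent.in:

```shell
#!/bin/bash
# Write the corrected Fluent journal file, including the
# initialization step, then (on the cluster) launch the solver.
cat > fluent.in <<'EOF'
/file/read-case-data Scen1.cas
/solve/initialize/initialize-flow
/solve/iterate 1000
/file/write-data Scen1.dat
/exit yes
EOF

# On the cluster you would then run (commented out here):
# fluent 3ddp -g -t 8 -i fluent.in > fluent.out
```

The quoted 'EOF' delimiter keeps the shell from expanding anything inside the journal text, so the TUI commands are written verbatim.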