#1
Member
Haoran Zhou
Join Date: Nov 2019
Posts: 52
Rep Power: 7
Hi all,

Recently I tried to run setFields in parallel, but a problem occurred. The commands I executed are as follows:

1. blockMesh
2. cp -r 0.org 0
3. decomposePar
4. mpirun -np 4 snappyHexMesh -parallel -overwrite
5. mpirun -np 4 setFields -parallel

During step 5, I got the following FOAM FATAL IO ERROR:

Cannot find patchField entry for caisson

The error points to the alpha.water files in every processor directory. However, I had already added the boundaryField entry 'caisson' to the alpha.water file in the 0 folder before executing decomposePar. Am I missing any steps before running setFields in parallel? Is there any method to deal with this problem?

Thanks in advance!

Best regards,
Stan
#2
New Member
Federico
Join Date: Jan 2021
Posts: 13
Rep Power: 6
Why do you need to run setFields in parallel? I'm not an expert, but setFields is generally a really quick application and requires little computational power, so I run it first and then decompose the case to run the simulation.
#3
Senior Member
Yann
Join Date: Apr 2012
Location: France
Posts: 1,277
Rep Power: 29
Stan, have you checked the content of the processor*/0/alpha.water files after running decomposePar? Do they still contain the boundary condition for the patch "caisson"?
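A quick way to check this from the case directory (a minimal sketch, assuming a bash shell and that the field of interest is alpha.water):

Code:

# count how many times the "caisson" patch appears in each decomposed field
grep -c caisson processor*/0/alpha.water

If the count is 0 for any processor, the patch entry was lost during decomposition, which is exactly the situation discussed in the following posts.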
#4
New Member
Federico
Join Date: Jan 2021
Posts: 13
Rep Power: 6
Yeah, I know, but I had many problems launching the simulation without reconstructing the mesh, because OpenFOAM didn't read the 0 directory in each processor. That's why I started to use one directory in which I mesh and one in which I run the simulation. In general, even if it depends on the number of cells, the reconstruction doesn't take that much time compared to snappyHexMesh, so I think it is fine to use reconstructParMesh.
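As a sketch of that serial-setFields approach, reusing the 4 processors and 0.org naming from the first post (interFoam is assumed as the solver here, adapt everything to your case):

Code:

cp -r 0.org 0
blockMesh
decomposePar
mpirun -np 4 snappyHexMesh -parallel -overwrite
reconstructParMesh -constant      # bring the snapped mesh back to constant/polyMesh
rm -r processor* 0
cp -r 0.org 0                     # restore the full set of boundary conditions
setFields                         # run in serial on the reconstructed mesh
decomposePar                      # decompose mesh and fields together
mpirun -np 4 interFoam -parallel

As discussed below, this works, but the reconstruction step can become expensive on large meshes.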
#5
Senior Member
Yann
Join Date: Apr 2012
Location: France
Posts: 1,277
Rep Power: 29
It might work for light meshes, but it quickly gets inefficient on larger meshes which might take hours to reconstruct.
Of course it depends on the size of the simulations you run, but in my opinion it would be better to solve the issue in the first place. I never reconstruct any of my simulation cases.

To come back to Stan's problem (and maybe yours), the issue comes from decomposePar, which decomposes the fields according to the patches in polyMesh. When you run blockMesh and then decomposePar, it will only keep the patches contained in the current mesh, and you will lose all the boundary conditions you have defined for the final mesh (those patches are not created yet, since snappyHexMesh has not been run yet).

To avoid this, I usually keep my boundary conditions in a 0.orig directory and copy the content of this directory into processor*/0 after running snappyHexMesh. (You might need to add a boundary condition for the "procBoundary.*" patches.)

Another solution might be to use decomposePar with the -copyZero option.

I hope this helps,
Yann
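A minimal sketch of that workflow, reusing Yann's 0.orig naming and the processor count from the first post (adapt both to your case):

Code:

blockMesh
decomposePar                      # decomposes the background mesh only
mpirun -np 4 snappyHexMesh -parallel -overwrite

# copy the full set of boundary conditions into each processor directory
for proc in processor*; do
    cp -r 0.orig/. "$proc/0/"
done

mpirun -np 4 setFields -parallel

The fields copied this way must then contain an entry for the "procBoundary.*" patches; see the entry given in post #7 below. (In recent OpenFOAM.com versions, the restore0Dir -processor helper sourced from bin/tools/RunFunctions performs the same copy in the stock tutorials.)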
#6
New Member
Federico
Join Date: Jan 2021
Posts: 13
Rep Power: 6
Not to be insistent, but even with 40 million cells on a cluster, reconstruction has never taken me more than about 10 minutes.
#7
Senior Member
Yann
Join Date: Apr 2012
Location: France
Posts: 1,277
Rep Power: 29
Add this in each field file in the 0 directory:

Code:

"procBoundary.*"
{
    type            processor;
}

Cheers,
Yann
#8
New Member
Srinivas
Join Date: Jan 2024
Posts: 18
Rep Power: 3
Hi Yann,

I am facing a similar issue while setting the fields after running snappyHexMesh in parallel. The run commands used are as follows:

Code:

cp -rv 0.orig/ 0
blockMesh
surfaceFeatureExtract
decomposePar > log.decomposePar1
mpirun -np 8 snappyHexMesh -overwrite -parallel > log.SHM
reconstructParMesh -constant > log.reconstructParMesh
rm -r 0
cp -rv 0.orig/ 0
# updated the new boundary patches generated by snappyHexMesh in all the fields of the 0 folder
rm -r processor*
setFields > log.setfields
mpirun -np 8 redistributePar -decompose -parallel > log.decomposePar2
mpirun -np 8 renumberMesh -constant -overwrite -parallel > log.renumbermesh
mpirun -np 8 interFoam -parallel > log.interFoam

These are the last few lines of the error:

Code:

[mb108smec070:59649] [11] interFoam(+0x6111e)[0x555bc1f2011e]
[mb108smec070:59649] *** End of error message ***
in ~/OpenFoam/OpenFOAM-v2206/platforms/linux64GccDPInt32Opt/bin/interFoam
[mb108smec070:59654] *** Process received signal ***
[mb108smec070:59654] Signal: Floating point exception (8)
[mb108smec070:59654] Signal code: (-6)
[mb108smec070:59654] Failing at address: 0x3e80000e906
[mb108smec070:59654] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x14420)[0x7f2dfc86e420]
[mb108smec070:59654] [ 1] /lib/x86_64-linux-gnu/libpthread.so.0(raise+0xcb)[0x7f2dfc86e2ab]
[mb108smec070:59654] [ 2] /lib/x86_64-linux-gnu/libpthread.so.0(+0x14420)[0x7f2dfc86e420]
[mb108smec070:59654] [ 3] /home/vit/OpenFoam/OpenFOAM-v2206/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so(_ZN4Foam3invERNS_5FieldINS_6TensorIdEEEERKNS_5UListIS2_EE+0x1ef)[0x7f2dfd5f006f]
[mb108smec070:59654] [ 4] interFoam(+0xea69a)[0x5618562f569a]
[mb108smec070:59654] [ 5] interFoam(_ZN4Foam3invINS_12fvPatchFieldENS_7volMeshEEEvRNS_14GeometricFieldINS_6TensorIdEET_T0_EERKS8_+0x6c)[0x5618562e6b87]
[mb108smec070:59654] [ 6] interFoam(_ZN4Foam3invINS_12fvPatchFieldENS_7volMeshEEENS_3tmpINS_14GeometricFieldINS_6TensorIdEET_T0_EEEERKSA_+0x13a)[0x5618562d3113]
[mb108smec070:59654] [ 7] interFoam(_ZN4Foam3fvc11reconstructIdEENS_3tmpINS_14GeometricFieldINS_12outerProductINS_6VectorIdEET_E4typeENS_12fvPatchFieldENS_7volMeshEEEEERKNS3_IS7_NS_13fvsPatchFieldENS_11surfaceMeshEEE+0x1f3)[0x5618562b2560]
[mb108smec070:59654] [ 8] interFoam(+0x8b1b0)[0x5618562961b0]
[mb108smec070:59654] [ 9] interFoam(+0x6c852)[0x561856277852]
[mb108smec070:59654] [10] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf3)[0x7f2dfc68a083]
[mb108smec070:59654] [11] interFoam(+0x6111e)[0x56185626c11e]
[mb108smec070:59654] *** End of error message ***
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 3 with PID 0 on node mb108smec070 exited on signal 8 (Floating point exception).

Please help me to rectify this issue.

Thanks and regards
#9
Senior Member
Yann
Join Date: Apr 2012
Location: France
Posts: 1,277
Rep Power: 29
Hello Srinivas,
The end part of the error message does not tell us much about what is going on. Why do you think it is an issue with setFields? Could you post your setFields and interFoam log files?

Regards,
Yann
#10
New Member
Srinivas
Join Date: Jan 2024
Posts: 18
Rep Power: 3
The setFields result in the reconstructed case is different from the decomposed case in ParaView, and in log.interFoam we can see the reference depth is 0.08 instead of the 0.4 that I gave in the setFields dictionary. I am sending the setFields images as well as the log files.
#11
Senior Member
Yann
Join Date: Apr 2012
Location: France
Posts: 1,277
Rep Power: 29
Looks like you messed up your fields with renumberMesh.
Do you have the renumberMesh log? You should renumber both the mesh and the fields. If only the mesh is renumbered, it would explain your screenshot.
#12
New Member
Srinivas
Join Date: Jan 2024
Posts: 18
Rep Power: 3
Hi Yann,
Thanks for your reply. I am working on wave interaction with a submerged trapezoid in a numerical wave tank. I created the background mesh with blockMesh and the trapezoid with snappyHexMesh, then followed the procedure mentioned earlier, which I took from a case file available on the internet. I have no idea where I am making a mistake. I am sending all the log files; please go through them and let me know the procedure for running the simulation successfully.
#13
Senior Member
Yann
Join Date: Apr 2012
Location: France
Posts: 1,277
Rep Power: 29
Hello Srinivas,
Try running the same workflow, but remove the -constant flag with renumberMesh and just run: Code:
mpirun -np 8 renumberMesh -overwrite -parallel > log.renumbermesh
#14
New Member
Srinivas
Join Date: Jan 2024
Posts: 18
Rep Power: 3
Hi Yann,
Thanks for the reply. I will try that and let you know the result.
#15
New Member
Srinivas
Join Date: Jan 2024
Posts: 18
Rep Power: 3
Hi Yann,
I tried what you said. This time the setFields results in the reconstructed case and the decomposed case are the same, but the interFoam solver is not running. I checked the mesh quality with mpirun -np 8 checkMesh -parallel -allGeometry -allTopology -writeAllFields -writeSets vtk and it reports one failed mesh check, with some concave cells. If we don't fix the concave cells first, can't we run the interFoam solver? I am attaching all the log files and the errors from checkMesh and the interFoam solver; please go through them.

Thanks and regards,
Srinivas
#16
Senior Member
Yann
Join Date: Apr 2012
Location: France
Posts: 1,277
Rep Power: 29
Hello Srinivas,
I'm not familiar with the model you are using (waveModel), so I am not sure what can cause this error. It could be a lot of different things (mesh, boundary conditions, ...).

By the way, I noticed this in your checkMesh log:

Code:

***Total number of faces on empty patches is not divisible by the number of cells in the mesh. Hence this mesh is not 1D or 2D.

Are you trying to run a 2D simulation? If so, your mesh is not a proper 2D mesh; you could regenerate a one-cell-thick mesh between the empty patches with extrudeMesh.
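If extrudeMesh is the route taken, a minimal system/extrudeMeshDict could look like the sketch below (FoamFile header omitted; the front/back patch names and the thickness are placeholders, not values from this case):

Code:

// extrude a one-cell-thick 2D mesh from an existing boundary patch
constructFrom    patch;
sourceCase       ".";
sourcePatches    (front);        // hypothetical source patch
exposedPatchName back;           // hypothetical name for the newly exposed patch

flipNormals      false;

extrudeModel     linearNormal;
nLayers          1;
expansionRatio   1.0;

linearNormalCoeffs
{
    thickness    0.05;           // placeholder thickness
}

mergeFaces       false;
mergeTol         0;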
#17
New Member
Srinivas
Join Date: Jan 2024
Posts: 18
Rep Power: 3
Hi Yann,
Sorry for not mentioning it: I am doing a 2D simulation. I will try the extrudeMesh option and let you know. Thank you for the suggestion.
#18
New Member
Srinivas
Join Date: Jan 2024
Posts: 18
Rep Power: 3
Hi Yann,
Thank you so much for the suggestion. I was able to run the 2D simulation successfully after doing extrudeMesh.