January 11, 2021, 07:53
Error when running setFields in parallel
#1
Member
Haoran Zhou
Join Date: Nov 2019
Posts: 49
Rep Power: 6
Hi all,
Recently I tried to run setFields in parallel but a problem occurred. The commands I executed are as follows: 1. blockMesh 2. cp -r 0.org 0 3. decomposePar 4. mpirun -np 4 snappyHexMesh -parallel -overwrite 5. mpirun -np 4 setFields -parallel During step 5, I got the FOAM FATAL IO ERROR: Cannot find patchField entry for caisson The error directs to the alpha.water files in every processor. However, I've already added the boundaryField 'caisson' in the alpha.water file of 0 folder before executing decomposePar. Do I miss any steps before running setFields in parallel? Is there any methods to deal with this problem? Thanks in advance! Best regards, Stan |
January 25, 2021, 05:40
#2
New Member
Federico
Join Date: Jan 2021
Posts: 13
Rep Power: 5
Why do you need to run setFields in parallel? I'm no expert, but setFields is generally a very quick application and requires little computational power, so I run it first and then decompose the case before running the simulation.
January 25, 2021, 10:10
#3
Senior Member
Yann
Join Date: Apr 2012
Location: France
Posts: 1,065
Rep Power: 26
Stan, have you checked the content of the processor*/0/alpha.water file after running decomposePar? Does it still contain the boundary condition for the patch "caisson"?
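One quick way to run this check from the command line; the mock files below only stand in for a real decomposed case, and the zeroGradient entry is a placeholder, not Stan's actual boundary condition:

```shell
# Hypothetical check: after decomposePar, does each processor*/0/alpha.water
# still contain the "caisson" patchField entry? Mock files simulate a case
# where processor1 lost the entry.
mkdir -p chk/processor0/0 chk/processor1/0
printf 'boundaryField\n{\n    caisson { type zeroGradient; }\n}\n' > chk/processor0/0/alpha.water
printf 'boundaryField\n{\n}\n' > chk/processor1/0/alpha.water

# grep -L lists the files that do NOT contain the pattern:
grep -L caisson chk/processor*/0/alpha.water
```

Any file listed by `grep -L` is a decomposed field file missing the patch entry, which is exactly what triggers the "Cannot find patchField entry" error.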
January 25, 2021, 12:11
#4
New Member
Federico
Join Date: Jan 2021
Posts: 13
Rep Power: 5
Yeah, I know, but I had many problems launching the simulation without reconstructing the mesh, because OpenFOAM didn't read the 0 directory in each processor folder. That's why I started using one directory for meshing and another for running the simulation. In general, even though it depends on the number of cells, the reconstruction doesn't take much time compared to snappyHexMesh, so I think it is fine to use reconstructParMesh.
January 26, 2021, 03:44
#5
Senior Member
Yann
Join Date: Apr 2012
Location: France
Posts: 1,065
Rep Power: 26
It might work for light meshes, but it quickly gets inefficient on larger meshes which might take hours to reconstruct.
Of course it depends on the size of the simulations you run, but in my opinion it would be better to solve the issue in the first place. I never reconstruct any of my simulation cases.

To come back to Stan's problem (and maybe yours), the issue comes from decomposePar, which decomposes the fields according to the patches in polyMesh. When you run blockMesh and then decomposePar, it will only keep the patches contained in the current mesh, and you will lose all the boundary conditions you have defined for the final mesh (those patches do not exist yet, since snappyHexMesh has not been run).

To avoid this, I usually keep my boundary conditions in a 0.orig directory and copy the content of this directory into processor*/0 after running snappyHexMesh. (You might need to add a boundary condition for the "procBoundary.*" patches.)

Another solution might be to use decomposePar with the -copyZero option.

I hope this helps,
Yann
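A minimal sketch of the workflow described above, assuming a standard case layout with a 0.orig directory and a decomposeParDict set up for 4 subdomains; the OpenFOAM commands are shown as comments, and the copy step is demonstrated on mock directories that stand in for a real decomposed case:

```shell
# Workflow sketch (the commented steps would run in a real OpenFOAM case):
#   blockMesh
#   decomposePar
#   mpirun -np 4 snappyHexMesh -parallel -overwrite
#   ... copy 0.orig into processor*/0 (loop below) ...
#   mpirun -np 4 setFields -parallel
#
# The copy step: once snappyHexMesh has created the final patches
# (e.g. "caisson"), every processor directory gets the complete set
# of boundary conditions from 0.orig.
mkdir -p demo/0.orig demo/processor0/0 demo/processor1/0
echo 'alpha.water with the caisson patchField' > demo/0.orig/alpha.water

( cd demo && for proc in processor*; do cp -r 0.orig/. "$proc/0/"; done )
```

The key point is the ordering: the boundary conditions are copied only after snappyHexMesh has run, so the patch list in each processor's 0 directory matches the final mesh.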
January 26, 2021, 06:07
#6
New Member
Federico
Join Date: Jan 2021
Posts: 13
Rep Power: 5
Not to be insistent, but even with 40 million cells on a cluster, reconstruction has never taken me more than about 10 minutes.
January 26, 2021, 07:52
#7
Senior Member
Yann
Join Date: Apr 2012
Location: France
Posts: 1,065
Rep Power: 26
Add this in each variable file in the 0 directory:
Code:
"procBoundary.*"
{
    type            processor;
}
Cheers,
Yann
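For context, a hypothetical sketch of where such an entry sits in a 0/alpha.water file; the "caisson" patch name comes from this thread, but the BC types are placeholders, not Stan's actual setup:

```cpp
// 0/alpha.water (fragment), hypothetical example
boundaryField
{
    caisson
    {
        type            zeroGradient;   // placeholder BC type
    }

    // matches the processor patches created by decomposePar
    "procBoundary.*"
    {
        type            processor;
    }
}
```

The quoted regular expression matches every processorN-to-processorM patch, so one entry covers all the inter-processor boundaries regardless of the decomposition.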