2 Attachment(s)
Quote:
Thank you so much! I've cleaned up the file. I've been thinking about my problem in a physics kind of way, so please bear with me as I try to explain it.

At the most basic level, I would like one field to be solved throughout the entire domain, while the other field has a boundary condition in the exact middle of the domain. Essentially, I would like to create two middle boundary patches in my blockMeshDict file, then "merge" those two middle patches for one of my fields. I suspect patchToPatchInterpolation cannot do this. As I understand it, patchToPatch takes the values on one patch and sends them to another, but in my case, if the values on that patch are zero or zeroGradient, all it does is send zero or zeroGradient to the other patch, and I'm back at square one.

Since I'd like to merge them, what I would do instead is take the internal cells adjacent to my patch, solve for those cells, send their values to the other patch, solve again, and then continue solving for the rest of the cells. Perhaps in that way the two sides would "merge" as if there were no boundary there at all.

Things I have tried so far:
- groovyBC: very close, but the physics came out wrong.
- mappedPatch: only maps values from one patch to the other at the initial condition, not during the time steps.
- cyclic boundaries: these work great, but then I cannot set boundary conditions on the patches I've made cyclic to each other.

In essence, I may be going about this incorrectly with patchToPatchInterpolation. Let me know what you think and whether my reasoning is sound.

Sincerely,
Benjamin

P.S. Perhaps I need multiple meshes: two local meshes and a global mesh. The global mesh would cover the entire domain, while each local mesh would occupy half of it. I would assign a field to each mesh, and then I would need to map the meshes onto each other. The mapping is the part I would have trouble with.
Perhaps this is my solution though? |
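If the goal were simply to fuse the two middle patches back into internal faces, blockMesh has a built-in mechanism for that: mergePatchPairs. A minimal sketch of the relevant blockMeshDict entry (the patch names middleLeft and middleRight are hypothetical, assuming each half-block exposes its mid-plane face as one of these patches):

Code:
mergePatchPairs
(
    (middleLeft middleRight)
);

Note, though, that this removes the internal boundary for every field, so it only covers the "one field sees no boundary" half of what you describe; the field that still needs a condition at the mid-plane would require a different mechanism (e.g. a baffle or a multi-region setup).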
Benjamin,
I am not sure that patchToPatchInterpolation is what you want. Have you looked at solvers like chtMultiRegionFoam, or searched for domain coupling in OpenFOAM (e.g. the methods in this presentation)?

I had a look at your code. It is not entirely clear what you are trying to do, but you are not actually changing the boundary conditions: you make a copy of the H2O boundary values with "scalarField shawmut = H2O.boundaryField()[topID];" and then you create and set a new field called "interpole". Both of these fields simply get deleted at the end of the scope. You haven't updated the boundary conditions themselves; how to update them depends on the boundary type, i.e. fixedValue, fixedGradient, etc.

Philip |
Dear Foamers, I did not want to hijack this thread, so I started a new one. But I think you might be the right people to ask, so may I point you to my new thread? :-) It would be great if you could give me a hint!
http://www.cfd-online.com/Forums/ope...tml#post420165 Best regards, Florian |
1 Attachment(s)
Quote:
I'm trying to include conjugate heat transfer in interDyMFoam. Running the solver on a single processor works quite well, but as soon as I decompose the mesh, the results become unphysical (as you can see in the attached picture). Currently, the interpolation is set up starting from Code:
const polyPatch& ownPatch = patch().patch(); Do you have any idea how to solve the problem? |
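One thing worth checking for the parallel case: if the two coupled patches end up split across processors, a plain patch-to-patch interpolation only sees each processor's local piece of the patch, with no parallel communication to reassemble the whole thing. A decomposeParDict sketch (patch names fluidSide and solidSide are hypothetical) that keeps the coupled patches intact on one processor, using the preservePatches constraint available in the 2.x series:

Code:
numberOfSubdomains 4;
method          scotch;

// keep all faces of the coupled patch pair on a single processor
preservePatches (fluidSide solidSide);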
I've managed to compile the coupling part of the solver on OF-1.6-ext. It seems there is a difference in "patchToPatchInterpolation" between OF-1.6-ext and OF-2.2.0.
After adding the code below to patchToPatchInterpolation.H (OF-2.2.0), I was able to compile the solver without error messages. Code:
typedef PatchToPatchInterpolation |
1 Attachment(s)
I guess the coupling isn't the origin of the unphysical behaviour; the decomposition of the domain is. I had assumed decomposePar divides the mesh into four equal domains, but as you can see in the attached picture, that isn't the case.
So I think I have to decompose the mesh in a proper way. |
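To control how decomposePar divides the mesh, the simple (geometric) method lets you specify the split explicitly rather than leaving it to a graph partitioner like scotch or metis. A sketch for four equal domains, assuming a roughly box-shaped mesh:

Code:
numberOfSubdomains 4;
method          simple;

simpleCoeffs
{
    n       (2 2 1);   // 2 x 2 x 1 cuts in the x, y, z directions
    delta   0.001;     // cell skew factor
}

With n (2 2 1) the domain is cut in half in x and in half in y, which should give four geometrically equal subdomains and make it easy to see where the coupled patches land.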