Hi Bruno,
Thanks for your reply. I ran snappyHexMesh from OpenFOAM-2.1.0 on Tianhe-1A, a supercomputer, so memory shouldn't be the problem. OMG! I will re-download OpenFOAM-2.1.0 and try again; hopefully it will work. Thank you again! Best wishes, Xiaow-g |
Hi, as I'm experiencing the same error (http://www.cfd-online.com/Forums/ope...ssorx-0-a.html), is there any way to set up a case from scratch such that this error is avoided, without the need to run changeDict?
Thank you! |
strange behaviour DecomposePar
Hello OF experts and users,
please assist me with the following problem: during a run of icoFoam I included my utility: Code:
scalar x, y, z;
....
The output contains lines such as: Code:
ii=315457 y=-0.996054 u=(-0.00620626 6.79366e-05 0.0108981)
ii=103398 y=-0.996054 u=(-0.00619963 6.84788e-05 0.0108932)
and my decomposeParDict contains the following: Code:
numberOfSubdomains 6; |
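For context, a more complete decomposeParDict for a 6-way decomposition might look like the following sketch. The method and coefficients shown are illustrative assumptions, not Lev's actual settings:

```
// system/decomposeParDict -- sketch; values are illustrative assumptions
numberOfSubdomains 6;

method          scotch;     // graph-based; needs no geometric input

// Alternative: the simple geometric method
// method          simple;
// simpleCoeffs
// {
//     n       (3 2 1);     // subdivisions in x, y, z;
//                          // the product must equal numberOfSubdomains
//     delta   0.001;
// }
```

With `scotch` the partitioner balances cell counts automatically; with `simple` the user is responsible for choosing `n` so that the product matches `numberOfSubdomains`.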
Greetings Lev,
It's quite simple:
Bruno |
Hello, Bruno
unfortunately the command "PInfo" gives a compilation error: "PInfo was not declared in this scope". But the problem is not only the screen output; I also do not know how to make the code analyze the whole mesh when it is spread across many processors (as it does in a serial run). And I need the results to be collected and written to the log file, like this command does in a serial run: Code:
OFstream k(runTime.path()/"k.dat");
Regards, Lev |
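One common pattern for Lev's goal, collecting a value from all processors and writing it once, is to reduce onto the master process. A minimal sketch, assuming a standard OpenFOAM solver context (with `runTime` in scope) and a scalar quantity accumulated per processor; this is not a standalone program:

```
// Sketch only (requires an OpenFOAM solver context; not compiled standalone).
// Accumulate a per-processor value, sum it across all processors, and
// write the result from the master processor only.

scalar localSum = 0.0;
// ... add contributions from the cells owned by this processor ...

reduce(localSum, sumOp<scalar>());   // global sum over all processors

if (Pstream::master())
{
    // Only the master opens and writes the file, so the parallel run
    // produces one output file instead of one per processor.
    OFstream k(runTime.path()/"k.dat");
    k << "globalSum = " << localSum << endl;
}
```

`reduce` with `sumOp<scalar>()` leaves every processor holding the global sum, and the `Pstream::master()` guard is the usual way to avoid each rank writing its own copy of the file.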
Hi Lev,
Sorry about that. Apparently PInfo doesn't exist, although I vaguely remember reading about it... weird :confused: OK, from your description of what you're trying to do, it looks to me like the probe or sampling utilities and/or function objects would be more suitable for what you want. And then there is also "swak4Foam": http://openfoamwiki.net/index.php/Contrib/swak4Foam Other than this... try searching this forum for more answers... I know they're in here somewhere... Best regards, Bruno |
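As a concrete example of the probing route Bruno suggests, a function object can be added to system/controlDict with no custom code. A sketch using OpenFOAM 2.1-era keywords; the probe location and field names are placeholders, not values from this thread:

```
// system/controlDict (fragment) -- sketch with placeholder values
functions
{
    probes
    {
        type               probes;
        functionObjectLibs ("libsampling.so");
        outputControl      timeStep;
        outputInterval     1;
        probeLocations
        (
            (0.1 0.0 0.01)   // placeholder coordinates
        );
        fields             (U p);
    }
}
```

In a parallel run the function object handles the processor decomposition itself and writes a single set of files under postProcessing/ (probes/ in older versions), which is exactly the "collect and log" behaviour asked about above.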
Greetings!
I saw the solution just now and remembered about this thread.
Best regards, Bruno |
Hi all,
I'm following up on the issue that decomposePar does not preserve some boundaries, so that the solver later complains about missing boundary definitions. Furthermore, potentialFoam complains about the processor boundaries. My question: apart from the workarounds that have been constructed in the meantime, is there an out-of-the-box solution for OpenFOAM 2.1.1, which I am using as packaged on Ubuntu? My script looks like this (without reconstructing): Code:
#!/bin/sh
Thanks very much!! Marco |
Greetings Marco,
Well... there are 2... 2.5 tricks that I know of:
Best regards, Bruno |
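One decomposition-time option often suggested for this class of boundary trouble is `preservePatches` in decomposeParDict, which asks the decomposer to keep the faces of the listed patches together (commonly needed for cyclic patches). A sketch; the patch names are placeholders, and whether it applies here depends on the boundary types in Marco's case:

```
// system/decomposeParDict (fragment) -- patch names are placeholders
numberOfSubdomains 6;
method          scotch;

// Keep the listed patches intact across the decomposition,
// e.g. the two halves of a cyclic pair:
preservePatches (cyclicLeft cyclicRight);
```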
Hey Bruno,
great help, thanks! I took version 2, which works fine. Thanks marco |
Hi dear FOAM users,
I have 3 machines with 12 cores each. I used the scotch method for 12 processors; it is OK on 1 machine. But when I use 2 or 3 machines for more processors, it doesn't work, without giving any error or warning. :confused: |
Hi dear Bruno,
Thank you for your reply. Actually, I don't know how OpenFOAM was installed; the system with OpenFOAM was given to me. I use OpenFOAM-2.3.1. I reproduced the problem with the motorbike tutorial and got the same problem with the scotch method. When I use the simple method (for decomposition) on the cavity case, it works with 36 processors, but it didn't work with the scotch method. I tried the simple method for my 3D asymmetric diffuser, but like the scotch method it didn't work. My command to run is: mpirun -np 36 $HOME/OpenFOAM/OpenFOAM-2.3.1/foamExec pimpleFoam -parallel Thanks, Marzieh |
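A side note on the command above: when the 36 ranks are meant to span 3 machines, Open MPI has to be told where to place them, typically via a host file passed with `-hostfile`. A sketch; the host names and slot counts are assumptions based on "3 machines with 12 cores each", not details from this thread:

```
# machines -- hypothetical Open MPI host file, one line per machine
node1 slots=12
node2 slots=12
node3 slots=12
```

The run command would then be along the lines of `mpirun -np 36 -hostfile machines $HOME/OpenFOAM/OpenFOAM-2.3.1/foamExec pimpleFoam -parallel`. Passwordless SSH to each host and identical OpenFOAM and MPI installations at the same paths on all machines are also required; without a host file, all 36 ranks are oversubscribed onto the local machine.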
Quick answer: Sorry Marzieh, I've been having long work weeks, along with other responsibilities, and I haven't managed on weekends to answer as many questions here on the forum as I wanted :(.
Let me refer you to this blog post of mine, which I forgot to mention the other day: Notes about running OpenFOAM in parallel. In addition, I forgot to ask the following questions:
|
You're welcome, dear Bruno. Thank you for taking the time to reply to me.
1. mpirun version 1.8.2
2. Linux ubuntu00 3.13.0-35-generic #62~precise1-Ubuntu SMP Mon Aug 18 14:52:04 UTC 2014 x86_64 x86_64 GNU/Linux |
Quick answer: I've searched through my memories and tried to deduce the problem... and something doesn't add up.
This could be a problem with Open-MPI 1.8.2, or it could be due to incompatible MPI versions on the 3 machines... but it's hard to prove that the issue is only reproducible when you decompose the mesh with scotch and run with pimpleFoam. I need the following details:
|
Hi,
I am using OpenFOAM 3.0.1, and I am trying to run my case with the following steps: $ decomposePar runs OK. $ mpirun –np 6 renumberMesh –overwrite -parallel gives me the following error: Code:
--------------------------------------------------------------------------
I was using "mpirun -np" on the same machine with pimpleFoam with no problems, but when I tried to use it with renumberMesh I got this error. This is the first time I have used renumberMesh. Bashar |
-n
Try -n instead of -np:
mpirun -n 6 renumberMesh -overwrite -parallel |
I just used: Code:
mpirun -n 6 renumberMesh -overwrite -parallel
Regards, Bashar |