CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   Allclean deleting polyMesh directory and not working runParallel after update (https://www.cfd-online.com/Forums/openfoam-solving/183804-allclean-deleting-polymesh-directory-not-working-runparallel-after-update.html)

HagenC February 14, 2017 06:01

Allclean deleting polyMesh directory and runParallel not working after update
 
Hi everyone,
I used to use OpenFOAM version 2.3.1 and now want to move to the newer version v1612+. I have both installed in VirtualBox machines with freshly set up Ubuntu (Ubuntu 14.04 for OF 2.3.1, Ubuntu 16.04 for OF v1612+).
Then I copied one of my cases, which ran perfectly in OF 2.3.1, over to OF v1612+.
I can run the case in the new version as well, but I am struggling with two problems:
  1. When I run Allclean, my constant/polyMesh folder is deleted as well. I need it as the base mesh for snappyHexMesh, and it was not deleted in the old version.
  2. It seems that runParallel no longer works for me.
    Code:

    runParallel pisoFoam 4
    gives the error:
    Code:

    Usage: pisoFoam [OPTIONS]
    options:
      -case <dir>      specify alternate case directory, default is the cwd
      -decomposeParDict <file>
                        read decomposePar dictionary from specified location
      -noFunctionObjects
                        do not execute functionObjects
      -parallel        run in parallel
      -postProcess      Execute functionObjects only
      -roots <(dir1 .. dirN)>
                        slave root directories for distributed running
      -srcDoc          display source code in browser
      -doc              display application documentation in browser
      -help            print the usage

    Using: OpenFOAM-v1612+ (see www.OpenFOAM.com)
    Build: v1612+

    [0]
    [0]
    [0] --> FOAM FATAL ERROR:
    [0] Wrong number of arguments, expected 0 found 1
    [0]
    [0]
    FOAM parallel run exiting
    [0]
    --------------------------------------------------------------------------
    MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
    with errorcode 1.

    NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
    You may or may not see output from other processes, depending on
    exactly when Open MPI kills them.
    --------------------------------------------------------------------------

I would be very happy if someone has ideas on this. By the way, running
Code:

mpirun -np 4 pisoFoam -parallel
works fine.
Thanks in advance!
Hagen

HagenC February 15, 2017 03:34

Solved polyMesh Issue
 
The problem was that I had my blockMeshDict file in the constant/polyMesh folder. That was no problem for running the case, since OpenFOAM found it there, but ./Allclean deleted it together with the polyMesh folder. Moving the blockMeshDict file to the system folder solved the first problem.
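
For reference, assuming Allclean sources the stock CleanFunctions and calls cleanCase, that would explain the behaviour: in v1612+ cleanCase removes the mesh under constant/polyMesh along with the usual time and processor directories, so system/ is the safe place for blockMeshDict. A minimal sketch of such an Allclean (not necessarily the exact script used here):
Code:

#!/bin/sh
cd ${0%/*} || exit 1                          # run from this directory
. $WM_PROJECT_DIR/bin/tools/CleanFunctions    # tutorial clean functions
cleanCase                                     # removes time directories, processor* and constant/polyMesh
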
The second problem is still open.
Best,
Hagen

HagenC February 15, 2017 05:20

Solved runParallel
 
To solve the second problem, just omit the number:
Code:

runParallel pisoFoam -parallel
The number of processors can be specified by
Code:

runParallel -np 4 pisoFoam -parallel
but without specifying it, the number of processors is taken automatically from decomposeParDict.
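
For reference, the count that runParallel picks up is the numberOfSubdomains entry in system/decomposeParDict; a minimal sketch of the relevant entries (the values are just examples):
Code:

// system/decomposeParDict (relevant entries only, example values)
numberOfSubdomains  4;
method              scotch;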

olesen February 16, 2017 05:56

Quote:

Originally Posted by HagenC (Post 637205)
To solve the second problem, just omit the number:
Code:

runParallel pisoFoam -parallel
The number of processors can be specified by
Code:

runParallel -np 4 pisoFoam -parallel
but without specifying it, the number of processors is taken automatically from decomposeParDict.

You could try these instead (for 1612); note that your -parallel option shouldn't be there.
Code:

runParallel pisoFoam
runParallel -np 4 pisoFoam
runParallel -decomposeParDict system/someDecompDict pisoFoam
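
As far as I can tell, the reason the old call broke is that the v1612+ runParallel parses its own options and the application name and then passes any remaining arguments straight on to the solver, so the trailing 4 in "runParallel pisoFoam 4" reached pisoFoam and triggered the "Wrong number of arguments" error. For completeness, a minimal Allrun-style sketch of where these calls fit, assuming the stock RunFunctions and the blockMesh/snappyHexMesh workflow mentioned earlier (adapt the utilities and options to your case):
Code:

#!/bin/sh
cd ${0%/*} || exit 1                          # run from this directory
. $WM_PROJECT_DIR/bin/tools/RunFunctions      # tutorial run functions
runApplication blockMesh                      # background mesh for snappyHexMesh
runApplication snappyHexMesh -overwrite
runApplication decomposePar
runParallel pisoFoam                          # processor count read from system/decomposeParDict
runApplication reconstructPar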


HagenC February 17, 2017 02:43

Thank you Olesen,
you are right, the -parallel option is redundant with runParallel.
Best,
Hagen

