
Allclean deleting polyMesh directory and runParallel not working after update

#1 - February 14, 2017, 06:01
HagenC (New Member)

Hi everyone,
I used to work with OpenFOAM 2.3.1 and now want to switch to the newer version v1612+. Both are installed in VirtualBox machines on freshly set up Ubuntu (Ubuntu 14.04 for OF 2.3.1, Ubuntu 16.04 for OF v1612+).
I then copied one of my cases, which ran perfectly in OF 2.3.1, over to OF v1612+.
The case runs in the new version as well, but I am struggling with two problems:
  1. When I run Allclean, my constant/polyMesh folder is deleted too. I need it as the base mesh for snappyHexMesh, and it was not deleted in the old version.
  2. runParallel no longer seems to work for me.
    Code:
    runParallel pisoFoam 4
    gives the error:
    Code:
    Usage: pisoFoam [OPTIONS]
    options:
      -case <dir>       specify alternate case directory, default is the cwd
      -decomposeParDict <file>
                        read decomposePar dictionary from specified location
      -noFunctionObjects
                        do not execute functionObjects
      -parallel         run in parallel
      -postProcess      Execute functionObjects only
      -roots <(dir1 .. dirN)>
                        slave root directories for distributed running
      -srcDoc           display source code in browser
      -doc              display application documentation in browser
      -help             print the usage
    
    Using: OpenFOAM-v1612+ (see www.OpenFOAM.com)
    Build: v1612+
    
    [0] 
    [0] 
    [0] --> FOAM FATAL ERROR: 
    [0] Wrong number of arguments, expected 0 found 1
    [0] 
    [0] 
    FOAM parallel run exiting
    [0] 
    --------------------------------------------------------------------------
    MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
    with errorcode 1.
    
    NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
    You may or may not see output from other processes, depending on
    exactly when Open MPI kills them.
    --------------------------------------------------------------------------
I would be very happy if someone has ideas on this. By the way, running
Code:
mpirun -np 4 pisoFoam -parallel
works fine.
Thanks in advance!
Hagen

#2 - February 15, 2017, 03:34
HagenC (New Member)
Solved polyMesh Issue

The problem was that I had my blockMeshDict file in the constant/polyMesh folder. That was no problem for running the case, since OpenFOAM still found it there, but ./Allclean deleted it together with the rest of the polyMesh folder. Moving the blockMeshDict file to the system folder solved the first problem.
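For reference, a minimal sketch of that fix on the command line (run from the case directory; newer versions such as v1612+ look for blockMeshDict in system/ first):
Code:
# move the dictionary out of the mesh folder so Allclean no longer wipes it
mv constant/polyMesh/blockMeshDict system/blockMeshDict
# regenerate the background mesh; it is still written to constant/polyMesh
blockMesh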
The second problem is still an issue.
Best,
Hagen

#3 - February 15, 2017, 05:20
HagenC (New Member)
Solved runParallel

To solve the second problem, just omit the number:
Code:
runParallel pisoFoam -parallel
The number of processors can be specified with
Code:
runParallel -np 4 pisoFoam -parallel
but if no number is given, the number of processors is taken automatically from decomposeParDict.
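For illustration, a quick way to check which processor count runParallel will pick up (a sketch, assuming the standard RunFunctions, which read the numberOfSubdomains entry from system/decomposeParDict):
Code:
# numberOfSubdomains is the value runParallel falls back to when -np is not given
grep numberOfSubdomains system/decomposeParDict
# expected output, e.g.:  numberOfSubdomains 4;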

#4 - February 16, 2017, 05:56
olesen (Senior Member) - Mark Olesen, https://olesenm.github.io/

Quote (originally posted by HagenC):
To solve the second problem, just omit the number:
Code:
runParallel pisoFoam -parallel
The number of processors can be specified with
Code:
runParallel -np 4 pisoFoam -parallel
but if no number is given, the number of processors is taken automatically from decomposeParDict.
You could try these instead (for 1612); note that the -parallel option shouldn't be there.
Code:
runParallel pisoFoam
runParallel -np 4 pisoFoam
runParallel -decomposeParDict system/someDecompDict pisoFoam
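For context, a minimal Allrun sketch using these calls (assuming the standard tutorial RunFunctions are available; the mesh and solver steps are placeholders for your own case):
Code:
#!/bin/sh
cd "${0%/*}" || exit 1                      # run from this directory
. $WM_PROJECT_DIR/bin/tools/RunFunctions    # provides runApplication / runParallel

runApplication blockMesh
runApplication snappyHexMesh -overwrite
runApplication decomposePar
runParallel pisoFoam    # processor count taken from system/decomposeParDict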

#5 - February 17, 2017, 02:43
HagenC (New Member)

Thank you, Olesen,
you are right, the -parallel option is not needed there.
Best,
Hagen
