CFD Online > Forums > OpenFOAM

Sgimpi
#1 - Sgimpi - June 23, 2011, 07:11 - pere (Member, Barcelona)

I have compiled OpenFOAM 1.7.1 against OpenMPI and now I need to use it with SGI MPI. How can I do that? I've set $WM_MPLIB and edited $WM_PROJECT_DIR/wmake/rules/linuxIA64Icc/mpilibSGIMPI so that it contains:
PFLAGS = -DOMPI_SKIP_MPICXX
PINC = -I/opt/sgi/mpt/mpt-2.03/include
PLIBS = -L/opt/sgi/mpt/mpt-2.03/lib -lmpi

and then ran ./Allwmake in $WM_PROJECT_DIR.

Is this right?
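As a cross-check on the rules-file naming, here is a sketch under the assumption that wmake derives the file name from $WM_MPLIB (which matches the mpilibSGIMPI path above); the install path used below is hypothetical:

```shell
# wmake reads the MPI rules file "mpilib$WM_MPLIB" under the compiler rules
# directory; this just reconstructs that path to confirm the naming.
# WM_PROJECT_DIR below is a hypothetical install location.
WM_MPLIB=SGIMPI
WM_PROJECT_DIR=/opt/OpenFOAM-1.7.1
rules="$WM_PROJECT_DIR/wmake/rules/linuxIA64Icc/mpilib${WM_MPLIB}"
echo "$rules"
```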

#2 - June 25, 2011, 08:47 - Bruno Santos (wyldckat, Super Moderator, Lisbon, Portugal)

Hi Pere,

Yeah, that should be pretty much it. I'm not sure if there is anything else missing for building with SGI-MPI.
No, wait, have you checked "OpenFOAM-1.7.1/etc/settings.sh"?

There might be some issues with Metis and Scotch if you have previously built on the same machine with OpenMPI. So you might need to either run Allwmake twice or play it safe and rebuild the whole thing again with the new options.

Best regards,
Bruno

#3 - June 30, 2011, 07:32 - pere (Member, Barcelona)

Hi Bruno,

Yes, I've checked settings.sh and I added:

SGIMPI)

export MPI_ARCH_PATH=/prod/OPENFOAM1.7.1/OPENFOAM-1.7.1-SGI/OPENFOAM-1.7.1/mpt-2.03

# Tell OpenMPI where to find its install directory
export OPAL_PREFIX=$MPI_ARCH_PATH

_foamAddPath $MPI_ARCH_PATH/bin
_foamAddLib $MPI_ARCH_PATH/lib
_foamAddMan $MPI_ARCH_PATH/man

export FOAM_MPI_LIBBIN=/prod/OPENFOAM1.7.1/OPENFOAM-1.7.1-SGI/OPENFOAM-1.7.1/mpt-2.03/lib
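As an aside, the _foamAddPath/_foamAddLib helpers used in that block boil down to prepending directories to the search variables. Minimal sketches (the real versions live in OpenFOAM's etc/settings.sh; the MPT root below is a hypothetical path):

```shell
# Simplified stand-ins for OpenFOAM's path helpers: each prepends a
# directory to the corresponding search variable.
_foamAddPath() { export PATH="$1:$PATH"; }
_foamAddLib()  { export LD_LIBRARY_PATH="$1:$LD_LIBRARY_PATH"; }

MPI_ARCH_PATH=/opt/sgi/mpt/mpt-2.03          # hypothetical MPT root
_foamAddPath "$MPI_ARCH_PATH/bin"
_foamAddLib  "$MPI_ARCH_PATH/lib"
echo "${PATH%%:*}"                           # first PATH entry is now MPT's bin
```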

but when I try to execute OpenFOAM, this message appears:

Warning: Command line arguments for program should be given
after the program name. Assuming that -parallel is a
command line argument for the program.
Missing: program name
Program buoyantBoussinesqPimpleFoam either does not exist, is not
executable, or is an erroneous argument to mpirun.

It does not detect mpirun correctly...

Does anyone know where the problem is?

#4 - June 30, 2011, 07:59 - Anton Kidess (akidess, Senior Member, Delft, Netherlands)

Does buoyantBoussinesqPimpleFoam run in serial? Did you try giving mpirun the complete path (i.e. ~/OpenFOAM/USER-V.x/applications/bin/linux64GccDPOpt/buoyantBoussinesqPimpleFoam)?
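One way to check which full path mpirun would receive is to resolve the command name first. A minimal sketch (`ls` stands in for the solver binary here, since the actual solver path depends on your install):

```shell
# Resolve a command name to the absolute path the shell would execute,
# which is what you then pass verbatim to mpirun. 'ls' is a stand-in for
# buoyantBoussinesqPimpleFoam in this demo.
full=$(command -v ls)
echo "$full"
```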

#5 - June 30, 2011, 18:52 - Bruno Santos (wyldckat, Super Moderator, Lisbon, Portugal)

Hi Pere and Anton,

@Pere:
Quote:
Originally Posted by pere View Post
Does anyone know where the problem is?
Right now I don't have time to go into details, so I'll simply copy-paste the links I've got on one of my blog posts:
Quote:
Originally Posted by http://www.cfd-online.com/Forums/blogs/wyldckat/232-list-threads-useful-building-openfoam-other-third-party-tools.html
Basically, there are 2 details to keep in mind:
  • Whenever possible, use full paths to mpirun and foamExec. foamExec will activate OpenFOAM's environment on the remote shells launched by mpirun.
  • Or, if you want to use mpirun straight up, you must use the normal method for activating OpenFOAM's environment, namely:
    Quote:
    Originally Posted by http://www.openfoam.com/download/source.php#x5-22000
    if running bash or ksh (if in doubt type echo $SHELL), source the etc/bashrc file by adding the following line to the end of your $HOME/.bashrc file:
    Code:
    source $HOME/OpenFOAM/OpenFOAM-2.0.0/etc/bashrc
If you still can't get it to work after checking the links I posted above, I can put together a more detailed guideline this weekend.
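Appending that source line can be made idempotent, so repeated logins don't duplicate it. A minimal sketch (a temporary file stands in for ~/.bashrc, and the 1.7.1 path is assumed since that's the version used in this thread):

```shell
# Add the OpenFOAM environment line to an rc file only if it's not already
# present; running the guard again is a no-op. $rc is a temp file standing
# in for ~/.bashrc in this demo.
line='source $HOME/OpenFOAM/OpenFOAM-1.7.1/etc/bashrc'
rc=$(mktemp)
grep -qxF "$line" "$rc" || printf '%s\n' "$line" >> "$rc"
grep -qxF "$line" "$rc" || printf '%s\n' "$line" >> "$rc"   # no-op this time
cat "$rc"
```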

Best regards and good luck!
Bruno

#6 - July 1, 2011, 04:32 - pere (Member, Barcelona)

Yes, I tried the full path and the same error appears...

#7 - July 2, 2011, 05:49 - Bruno Santos (wyldckat, Super Moderator, Lisbon, Portugal)

Hi Pere,

I'm sorry, but this is too little information: I can't deduce whether it's an environment problem or whether something is missing from your mpirun command.

So, here's a list of things I'll need to know:
  1. What is the complete command line you are using to launch the solver buoyantBoussinesqPimpleFoam?
  2. Are you trying to use just the local machine or multiple machines?
  3. If you are using multiple machines:
    1. Is OpenFOAM installed in all of them in the exact same place?
    2. Is it shared via NFS or some other network file sharing mechanism?
As an example, I'll use the tutorial "heatTransfer/buoyantBoussinesqSimpleFoam/iglooWithFridges" with the full command lines for running in parallel in the local host:
Code:
mkdir -p $FOAM_RUN
run
cp -r $FOAM_TUTORIALS .
cd tutorials/heatTransfer/buoyantBoussinesqSimpleFoam/iglooWithFridges
blockMesh
snappyHexMesh -overwrite
decomposePar
`which mpirun` -np 6 /home/myusername/OpenFOAM/OpenFOAM-1.7.1/bin/foamExec buoyantBoussinesqSimpleFoam -parallel
Also, run this command to confirm it's picking up on the correct mpirun binary:
Code:
echo `which mpirun`
Best regards,
Bruno

#8 - July 4, 2011, 05:40 - pere (Member, Barcelona)

I've solved the first error, and now I have this problem:

MPI: pirineus: 0x9d400004db86f34:
MPI: pirineus: 0x9d400004db86f34:
MPI: pirineus: 0x9d400004db86f34: --> FOAM FATAL ERROR:
MPI: pirineus: 0x9d400004db86f34: Trying to use the dummy Pstream library.
MPI: pirineus: 0x9d400004db86f34: This dummy library cannot be used in parallel mode
MPI: pirineus: 0x9d400004db86f34:
MPI: pirineus: 0x9d400004db86f34: From function Pstream::init(int& argc, char**& argv)
MPI: pirineus: 0x9d400004db86f34: in file Pstream.C at line 39.
MPI: pirineus: 0x9d400004db86f34:
MPI: pirineus: 0x9d400004db86f34: FOAM exiting
MPI: pirineus: 0x9d400004db86f34:
MPI: could not run executable (case #4)

Do I have to modify Pstream.C?

#9 - July 4, 2011, 20:40 - Bruno Santos (wyldckat, Super Moderator, Lisbon, Portugal)

Quote:
Originally Posted by pere View Post
--> FOAM FATAL ERROR:
MPI: pirineus: 0x9d400004db86f34: Trying to use the dummy Pstream library.
MPI: pirineus: 0x9d400004db86f34: This dummy library cannot be used in parallel mode

(...)

Did I have to modify Pstream.C?
Noooo, go back!! The dummy library only exists for OpenFOAM builds that run only in single-core-single-machine set-ups!

Changing the Pstream code is only necessary in very specific situations, namely for MPI libraries that use non-conventional ways of doing MPI business. That would usually only be the case for older MPI libraries and some closed-source ones, such as for Cray super-computers (I'm just guessing on this last one).

#10 - July 5, 2011, 04:11 - pere (Member, Barcelona)

Thanks wyldckat. So what do you think I must do now?

#11 - July 5, 2011, 07:40 - pere (Member, Barcelona)

I've changed the Allwmake script in Pstream, putting wmake libso mpi instead of wmake libso dummy, and then ran ./Allwmake from .../Pstream... but the error is the same again...

#12 - July 6, 2011, 20:36 - Bruno Santos (wyldckat, Super Moderator, Lisbon, Portugal)

Hi Pere,

I didn't have time to respond sooner... anyway, it's time for some divide and conquer:
  1. Run mpirun with a test for launching mpi'less applications. For example, run each one of these at a time:
    Code:
    mpirun -np 2 bash -c "ls -l"
    mpirun -np 2 bash -c "export"
    The first one will show you the contents of the working folder for each remotely launched bash shell. The second one will show you the environment variables for each remote shell.
    If neither of these works, then your MPI installation isn't working.
  2. Build the test application designed for these kinds of tests:
    Code:
    cd $WM_PROJECT_DIR
    wmake applications/test/parallel
    Now go to your case where decomposePar has already been run.
    Then run the following scenarios:
    1. Test to see what the new application shows:
      Code:
      parallelTest
    2. Testing if the OpenFOAM environment is available in the remote shells:
      Code:
      mpirun -np 2 parallelTest
      It should show you the result of the 1st test of this sub-list, but twice.
    3. Testing if foamExec does its job:
      Code:
      mpirun -np 2 `which foamExec` parallelTest
    4. Testing if you can run things in parallel:
      Code:
      mpirun -np 2 parallelTest -parallel
      In this case, it should show you a similar output to what's shown here: post #19 of "OpenFOAM updates"
    5. Finally, test if foamJob works:
      Code:
      foamJob -s -p parallelTest
These tests should help you isolate the problem.

Best regards and good luck!
Bruno

#13 - July 8, 2011, 05:08 - pere (Member, Barcelona)

I cannot build parallelTest; this error appears:

SOURCE=parallelTest.C ; g++ -m64 -Dlinux64 -DWM_DP -Wall -Wextra -Wno-unused-parameter -Wold-style-cast -Wnon-virtual-dtor -O3 -DNoRepository -ftemplate-depth-40 -IlnInclude -I. -I/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/src/OpenFOAM/lnInclude -I/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/src/OSspecific/POSIX/lnInclude -fPIC -c $SOURCE -o Make/linux64GccDPOpt/parallelTest.o
In file included from /prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/bits/localefwd.h:42:0,
from /prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/string:45,
from /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/src/OpenFOAM/lnInclude/string.H:51,
from /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/src/OpenFOAM/lnInclude/stringList.H:35,
from /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/src/OpenFOAM/lnInclude/argList.H:73,
from parallelTest.C:32:
/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/x86_64-unknown-linux-gnu/bits/c++locale.h:52:23: error: ‘uselocale’ was not declared in this scope
/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/x86_64-unknown-linux-gnu/bits/c++locale.h:52:45: error: invalid type in declaration before ‘;’ token
/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/x86_64-unknown-linux-gnu/bits/c++locale.h: In function ‘int std::__convert_from_v(__locale_struct* const&, char*, int, const char*, ...)’:
/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/x86_64-unknown-linux-gnu/bits/c++locale.h:72:53: error: ‘__gnu_cxx::__uselocale’ cannot be used as a function
/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/gcc-4.5.2/bin/../lib/gcc/x86_64-unknown-linux-gnu/4.5.2/../../../../include/c++/4.5.2/x86_64-unknown-linux-gnu/bits/c++locale.h:97:33: error: ‘__gnu_cxx::__uselocale’ cannot be used as a function
make: *** [Make/linux64GccDPOpt/parallelTest.o] Error 1

Is it possible that I need new headers or libraries?

Is the real problem that OpenFOAM needs Pstream/mpi instead of Pstream/dummy for running in parallel?

Thanks in advance, wyldckat

#14 - July 8, 2011, 07:49 - pere (Member, Barcelona)

OK, I finally got parallelTest to build and start...

#15 - July 8, 2011, 08:01 - pere (Member, Barcelona)

When I ran parallelTest:

/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 1.7.1 |
| \\ / A nd | Web: www.OpenFOAM.com |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 1.7.1-03e7e056c215
Exec : parallelTest
Date : Jul 08 2011
Time : 12:47:19
Host : pirineus
PID : 340804
Case : /tmp/ppuigdom/snappy24_2/snappy24
nProcs : 1
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time


Starting transfers

End

mpirun -np 2 parallelTest
/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 1.7.1 |
| \\ / A nd | Web: www.OpenFOAM.com |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 1.7.1-03e7e056c215
Exec : parallelTest
Date : Jul 08 2011
Time : 12:50:25
Host : pirineus
PID : 341946
Case : /tmp/ppuigdom/snappy24_2/snappy24
nProcs : 1
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

End

MPI: pirineus: 0x9d400004db8713b:
MPI: pirineus: 0x9d400004db8713b: Starting transfers
MPI: pirineus: 0x9d400004db8713b:
MPI: could not run executable (case #4)

mpirun -np 2 `which foamExec` parallelTest
/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin/foamExec
/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1/applications/bin/linux64GccDPOpt/parallelTest
MPI: could not run executable (case #3)
MPI: No details available, no log files found

mpirun -np 2 parallelTest -parallel

MPI: pirineus: 0x9d400004db87141:
MPI: pirineus: 0x9d400004db87141:
MPI: pirineus: 0x9d400004db87141: --> FOAM FATAL ERROR:
MPI: pirineus: 0x9d400004db87141: Trying to use the dummy Pstream library.
MPI: pirineus: 0x9d400004db87141: This dummy library cannot be used in parallel mode
MPI: pirineus: 0x9d400004db87141:
MPI: pirineus: 0x9d400004db87141: From function Pstream::init(int& argc, char**& argv)
MPI: pirineus: 0x9d400004db87141: in file Pstream.C at line 39.
MPI: pirineus: 0x9d400004db87141:
MPI: pirineus: 0x9d400004db87141: FOAM exiting
MPI: pirineus: 0x9d400004db87141:
MPI: could not run executable (case #4)

foamJob -p -s parallelTest

Parallel processing using SGIMPI with 10 processors
Executing: mpirun -np 10 /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin/foamExec parallelTest -parallel | tee log
MPI: pirineus: 0x9d400004db87144:
MPI: pirineus: 0x9d400004db87144:
MPI: pirineus: 0x9d400004db87144: --> FOAM FATAL ERROR:
MPI: pirineus: 0x9d400004db87144: Trying to use the dummy Pstream library.
MPI: pirineus: 0x9d400004db87144: This dummy library cannot be used in parallel mode
MPI: pirineus: 0x9d400004db87144:
MPI: pirineus: 0x9d400004db87144: From function Pstream::init(int& argc, char**& argv)
MPI: pirineus: 0x9d400004db87144: in file Pstream.C at line 39.
MPI: pirineus: 0x9d400004db87144:
MPI: pirineus: 0x9d400004db87144: FOAM exiting
MPI: pirineus: 0x9d400004db87144:
MPI: could not run executable (case #4)


Is the problem the dummy library? How can I replace it with the MPI library?

Thanks

#16 - July 9, 2011, 05:57 - Bruno Santos (wyldckat, Super Moderator, Lisbon, Portugal)

Hi Pere,

Just the other day I had to suggest that another person run the following instruction:
Code:
mv $FOAM_LIBBIN/dummy $FOAM_LIBBIN/dummy__
Basically, rename/hide the dummy folder, which is where the libraries for MPI'less operation are.
Then try again.
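What the rename works around can be seen with a small stand-alone demo of search-order shadowing, no OpenFOAM needed (directory names are illustrative):

```shell
# With two directories each holding a file of the same name, the first
# match on the search list wins, just as libPstream.so in dummy/ shadows
# the MPI build when its directory comes first on LD_LIBRARY_PATH.
tmp=$(mktemp -d)
mkdir -p "$tmp/dummy" "$tmp/mpi"
echo stub > "$tmp/dummy/libPstream.so"
echo real > "$tmp/mpi/libPstream.so"
found=""
for d in "$tmp/dummy" "$tmp/mpi"; do            # dummy dir listed first
    [ -f "$d/libPstream.so" ] && { found=$(cat "$d/libPstream.so"); break; }
done
echo "$found"                                   # the dummy "stub" wins
mv "$tmp/dummy" "$tmp/dummy__"                  # the rename suggested above
for d in "$tmp/dummy" "$tmp/mpi"; do
    [ -f "$d/libPstream.so" ] && { found=$(cat "$d/libPstream.so"); break; }
done
echo "$found"                                   # now the "real" one is found
```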

By the way, you didn't mention if the first two commands had worked or not. I suppose they did and that's why you went ahead to test parallelTest.

Best regards,
Bruno

#17 - July 11, 2011, 05:19 - pere (Member, Barcelona)

mpirun -np 2 bash -c "ls -l"




total 12
drwxr-x--- 2 ppuigdom cesca 4096 16 jun 11:17 0
drwxr-x--- 4 ppuigdom cesca 4096 16 jun 11:17 constant
-rw-r----- 1 ppuigdom cesca 0 11 jul 09:49 log
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:21 processor0
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor1
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor2
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor3
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor4
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor5
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor6
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor7
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor8
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor9
drwxr-x--- 2 ppuigdom cesca 4096 16 jun 11:17 system
MPI: could not run executable (case #3)
MPI: No details available, no log files found
Matat




mpirun -np 2 bash -c "export"

declare -x AWK="gawk"
declare -x BASH_ENV="/home/ppuigdom/.bashrc"
declare -x BINARY_TYPE="linux2.6-glibc2.3-x86_64"
declare -x BINARY_TYPE_HPC=""
declare -x BSUB_BLOCK_EXEC_HOST=""
declare -x COLORTERM="1"
declare -x CPATH="/prod/intel/Compiler/11.1/073/ipp/em64t/include:/prod/intel/Compiler/11.1/073/mkl/include:/prod/intel/Compiler/11.1/073/tbb/include:/prod/intel/Compiler/11.1/073/ipp/em64t/include:/prod/intel/Compiler/11.1/073/mkl/include:/prod/intel/Compiler/11.1/073/tbb/include"
declare -x CPU="x86_64"
declare -x CSHEDIT="emacs"
declare -x CVS_RSH="ssh"
declare -x DADES="/dades/ppuigdom"
declare -x DISPLAY="localhost:18.0"
declare -x DYLD_LIBRARY_PATH="/prod/intel/Compiler/11.1/073/tbb/intel64/cc4.1.0_libc2.4_kernel2.6.16.21/lib:/prod/intel/Compiler/11.1/073/tbb/intel64/cc4.1.0_libc2.4_kernel2.6.16.21/lib"
declare -x EGO_BINDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/bin"
declare -x EGO_CONFDIR="/usr/share/lsf/conf/ego/CESCA/kernel"
declare -x EGO_ESRVDIR="/usr/share/lsf/conf/ego/CESCA/eservice"
declare -x EGO_LIBDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/lib"
declare -x EGO_LOCAL_CONFDIR="/usr/share/lsf/conf/ego/CESCA/kernel"
declare -x EGO_SERVERDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/etc"
declare -x EGO_TOP="/usr/share/lsf"
declare -x EGO_VERSION="1.2"
declare -x ENV="/etc/bash.bashrc"
declare -x FOAM_APP="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/applications"
declare -x FOAM_APPBIN="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/applications/bin/linux64GccDPOpt"
declare -x FOAM_INST_DIR="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI"
declare -x FOAM_JOB_DIR="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/jobControl"
declare -x FOAM_LIB="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/lib"
declare -x FOAM_LIBBIN="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/lib/linux64GccDPOpt"
declare -x FOAM_MPI_LIBBIN="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/mpt-2.03/lib"
declare -x FOAM_RUN="/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1/run"
declare -x FOAM_SIGFPE=""
declare -x FOAM_SITE_APPBIN="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/site/1.7.1/bin/linux64GccDPOpt"
declare -x FOAM_SITE_LIBBIN="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/site/1.7.1/lib/linux64GccDPOpt"
declare -x FOAM_SOLVERS="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/applications/solvers"
declare -x FOAM_SRC="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/src"
declare -x FOAM_TUTORIALS="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/tutorials"
declare -x FOAM_USER_APPBIN="/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1/applications/bin/linux64GccDPOpt"
declare -x FOAM_USER_LIBBIN="/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1/lib/linux64GccDPOpt"
declare -x FOAM_UTILITIES="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/applications/utilities"
declare -x FPATH="/prod/intel/Compiler/11.1/073/mkl/include:/prod/intel/Compiler/11.1/073/mkl/include"
declare -x FROM_HEADER=""
declare -x G_BROKEN_FILENAMES="1"
declare -x G_FILENAME_ENCODING="@locale,UTF-8,ISO-8859-15,CP1252"
declare -x HISTSIZE="1000"
declare -x HOME="/home/ppuigdom"
declare -x HOST="pirineus"
declare -x HOSTNAME="pirineus"
declare -x HOSTTYPE="X86_64"
declare -x INCLUDE="/prod/intel/Compiler/11.1/073/ipp/em64t/include:/prod/intel/Compiler/11.1/073/mkl/include:/prod/intel/Compiler/11.1/073/ipp/em64t/include:/prod/intel/Compiler/11.1/073/mkl/include"
declare -x INFODIR="/usr/local/info:/usr/share/info:/usr/info"
declare -x INFOPATH="/usr/local/info:/usr/share/info:/usr/info"
declare -x INPUTRC="/etc/inputrc"
declare -x INTEL_LICENSE_FILE="/prod/intel/Compiler/11.1/073/licenses:/opt/intel/licenses:/home/ppuigdom/intel/licenses:/prod/intel/Compiler/11.1/073/licenses:/opt/intel/licenses:/home/ppuigdom/intel/licenses"
declare -x IPPROOT="/prod/intel/Compiler/11.1/073/ipp/em64t"
declare -x JAVA_BINDIR="/usr/lib64/jvm/jre/bin"
declare -x JAVA_HOME="/usr/share/lsf/perf/../jre/linux-x86_64"
declare -x JAVA_ROOT="/usr/lib64/jvm/jre"
declare -x JRE_HOME="/usr/lib64/jvm/jre"
declare -x KMP_AFFINITY="disable"
declare -x LANG="ca_ES.UTF-8"
declare -x LD_LIBRARY_PATH="/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/platforms/linux64Gcc/paraview-3.8.0/lib/paraview-3.8:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/mpt-2.03/lib:/usr/lib64/gcc/x86_64-suse-linux/4.3:/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1/lib/linux64GccDPOpt:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/site/1.7.1/lib/linux64GccDPOpt:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/lib/linux64GccDPOpt:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/lib/linux64GccDPOpt/dummy:/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/lib:/usr/lib64/jvm/jre-1.6.0-sun/lib/amd64/server:/usr/lib64/jvm/jre-1.6.0-sun/lib/amd64:/usr/share/lsf/perf/1.2/linux-x86_64/lib:/usr/share/lsf/perf/ego/1.2/linux-x86_64/lib:/usr/share/lsf/perf/lsf/7.0/linux-x86_64/lib:/usr/share/lsf/1.2/linux2.6-glibc2.3-x86_64/lib:/prod/intel/Compiler/11.1/073/lib/intel64:/prod/intel/Compiler/11.1/073/ipp/em64t/sharedlib:/prod/intel/Compiler/11.1/073/mkl/lib/em64t:/prod/intel/Compiler/11.1/073/tbb/intel64/cc4.1.0_libc2.4_kernel2.6.16.21/lib:/opt/sgi/sgimc/lib"
declare -x LD_PRELOAD="libxpmem.so"
declare -x LESS="-M -I"
declare -x LESSCLOSE="lessclose.sh %s %s"
declare -x LESSKEY="/etc/lesskey.bin"
declare -x LESSOPEN="lessopen.sh %s"
declare -x LESS_ADVANCED_PREPROCESSOR="no"
declare -x LIB="/prod/intel/Compiler/11.1/073/ipp/em64t/lib:/prod/intel/Compiler/11.1/073/ipp/em64t/lib:"
declare -x LIBRARY_PATH="/prod/intel/Compiler/11.1/073/lib/intel64:/prod/intel/Compiler/11.1/073/ipp/em64t/lib:/prod/intel/Compiler/11.1/073/mkl/lib/em64t:/prod/intel/Compiler/11.1/073/tbb/intel64/cc4.1.0_libc2.4_kernel2.6.16.21/lib:/prod/intel/Compiler/11.1/073/lib/intel64:/prod/intel/Compiler/11.1/073/ipp/em64t/lib:/prod/intel/Compiler/11.1/073/mkl/lib/em64t:/prod/intel/Compiler/11.1/073/tbb/intel64/cc4.1.0_libc2.4_kernel2.6.16.21/lib"
declare -x LICENSE_FILE="/usr/share/lsf/conf/perf/CESCA/conf/license.dat"
declare -x LM_LICENSE_FILE="27000@192.168.17.1"
declare -x LOADEDMODULES=""
declare -x LOGNAME="ppuigdom"
declare -x LSB_ACCT_FILE="/tmp/ppuigdom/.1310369970.702642.acct"
declare -x LSB_BATCH_JID="702642"
declare -x LSB_CHKFILENAME="/home/ppuigdom/.lsbatch/1310369970.702642"
declare -x LSB_CPUSET_DEDICATED="YES"
declare -x LSB_DJOB_HB_INTERVAL="15"
declare -x LSB_DJOB_HOSTFILE="/home/ppuigdom/.lsbatch/1310369970.702642.hostfile"
declare -x LSB_DJOB_NUMPROC="1"
declare -x LSB_DJOB_RU_INTERVAL="15"
declare -x LSB_ECHKPNT_RSH_CMD="ssh -p2122"
declare -x LSB_EEXEC_REAL_GID=""
declare -x LSB_EEXEC_REAL_UID=""
declare -x LSB_EXEC_CLUSTER="CESCA"
declare -x LSB_EXIT_PRE_ABORT="99"
declare -x LSB_HOSTS="pirineus"
declare -x LSB_HOST_CPUSETS="1 pirineus /CESCA@702642 "
declare -x LSB_INTERACTIVE="Y"
declare -x LSB_JOBEXIT_STAT="0"
declare -x LSB_JOBFILENAME="/home/ppuigdom/.lsbatch/1310369970.702642"
declare -x LSB_JOBID="702642"
declare -x LSB_JOBINDEX="0"
declare -x LSB_JOBNAME="/bin/bash"
declare -x LSB_JOBRES_CALLBACK="52029@pirineus"
declare -x LSB_JOBRES_PID="756365"
declare -x LSB_JOB_EXECUSER="ppuigdom"
declare -x LSB_JOB_STARTER="/usr/local/bin/lsf_job_starter"
declare -x LSB_MCPU_HOSTS="pirineus 1 "
declare -x LSB_QUEUE="short"
declare -x LSB_SHMODE="y"
declare -x LSB_SUB_HOST="pirineus"
declare -x LSB_TRAPSIGS="trap # 15 10 12 2 1"
declare -x LSB_UNIXGROUP_INT="cesca"
declare -x LSB_XFJOB="Y"
declare -x LSFUSER="ppuigdom"
declare -x LSF_BINDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/bin"
declare -x LSF_EAUTH_AUX_PASS="yes"
declare -x LSF_EAUTH_CLIENT="user"
declare -x LSF_EAUTH_SERVER="mbatchd@CESCA"
declare -x LSF_EGO_ENVDIR="/usr/share/lsf/conf/ego/CESCA/kernel"
declare -x LSF_ENVDIR="/usr/share/lsf/conf"
declare -x LSF_INVOKE_CMD="bsub.lsf"
declare -x LSF_LIBDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/lib"
declare -x LSF_LIM_API_NTRIES="1"
declare -x LSF_LOGDIR="/usr/share/lsf/log"
declare -x LSF_SERVERDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/etc"
declare -x LSF_VERSION="23"
declare -x LS_COLORS="no=00:fi=00:di=01;34:ln=00;36i=40;33: so=01;35:do=01;35:bd=40;33;01:cd=40;33;01r=41;33 ;01:ex=00;32:*.cmd=00;32:*.exe=01;32:*.com=01;32:* .bat=01;32:*.btm=01;32:*.dll=01;32:*.tar=00;31:*.t bz=00;31:*.tgz=00;31:*.rpm=00;31:*.deb=00;31:*.arj =00;31:*.taz=00;31:*.lzh=00;31:*.lzma=00;31:*.zip= 00;31:*.zoo=00;31:*.z=00;31:*.Z=00;31:*.gz=00;31:* .bz2=00;31:*.tb2=00;31:*.tz2=00;31:*.tbz2=00;31:*. avi=01;35:*.bmp=01;35:*.fli=01;35:*.gif=01;35:*.jp g=01;35:*.jpeg=01;35:*.mng=01;35:*.mov=01;35:*.mpg =01;35:*.pcx=01;35:*.pbm=01;35:*.pgm=01;35:*.png=0 1;35:*.ppm=01;35:*.tga=01;35:*.tif=01;35:*.xbm=01; 35:*.xpm=01;35:*.dl=01;35:*.gl=01;35:*.wmv=01;35:* .aiff=00;32:*.au=00;32:*.mid=00;32:*.mp3=00;32:*.o gg=00;32:*.voc=00;32:*.wav=00;32:"
declare -x LS_EXEC_T="START"
declare -x LS_JOBPID="756365"
declare -x LS_OPTIONS="-N --color=tty -T 0"
declare -x LS_SUBCWD="/tmp/ppuigdom/snappy24_2/snappy24"
declare -x MACHTYPE="x86_64-suse-linux"
declare -x MAIL="/var/mail/ppuigdom"
declare -x MANPATH="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/mpt-2.03/man:/usr/share/lsf/7.0/man:/prod/intel/Compiler/11.1/073/man/en_US:/prod/intel/Compiler/11.1/073/mkl/man/en_US:/prod/intel/Compiler/11.1/073/mkl/../man/en_US:/prod/intel/Compiler/11.1/073/mkl/man/en_US:/usr/local/man:/usr/share/man:/prod/intel/vtune/man:"
declare -x MAQUINA="pirineus"
declare -x MGR_HOME="/opt/sgi/sgimc"
declare -x MINICOM="-c on"
declare -x MKLROOT="/prod/intel/Compiler/11.1/073/mkl"
declare -x MODULEPATH="/opt/modules/tools:/opt/modules/modulefiles"
declare -x MODULESHOME="/usr/share/modules"
declare -x MODULE_VERSION="3.1.6"
declare -x MODULE_VERSION_STACK="3.1.6"
declare -x MORE="-sl"
declare -x MPI_ARCH_PATH="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/mpt-2.03"
declare -x MPI_BUFFER_SIZE="20000000"
declare -x MPI_DRANK="0"
declare -x MPI_ENVIRONMENT="6a085854 38938 0 e007c336 9d400004db871d1"
declare -x MPI_IDB_PATH="/prod/intel/Compiler/11.1/073/bin/intel64/idb"
declare -x MPI_SHARED_NEIGHBORHOOD="host"
declare -x NLSPATH="/prod/intel/Compiler/11.1/073/lib/intel64/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/ipp/em64t/lib/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/mkl/lib/em64t/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/idb/intel64/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/lib/intel64/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/ipp/em64t/lib/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/mkl/lib/em64t/locale/%l_%t/%N:/prod/intel/Compiler/11.1/073/idb/intel64/locale/%l_%t/%N"
declare -x NNTPSERVER="news"
declare -x OLDPWD
declare -x OPAL_PREFIX="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/mpt-2.03"
declare -x OSTYPE="linux"
declare -x OS_TYPE="linux-x86_64"
declare -x PAGER="less"
declare -x PAMENV="/prod/ESI-SW/visualenv66/env-Linux"
declare -x PAMHOME="/prod/ESI-SW/visualenv66"
declare -x PAMLANG="FR"
declare -x PATH="/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/platforms/linux64Gcc/paraview-3.8.0/bin:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/mpt-2.03/bin:/usr/bin/gcc:/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1/applications/bin/linux64GccDPOpt:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/site/1.7.1/bin/linux64GccDPOpt:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/applications/bin/linux64GccDPOpt:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/wmake:/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin:/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/bin:/prod/pgi/linux86-64/10.5/bin:/usr/share/lsf/gui/2.0/bin:/usr/share/lsf/perf/1.2/bin:/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/etc:/prod/intel/Compiler/11.1/073/bin/intel64:/usr/local/bin:/usr/bin:/bin:/usr/bin/X11:/usr/X11R6/bin:/usr/games:/usr/lib64/jvm/jre/bin:/usr/lib/mit/bin:/usr/lib/mit/sbin:/opt/sgi/sgimc/bin:/opt/sgi/sbin"
declare -x PERF_CONFDIR="/usr/share/lsf/conf/perf/CESCA/conf"
declare -x PERF_DATADIR="/usr/share/lsf/work/CESCA/perf/data"
declare -x PERF_ENV="-DPERF_TOP=/usr/share/lsf/perf -DPERF_CONFDIR=/usr/share/lsf/conf/perf/CESCA/conf -DPERF_WORKDIR=/usr/share/lsf/work/CESCA/perf -DPERF_LOGDIR=/usr/share/lsf/log/perf -DPERF_DATADIR=/usr/share/lsf/work/CESCA/perf/data"
declare -x PERF_LIB="/usr/share/lsf/perf/1.2/linux-x86_64/lib:/usr/share/lsf/perf/ego/1.2/linux-x86_64/lib:/usr/share/lsf/perf/lsf/7.0/linux-x86_64/lib"
declare -x PERF_LIB_TYPE="linux-x86_64"
declare -x PERF_LOGDIR="/usr/share/lsf/log/perf"
declare -x PERF_TOP="/usr/share/lsf/perf"
declare -x PERF_VERSION="1.2"
declare -x PERF_WORKDIR="/usr/share/lsf/work/CESCA/perf"
declare -x PGI="/prod/pgi"
declare -x PROFILEREAD="true"
declare -x PV_PLUGIN_PATH="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/lib/linux64GccDPOpt/paraview-3.8"
declare -x PWD="/tmp/ppuigdom/snappy24_2/snappy24"
declare -x PYTHONSTARTUP="/etc/pythonstart"
declare -x ParaView_DIR="/prod/OPENFOAM1.7.1/ThirdParty-1.7.1/platforms/linux64Gcc/paraview-3.8.0"
declare -x ParaView_MAJOR="3.8"
declare -x ParaView_VERSION="3.8.0"
declare -x QT_SYSTEM_DIR="/usr/share/desktop-data"
declare -x SBD_KRB5CCNAME_VAL=""
declare -x SCRATCH="/cescascratch/ppuigdom"
declare -x SHELL="/usr/local/bin/bash"
declare -x SHLVL="7"
declare -x SSH_CLIENT="192.94.163.141 57706 2122"
declare -x SSH_CONNECTION="192.94.163.141 57706 84.88.8.106 2122"
declare -x SSH_SENDS_LOCALE="yes"
declare -x SSH_TTY="/dev/pts/28"
declare -x TERM="xterm"
declare -x TMOUT="10800"
declare -x TMPDIR="/tmp/ppuigdom/702642.201107110949"
declare -x USER="ppuigdom"
declare -x WINDOWMANAGER="/usr/bin/gnome"
declare -x WM_ARCH="linux64"
declare -x WM_ARCH_OPTION="64"
declare -x WM_CC="gcc"
declare -x WM_CFLAGS="-m64 -fPIC"
declare -x WM_COMPILER="Gcc"
declare -x WM_COMPILER_LIB_ARCH="64"
declare -x WM_COMPILE_OPTION="Opt"
declare -x WM_CXX="g++"
declare -x WM_CXXFLAGS="-m64 -fPIC"
declare -x WM_DIR="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/wmake"
declare -x WM_LDFLAGS="-m64"
declare -x WM_LINK_LANGUAGE="c++"
declare -x WM_MPLIB="SGIMPI"
declare -x WM_OPTIONS="linux64GccDPOpt"
declare -x WM_OSTYPE="POSIX"
declare -x WM_PRECISION_OPTION="DP"
declare -x WM_PROJECT="OpenFOAM"
declare -x WM_PROJECT_DIR="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1"
declare -x WM_PROJECT_INST_DIR="/prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI"
declare -x WM_PROJECT_USER_DIR="/home/ppuigdom/OpenFOAM/ppuigdom-1.7.1"
declare -x WM_PROJECT_VERSION="1.7.1"
declare -x WM_THIRD_PARTY_DIR="/prod/OPENFOAM1.7.1/ThirdParty-1.7.1"
declare -x XAUTHLOCALHOSTNAME="pirineus"
declare -x XCURSOR_THEME="DMZ"
declare -x XDG_CONFIG_DIRS="/etc/xdg"
declare -x XDG_DATA_DIRS="/usr/share:/etc/opt/kde3/share:/opt/kde3/share"
declare -x XKEYSYMDB="/usr/share/X11/XKeysymDB"
declare -x XLSF_UIDDIR="/usr/share/lsf/7.0/linux2.6-glibc2.3-x86_64/lib/uid"
declare -x XNLSPATH="/usr/share/X11/nls"
declare -x enf="-n"
declare -x enl=""
declare -x ftp_proxy="http://192.168.255.254:8080"
declare -x http_proxy="http://192.168.255.254:8080"
declare -x https_proxy="http://192.168.255.254:8080"
declare -x no_proxy="localhost, 127.0.0.1"
MPI: could not run executable (case #3)
MPI: No details available, no log files found
Matat [Killed]
ppuigdom@pirineus:/tmp/ppuigdom/snappy24_2/snappy24>
pere is offline   Reply With Quote

Old   July 11, 2011, 06:10
Default
  #18
Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 7,450
Blog Entries: 33
Rep Power: 73
wyldckat is a jewel in the roughwyldckat is a jewel in the roughwyldckat is a jewel in the roughwyldckat is a jewel in the rough
Hi Pere,

Ah HA! Now we are getting somewhere! OK, right on the first command 'mpirun -np 2 bash -c "ls -l"', it doesn't run the second process.
This seems to mean that:
  • either the environment is only set up manually in your current shell, i.e., you activate parts of it by hand, so the processes that mpirun spawns don't inherit it;
  • or there is something else stopping mpirun.
Are you able to use SGIMPI to run anything else on your machine?

There are three more commands that might be useful for testing:
  • Code:
    `which mpirun` -np 2 `which foamExec` bash -c "ls -l"
  • Code:
    `which mpirun` -np 2 `which foamExec` `which bash` -c "ls -l"
  • Code:
    `which mpirun` -np 2 `which bash` -c "ls -l"
These will help isolate where the problem is. They test if it's a matter of needing full paths to run things or missing environment variables.
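A safe way to sanity-check commands like these, for example, is to preview what the backtick substitutions expand to before launching anything under mpirun (a generic sketch; the resolved paths will differ per machine):

```shell
# Preview what the backtick substitutions expand to before launching
# anything under mpirun; if a path comes out empty, or two words run
# together, the final command line will be malformed.
echo "mpirun resolves to: `which mpirun`"
echo "bash resolves to:   `which bash`"
```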

There is another thing that you might want to check. The following is for Open MPI, so you'll have to search for the equivalent option in SGI's MPT:
Quote:
Originally Posted by pkr View Post
"Unless otherwise specified, Open MPI will greedily use all TCP networks that it can find and try to connect to all peers upon demand (i.e., Open MPI does not open sockets to all of its MPI peers during MPI_INIT -- see this FAQ entry for more details). Hence, if you want MPI jobs to not use specific TCP networks -- or not use any TCP networks at all -- then you need to tell Open MPI."

When using MPI_Reduce, Open MPI was trying to establish a TCP connection through a different interface. The problem is solved if the following command is used:
mpirun --mca btl_tcp_if_exclude lo,virbr0 -hostfile machines -np 2 /home/rphull/OpenFOAM/OpenFOAM-1.6/bin/foamExec interFoam -parallel

The above command stops MPI from using the listed interfaces (lo and virbr0 in this case).
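Before excluding interfaces, it helps to know which ones a node actually has; on Linux the kernel exposes them under /sys/class/net (a generic sketch, nothing cluster-specific):

```shell
# List the network interfaces known to the kernel; names such as lo
# (loopback) or virbr0 (a libvirt bridge) are the usual candidates for
# exclusion from the MPI TCP transport.
ls /sys/class/net
```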
Best regards,
Bruno

Last edited by wyldckat; July 11, 2011 at 07:27. Reason: missing spaces on the first two command lines :(
wyldckat is offline   Reply With Quote

Old   July 11, 2011, 07:13
Default
  #19
Member
 
pere
Join Date: Mar 2011
Location: barcelona
Posts: 46
Rep Power: 5
pere is on a distinguished road
ppuigdom@pirineus:/tmp/ppuigdom/snappy24_2/snappy24> `which mpirun` -np 2 `which foamExec`bash -c "ls -l"
MPI: pirineus: 0x9d400004db871fb: /bin/sh: /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin/foamExecbash: El fitxer o directori no existeix [No such file or directory]
MPI: pirineus: 0x9d400004db871fb: /bin/sh: line 0: exec: /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin/foamExecbash: cannot execute: El fitxer o directori no existeix [No such file or directory]
MPI: could not run executable (case #4)
Matat [Killed]
ppuigdom@pirineus:/tmp/ppuigdom/snappy24_2/snappy24> `which mpirun` -np 2 `which foamExec``which bash` -c "ls -l"
MPI: pirineus: 0x9d400004db871fe: /bin/sh: /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin/foamExec/usr/local/bin/bash: No és un directori [Not a directory]
MPI: pirineus: 0x9d400004db871fe: /bin/sh: line 0: exec: /prod/OPENFOAM1.7.1/OpenFOAM-1.7.1-SGI/OpenFOAM-1.7.1/bin/foamExec/usr/local/bin/bash: cannot execute: No és un directori [Not a directory]
MPI: could not run executable (case #4)
Matat [Killed]
ppuigdom@pirineus:/tmp/ppuigdom/snappy24_2/snappy24> `which mpirun` -np 2 `which bash` -c "ls -l"
total 12
drwxr-x--- 2 ppuigdom cesca 4096 16 jun 11:17 0
drwxr-x--- 4 ppuigdom cesca 4096 16 jun 11:17 constant
-rw-r----- 1 ppuigdom cesca 0 11 jul 09:49 log
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:21 processor0
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor1
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor2
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor3
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor4
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor5
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor6
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor7
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor8
drwxr-x--- 4 ppuigdom cesca 41 4 jul 10:22 processor9
drwxr-x--- 2 ppuigdom cesca 4096 16 jun 11:17 system
MPI: could not run executable (case #3)
MPI: No details available, no log files found
Matat [Killed]
pere is offline   Reply With Quote

Old   July 11, 2011, 07:28
Default
  #20
Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 7,450
Blog Entries: 33
Rep Power: 73
wyldckat is a jewel in the roughwyldckat is a jewel in the roughwyldckat is a jewel in the roughwyldckat is a jewel in the rough
I'm so sorry, but the online text editor seems to have eaten the spaces between the backticks. Either that, or I didn't type them properly...

Here are the first two commands again, fixed:
  • Code:
    `which mpirun` -np 2 `which foamExec` bash -c "ls -l"
  • Code:
    `which mpirun` -np 2 `which foamExec` `which bash` -c "ls -l"
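As a sketch of why that space matters (hypothetical paths, just for illustration): without it, the shell concatenates two adjacent command substitutions into a single token, which is then looked up as one nonexistent path, exactly as in the foamExecbash errors above.

```shell
# Two command substitutions, standing in for `which foamExec` and
# `which bash` from the commands above.
a=`echo /usr/bin/env`
b=`echo true`
echo "$a$b"      # no space: one token, /usr/bin/envtrue
echo "$a" "$b"   # with space: two tokens, /usr/bin/env true
```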
wyldckat is offline   Reply With Quote
