[foam-extend.org] foam-extend-3.2 on supercomputer

February 22, 2016, 00:44   #1
foam-extend-3.2 on supercomputer
Ripudaman Manchanda
Member | Join Date: May 2013 | Posts: 55
Dear FOAMers,

I have been attempting to install foam-extend-3.2 on our supercomputer here in Texas for a month now. I used to have a working installation of foam-extend-3.1 on the same machine. With the correct configuration I am able to compile foam-extend-3.2 with the Intel compilers and both the IMPI and MVAPICH2 MPI stacks. However, most of the tutorials I am interested in segfault at run time when I run them in parallel. In serial they run fine, and in Debug mode they also run fine; the problem appears only in Opt mode. The seg fault with the IMPI stack looks like this:
Code:
Time = 1


Predicting U, gradU and snGradU based on V,gradV and snGradV

DICPCG:  Solving for Ux, Initial residual = 1, Final residual = 0.0848851, No Iterations 66
PCG:  Solving for Uy, Initial residual = 0, Final residual = 0, No Iterations 0

===================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   PID 115276 RUNNING AT c557-402
=   EXIT CODE: 139
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
APPLICATION TERMINATED WITH THE EXIT STRING: Segmentation fault (signal 11)
TACC: MPI job exited with code: 139
 
TACC: Shutdown complete. Exiting.
The seg fault with the MVAPICH2 stack looks like this:
Code:
Predicting U, gradU and snGradU based on V,gradV and snGradV
DICPCG:  Solving for Ux, Initial residual = 1, Final residual = 0.0848851, No Iterations 66
PCG:  Solving for Uy, Initial residual = 0, Final residual = 0, No Iterations 0
[c559-904.stampede.tacc.utexas.edu:mpi_rank_0][error_sighandler] Caught error: Segmentation fault (signal 11)
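For reference, this is roughly how I configure the build and switch between Opt and Debug. This is only a sketch of the usual foam-extend environment switches, not my prefs.sh verbatim; the install path and the WM_MPLIB value are placeholders.
Code:
# ~/foam/foam-extend-3.2/etc/prefs.sh (sketch; exact values are assumptions)
export WM_COMPILER=Icc            # Intel compilers
export WM_MPLIB=SYSTEMOPENMPI     # placeholder: set to whichever loaded MPI stack is used (IMPI or MVAPICH2)

# optimised build (the one that segfaults in parallel)
export WM_COMPILE_OPTION=Opt
source ~/foam/foam-extend-3.2/etc/bashrc
cd $WM_PROJECT_DIR && ./Allwmake > log.Allwmake 2>&1

# debug build (this one runs fine)
export WM_COMPILE_OPTION=Debug
source ~/foam/foam-extend-3.2/etc/bashrc
cd $WM_PROJECT_DIR && ./Allwmake > log.AllwmakeDebug 2>&1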
I have tried getting expert help at the supercomputing center, but to no avail. The cluster administrators have recommended that I contact OpenFOAM experts to resolve this issue.

I have also discussed this issue with OpenFOAM developers in Ireland who were able to use Intel compilers to install and run foam cases on their computer architecture.

I am using the Stampede supercomputer at the Texas Advanced Computing Center (TACC). The Stampede system is a Dell Linux Cluster based on 6400+ Dell PowerEdge server nodes, each outfitted with 2 Intel Xeon E5 (Sandy Bridge) processors and an Intel Xeon Phi Coprocessor (MIC Architecture).

Additional info can be found here: https://portal.tacc.utexas.edu/user-guides/stampede

Can someone please provide some help regarding this?

Thank you.
Regards,
Ripu

March 2, 2016, 12:16   #2
Aditya Raman
New Member | Join Date: Oct 2015 | Posts: 3
Hi ripudaman, did you manage to find a solution to your problem? Did you try using the system gcc 4.9.1 and the ThirdParty OpenMPI?
I built foam-extend-3.2 on Stampede using the system gcc and the default ThirdParty OpenMPI. I did not test all the tutorials, but some of them, including the moving-cylinder immersed boundary one, ran fine in parallel. I also managed to run a couple of my own cases. However, I recently ran into communication issues with the ThirdParty OpenMPI, so I was planning to try rebuilding with the Intel compilers and Intel MPI.
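Roughly how I set that up, in case you want to compare. This is a sketch; the variable values are the standard foam-extend switches, not copied verbatim from my prefs.sh, and the build script names may differ slightly in your checkout.
Code:
# etc/prefs.sh sketch (values are assumptions)
export WM_COMPILER=Gcc      # system gcc 4.9.1
export WM_MPLIB=OPENMPI     # build and use the ThirdParty OpenMPI

source ~/foam/foam-extend-3.2/etc/bashrc
cd $WM_PROJECT_DIR
# build the ThirdParty packages (including OpenMPI) first, then the libraries and solvers
( cd ThirdParty && ./AllMake > log.AllMake 2>&1 )
./Allwmake > log.Allwmake 2>&1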
Here's the error I got with the ThirdParty OpenMPI:

Code:
Selecting dynamicFvMesh dynamicRefineFvMesh
[c558-002.stampede.tacc.utexas.edu:46556] *** An error occurred in MPI_Bsend 
[c558-002.stampede.tacc.utexas.edu:46556] *** on communicator MPI_COMM_WORLD
[c558-002.stampede.tacc.utexas.edu:46556] *** MPI_ERR_BUFFER: invalid buffer pointer
[c558-002.stampede.tacc.utexas.edu:46556] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 46556 on
node c558-002.stampede.tacc.utexas.edu exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
---------------------------------------------------------------------
I think it's weird because things were running just fine about two days ago.
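For what it's worth, the MPI_ERR_BUFFER message comes from a buffered send (MPI_Bsend) running out of attached buffer space. As far as I know, foam-extend sizes that buffer from the MPI_BUFFER_SIZE environment variable in its Pstream layer (treat that as an assumption and check etc/settings.sh), so enlarging it before launching may be worth trying before rebuilding:
Code:
# enlarge the buffered-send buffer (bytes; the value here is only an example)
export MPI_BUFFER_SIZE=200000000
# launch as usual; "mySolver" is a placeholder for whatever solver the case uses
mpirun -np 16 mySolver -parallel > log.mySolver 2>&1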

March 2, 2016, 12:59   #3
Had some luck with older ICC
Ripudaman Manchanda
Member | Join Date: May 2013 | Posts: 55
Hi,

I finally had some luck running foam-extend-3.2 on Stampede today by using an older Intel compiler (intel/13.0.2.146). I have not tested my own code yet though.

Your problems may be because of the OpenMPI you are using. I used the MV2MPI option that is explained at this link: https://portal.tacc.utexas.edu/software/openfoam.
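In case it helps, this is roughly the environment I load before sourcing the foam-extend bashrc. The module names and versions are from memory and may differ on your account, so treat them as assumptions and verify with "module spider".
Code:
# load the older Intel compiler and the MVAPICH2 stack (module names/versions are assumptions)
module load intel/13.0.2.146
module load mvapich2
# then source the foam-extend environment and build/run as usual
source ~/foam/foam-extend-3.2/etc/bashrc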

Regards,
Ripu
