Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Running, Solving & CFD

LTSInterFoam


November 2, 2011, 08:02   #1
pere
Member
Join Date: Mar 2011
Location: Barcelona
Posts: 46
I am running a 25e6-cell case with LTSInterFoam, and I get this error only sometimes: with more than 128 CPUs or with fewer than 2. With 4, 8, 16, 32, or 64 CPUs it runs without problems.

MPI: #15 0x000000000042fb22 in main ()
MPI: (gdb) A debugging session is active.
MPI:
MPI: Inferior 1 [process 1194490] will be detached.
MPI:
MPI: Quit anyway? (y or n) [answered Y; input not from terminal]
MPI: Detaching from program: /proc/1194490/exe, process 1194490

MPI: -----stack traceback ends-----
MPI: On host pirineus, Program /prod/OF20/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/bin/LTSInterFoam, Rank 1, Process 1194490: Dumping core on signal SIGFPE(8) into directory /tmp/ppuigdom/WG_2_25M/2/wigleyHull
MPI: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
MPI: aborting job
MPI: Received signal 8


Does anyone know what the error could be?
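(Not part of the original post, but for context: a SIGFPE that appears only at particular processor counts is often tied to how the mesh is decomposed, e.g. a rank ending up with very few or degenerate cells. A minimal, hypothetical `system/decomposeParDict` sketch using a graph-based decomposition; the subdomain count and method are illustrative assumptions, not values taken from this case:)

```
// Hypothetical system/decomposeParDict -- numberOfSubdomains and method
// are assumptions for illustration, not settings from this thread.

numberOfSubdomains 128;

// scotch partitions by the cell-connectivity graph, which tends to avoid
// the empty or sliver sub-domains that fixed slab decompositions can
// produce at high processor counts.
method          scotch;
```

Rerunning `decomposePar` with this dictionary and checking the reported cell count per processor (no rank should be near zero cells) is one way to rule decomposition in or out as the cause.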

