Problem running chtMultiRegionFoam in parallel

#1 | June 17, 2013, 05:34
Giancarlo_IngChimico (Giancarlo) | New Member | Milan | Joined: Apr 2013 | Posts: 18
Hi FOAMers,
I have developed a new solver that can handle multi-region meshes. It has the same architecture as chtMultiRegionFoam, but it also solves the material and energy balances for reactive systems.
When I run a simulation in parallel I see a strange problem: the first time step completes without any issue, but the solver then crashes when it reaches the code included from compressibleMultiRegionCourantNo.H.
This is very strange: why should there be a problem when it only has to repeat the same operations as the first time step?
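
For reference, compressibleMultiRegionCourantNo.H essentially loops over the fluid regions and keeps the largest Courant number computed from each region's stored flux and density fields. Below is a minimal, self-contained C++ sketch of that access pattern (the names, formula and numbers are purely illustrative, not the actual OpenFOAM source). The point is that the loop only re-reads fields that already exist, so a crash there at the second time step usually means one of those stored fields is no longer valid, not that the loop is doing anything new.

Code:
#include <algorithm>
#include <cmath>
#include <iostream>
#include <vector>

// Stand-in for one fluid region's stored fields (illustrative only).
struct FluidRegion
{
    std::vector<double> phi;   // face volumetric fluxes
    std::vector<double> V;     // cell volumes
};

// Stand-in for the per-region Courant number: max(|phi|/V)*deltaT.
double courantNo(const FluidRegion& region, double deltaT)
{
    double co = 0.0;
    for (std::size_t i = 0; i < region.phi.size(); ++i)
    {
        co = std::max(co, std::abs(region.phi[i])/region.V[i]*deltaT);
    }
    return co;
}

int main()
{
    const std::vector<FluidRegion> fluidRegions
    {
        {{0.02, 0.05}, {1e-6, 1e-6}},   // region 0 (made-up numbers)
        {{0.01, 0.03}, {2e-6, 2e-6}}    // region 1 (made-up numbers)
    };
    const double deltaT = 1e-5;

    // The same loop runs at every time step; it only *reads* fields that
    // were created at start-up, so it should behave identically each step.
    for (int step = 1; step <= 2; ++step)
    {
        double coNum = 0.0;
        for (const FluidRegion& region : fluidRegions)
        {
            coNum = std::max(coNum, courantNo(region, deltaT));
        }
        std::cout << "step " << step << ": max Courant number = "
                  << coNum << "\n";
    }
    return 0;
}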

Can anyone help me?

Thanks

Best regards

Giancarlo

#2 | June 18, 2013, 08:08
Giancarlo_IngChimico (Giancarlo) | New Member | Milan | Joined: Apr 2013 | Posts: 18
I'm posting the error below (note that the backtraces from MPI ranks [1] and [2] are interleaved in the output).

Can anyone help me understand the nature of this error?

Thanks

Giancarlo

Code:
[1] [2] #0  #0  Foam::error::printStack(Foam::Ostream&)Foam::error::printStack(Foam::Ostream&) in "/home/OpenFOAM/OpenFOAM-2.1.x/platfo in "/home/cfduser1/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #rms/linux64GccDPOpt/lib/libOpenFOAM.so"1  Foam::sigSegv::sigHandler(int)
[2] #1  Foam::sigSegv::sigHandler(int) in "/home/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #2   in "/home/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #2  __restore_rt__restore_rt at sigaction.c:0
[2] #3   at sigaction.c:0
[1] #3  mainmain in "/home/GentileG/gianca/Run/parallel/multiFinal_test_parallel"
[1] #4  __libc_start_main in "/home/GentileG/gianca/Run/parallel/multiFinal_test_parallel"
[2] #4  __libc_start_main in "/lib64/libc.so.6"
[1] #5   in "/lib64/libc.so.6"
[2] #5  Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) constFoam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const in "/home/GentileG/gianca/Run/parallel/multiFinal_test_parallel"
[compute:18709] *** Process received signal ***
[compute:18709] Signal: Segmentation fault (11)
[compute:18709] Signal code:  (-6)
[compute:18709] Failing at address: 0x623b00004915
[compute:18709] [ 0] /lib64/libc.so.6 [0x367c230280]
[compute:18709] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x367c230215]
[compute:18709] [ 2] /lib64/libc.so.6 [0x367c230280]
[compute:18709] [ 3] multiFinal_test_parallel [0x456912]
[compute:18709] [ 4] /lib64/libc.so.6(__libc_start_main+0xf4) [0x367c21d974]
[compute:18709] [ 5] multiFinal_test_parallel(_ZNK4Foam11regIOobject11writeObjectENS_8IOstream12streamFormatENS1_13versionNumberENS1_15compressionTypeE+0x151) [0x4204a9]
[compute:18709] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 18709 on node compute-3-11.local exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
[compute.local:18707] 2 more processes have sent help message help-mpi-btl-base.txt / btl:no-nics
[compute.local:18707] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
vi log 
[1]+  Exit 139                mpirun -np 3 multiFinal_test_parallel -parallel > log

#3 | August 5, 2013, 09:56
mbay101 (M Bay) | New Member | Germany | Joined: Jun 2013 | Posts: 10
Hello Giancarlo,

I'm having exactly the same problem. I noticed in my simulation that the temperature values in the air region are too high. At the second time step I get the same error that you have posted. It seems that OpenFOAM has trouble when it tries to calculate h in the fluid region.

You could try to run the case serially, without decomposing it.
If you get your case working, please let me know how you did it.

Regards

#4 | August 25, 2013, 07:50
wyldckat (Bruno Santos) | Super Moderator | Lisbon, Portugal | Joined: Mar 2009 | Posts: 8,251
Greetings to all!

mbay101's problem is being addressed here: http://www.cfd-online.com/Forums/ope...egionfoam.html


@Giancarlo: The key issue is a bad memory access:
Quote:
Code:
Foam::sigSegv::sigHandler
Where SIGSEGV is further explained here: http://en.wikipedia.org/wiki/SIGSEGV

From your description, it looks like at least one field/array is being destroyed once the first iteration has finished, so the next time step accesses memory that is no longer valid.
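
Below is a minimal, self-contained C++ sketch of that failure mode (an assumed scenario for illustration, not Giancarlo's actual solver code): a raw handle to a field is cached once and reused every time step, but the underlying object is destroyed and rebuilt at the end of the first step, so the first iteration works and the second one reads freed memory.

Code:
#include <iostream>
#include <memory>
#include <vector>

// Stand-in for a per-region field (illustrative only).
struct Field
{
    std::vector<double> values;
};

int main()
{
    auto field = std::make_unique<Field>();
    field->values = {300.0, 305.0, 310.0};   // e.g. a temperature field

    // The "solver" caches a raw pointer to the field once, at start-up.
    const Field* cached = field.get();

    for (int step = 1; step <= 2; ++step)
    {
        // The cached handle is used every step: fine while the original
        // object is still alive (step 1); once it has been destroyed
        // (step 2) this is undefined behaviour, typically a SIGSEGV.
        std::cout << "step " << step
                  << ": first cell value = " << cached->values[0] << "\n";

        // By mistake the field is destroyed and re-created at the end of
        // the step; 'cached' still points at the destroyed object.
        field = std::make_unique<Field>();
        field->values = {300.0, 305.0, 310.0};
    }
    return 0;
}
In OpenFOAM code the same symptom often comes from keeping a reference to the contents of a tmp<> field after its storage has been released; rebuilding the solver in Debug mode or running it under valgrind will usually point straight at the offending access.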

Best regards,
Bruno
