
Problem for parallel running chtMultiregionFoam

June 17, 2013, 05:34
Giancarlo_IngChimico — New Member
Join Date: Apr 2013 | Location: Milan | Posts: 21
Hi FOAMers,
I have developed a new solver that is able to handle multi-region meshes. It has the same architecture as chtMultiRegionFoam, but it also solves the material and energy balances for reactive systems.
When I launch a simulation in parallel I get a strange problem: the first time step runs with no issues, but then it crashes when it reaches the file compressibleMultiRegionCourantNo.H.
This is very strange: why should there be problems if it is just repeating the same operations as the first time step?

Can anyone help me?


Best regards


June 18, 2013, 08:08
Giancarlo_IngChimico — New Member
Join Date: Apr 2013 | Location: Milan | Posts: 21
I'm posting the error below.

Can anyone help me understand its nature?



[1] #0  Foam::error::printStack(Foam::Ostream&) in "/home/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/"
[1] #1  Foam::sigSegv::sigHandler(int) in "/home/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/"
[1] #2  __restore_rt at sigaction.c:0
[1] #3  main in "/home/GentileG/gianca/Run/parallel/multiFinal_test_parallel"
[1] #4  __libc_start_main in "/lib64/"
[1] #5  Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const in "/home/GentileG/gianca/Run/parallel/multiFinal_test_parallel"
[2] #0  Foam::error::printStack(Foam::Ostream&) in "/home/cfduser1/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/"
[2] #1  Foam::sigSegv::sigHandler(int) in "/home/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/"
[2] #2  __restore_rt at sigaction.c:0
[2] #3  main in "/home/GentileG/gianca/Run/parallel/multiFinal_test_parallel"
[2] #4  __libc_start_main in "/lib64/"
[2] #5  Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const in "/home/GentileG/gianca/Run/parallel/multiFinal_test_parallel"
[compute:18709] *** Process received signal ***
[compute:18709] Signal: Segmentation fault (11)
[compute:18709] Signal code:  (-6)
[compute:18709] Failing at address: 0x623b00004915
[compute:18709] [ 0] /lib64/ [0x367c230280]
[compute:18709] [ 1] /lib64/ [0x367c230215]
[compute:18709] [ 2] /lib64/ [0x367c230280]
[compute:18709] [ 3] multiFinal_test_parallel [0x456912]
[compute:18709] [ 4] /lib64/ [0x367c21d974]
[compute:18709] [ 5] multiFinal_test_parallel(_ZNK4Foam11regIOobject11writeObjectENS_8IOstream12streamFormatENS1_13versionNumberENS1_15compressionTypeE+0x151) [0x4204a9]
[compute:18709] *** End of error message ***
mpirun noticed that process rank 1 with PID 18709 on node compute-3-11.local exited on signal 11 (Segmentation fault).
[compute.local:18707] 2 more processes have sent help message help-mpi-btl-base.txt / btl:no-nics
[compute.local:18707] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
vi log 
[1]+  Exit 139                mpirun -np 3 multiFinal_test_parallel -parallel > log

August 5, 2013, 09:56
mbay101 (M Bay) — New Member
Join Date: Jun 2013 | Location: Germany | Posts: 10
Hello Giancarlo,

I'm having exactly the same problem. I noticed in my simulation that the temperature values in the air region are too high. At the second time step I get the same error you posted. It seems OpenFOAM runs into trouble when it tries to calculate h in the fluid region.

You could try running the case serially, without decomposing it.
If you get your case working, please let me know how you did it.


August 25, 2013, 07:50
wyldckat (Bruno Santos) — Super Moderator
Join Date: Mar 2009 | Location: Lisbon, Portugal | Posts: 10,036 | Blog Entries: 39
Greetings to all!

mbay101's problem is being addressed here:

@Giancarlo: The key issue is a bad memory access:
Where SIGSEGV is further explained here:

From your description, it looks like at least one field/array was destroyed when the iteration finished, so the next iteration reads memory that is no longer valid.

Best regards,

December 21, 2016, 04:26
manuc (Manu Chakkingal) — Senior Member
Join Date: Feb 2016 | Location: Delft, Netherlands | Posts: 113

I have a new solver with the structure of chtMultiRegionFoam that behaves like buoyantBoussinesqPimpleFoam in the fluid region. I ran the simulation in series and the solution converges. When I try to run it in parallel, it crashes at the first step.

Used OF 2.4.0.

I tried reducing the Courant number to attain initial stability, but then the solver crashes after the 2nd time step.


deltaT = 4.5530327e-107

--> FOAM Warning : 
From function Time::operator++()
in file db/Time/Time.C at line 1061
Increased the timePrecision from 62 to 63 to distinguish between timeNames at time 2.0707573e-07
Time = 2.07075734119138104400662664383858668770699296146631240844726562e-07

Solving for fluid region air
DILUPBiCG: Solving for T, Initial residual = 0.010403611, Final residual = 2.7324966e-12, No Iterations 1
max(T) [0 0 0 1 0 0 0] 300.02011
DICPCG: Solving for p_rgh, Initial residual = 1, Final residual = 0.0099236724, No Iterations 251
time step continuity errors : sum local = 5.0400286e-07, global = 1.0693134e-19
mpirun noticed that process rank 4 with PID 17152 on node n11-42 exited on signal 8 (Floating point exception).
(I tried different decomposition methods (simple and scotch), and also varied the simple coeffs (delta = 0.01), but to no avail.)

Problem solved: a temperature anomaly at the pressure reference cell.

Last edited by manuc; December 30, 2016 at 03:06.

