Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Running, Solving & CFD

rhoCentralFoam solver with Slip BCs fails in Parallel Only


October 3, 2012, 00:15
rhoCentralFoam solver fails in Parallel Only (Possible Error In Processor Patch?)
New Member
Nishit Joseph
Join Date: Nov 2010
Posts: 29
NOTE: I amended the title to reflect a new development. The old title was 'rhoCentralFoam solver with Slip BCs fails in Parallel Only'.

Hi OpenFOAM users!
I am trying to solve the flow over a very small cylinder at near-transonic Mach numbers using rhoCentralFoam (OF version 2.1.0).

The issue is that the solver runs without any floating point exception as a single process, but when I decompose the mesh and run on 4 cores it fails with floating point errors.

Debugging the parallel run showed that the error occurred when mu was calculated using Sutherland's law, because T was negative. At the same time step, T is not negative on a single processor.

To help me debug further, I modified rhoCentralFoam to:
  1. Bound rho, U, rhoE and e using the boundMinMax library (from foam-extend)
  2. Print the min/max values of rho, mag(rhoU), mag(U), rhoE, e and T for more debug info
  3. Change the boundary slip conditions to a constant value

With the constant-value boundary on the wire wall, there is no problem running either single or parallel.

So, switching back to the slip boundary conditions: when running on a single processor, none of the fields hit the bounds (they are all reasonably valued). However, when running in parallel, very early in the run the values of the rho, rhoU and U fields are way off, by about six orders of magnitude!

I have attached my modified solver "boundedRhoCentralFoam" and the case I am investigating "wire".

The case has two initial conditions. The first (0) uses the slip/temperature-jump conditions at the wire's wall; the other uses constant wall conditions.

I believe the issue is with the BCs defined in the rhoCentralFoam solver:
  • smoluchowskiJumpTFvPatchScalarField
  • maxwellSlipUFvPatchVectorField
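For anyone trying to reproduce this, the wall patch entry in 0/U uses the maxwellSlipU type along these lines (parameter values here are illustrative, not the ones from my case; 0/T uses smoluchowskiJumpT similarly):

```
wire
{
    type                maxwellSlipU;
    accommodationCoeff  1.0;            // tangential momentum accommodation
    Uwall               uniform (0 0 0);
    thermalCreep        true;
    curvature           true;
    value               uniform (0 0 0);
}
```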

I have reached my limit debugging this problem; I would really appreciate any help or suggestions.



PS: My case file is a bit large (4 MB). Please download it from my Google Drive.
Attached Files
File Type: gz boundedRhoCentralFoam.tar.gz (10.0 KB, 26 views)

Last edited by JLight; October 4, 2012 at 01:33. Reason: Forgot to attach files

October 4, 2012, 01:30
Not Slip but Processor BC that is the issue?
Just an update on this front. After more investigation, it seems the boundaries created by the processor patches may well be the cause of the issue. I am not entirely sure why.

To illustrate this, I decomposed my mesh using Scotch. I also fixed deltaT to a very small value and made writeInterval equal to that value, so results are written at every time step. The solver gives a floating point error during the third time step when run on 4 cores. (I guess by now I do not have to state that on a single processor the solver continues without any error and the solution does seem to form.)
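For reference, the setup for this test was along these lines (the deltaT value is illustrative, not the exact one I used):

```
// system/decomposeParDict
numberOfSubdomains  4;
method              scotch;

// system/controlDict (excerpt): fixed step, write every step
adjustTimeStep      no;
deltaT              1e-9;
writeControl        runTime;
writeInterval       1e-9;
```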

Viewing the results in ParaView is really interesting: there is a slight variation at the processor patch region. Take a look at the images below. The first includes the part of the mesh that was on processor2; the second is without it. There is a change in T (shown, but also present in the other flow-field variables) at the boundary of the patches. Is this normal?

My understanding is that a processor patch should not influence the solution in any way. Why is it doing so in this case? Could this explain why there is no error on a single processor but an error in parallel, since the solution is getting distorted? Does this mean my earlier assumption that the slip BC is the issue is incorrect?

Any help is greatly valued.

Attached Images
File Type: jpg Paralel_Scotch_2.jpg (12.9 KB, 33 views)
File Type: jpg Paralel_Scotch_3.jpg (11.9 KB, 24 views)

October 11, 2012, 21:08
Fix for the issue
An update for the record.

I reported this as a bug against OF-2.1.x and was informed that the issue occurs when thermalCreep is turned on.

The OF bug manager noted that "The maxwellSlipU BC does a gradient calculation of T in case of thermalCreep and this overwrites the processor buffer values."

He also informed me that it will be fixed in the next major release.

The bug report number is 0000659.



