
parallel PHOENICS


Old   September 16, 2000, 04:44
Default parallel PHOENICS
  #1
George Bergantz
Guest
 
Posts: n/a
This discussion group is a terrific idea- let's make it work.

I wanted to share my experience with parallel PHOENICS 3.1 and invite discussion on that topic. My interests are in geophysical flows, natural convection, lava, multiphase reacting flow and so on.

I have an 8-node Linux Pentium II (400 MHz), fast ethernet cluster built for me by Paralogic. Starting about a year ago they worked with CHAM to ensure that the LAM MPI libraries were correctly installed and that the PHOENICS executable was working. This is not quite a turn-key operation.

But the performance has been very impressive, with linear speed-up. So for simulations that don't require a lot of extensive GROUND coding, it was pretty simple to reap solid returns. However, GROUND coding is complicated, requiring some knowledge of how MPI works; in addition, the whole-field solver must be used (the parallel domain splitting takes place in the Z-direction, out of slab). Thus the use of GETYX, SETYX and any other in-slab niceties is moot.

We also uncovered a bug related to multiphase scalar transport in the whole-field solver that awaits final resolution. There are also a number of other newer features that don't lend themselves to parallel implementation, such as PLANT and the Lagrangian particle-tracking routine.

But I am happy with the direction and efforts that CHAM has made so far. Who else out there is doing parallel? I would benefit from your input.

I can answer other questions as needed.

George Bergantz, Dept. of Geological Sciences, University of Washington, Seattle, WA

Old   September 20, 2000, 16:48
Default Re: parallel PHOENICS
  #2
Steven Beale
Guest
 
Posts: n/a
We are running parallel PHOENICS 3.2 on a 40-node Linux cluster using 550 MHz Pentium III processors with Dolphin Interconnect cards. Some parts of the code needed to be recompiled and linked to incorporate the MPI libraries. Preliminary tests have shown significant, but not linear, speed-ups - around 10 times speed-up for 20 processors. However, these are very early results and we are working on improving performance. Both CHAM and Dolphin have indicated their willingness to help optimize the configuration with us. There are some minor incompatibilities between the VR interface on the Linux O/S and the (newer) Windows version; meshes generated with the latter cannot subsequently be modified.
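For rough comparison (standard definitions and my own back-of-the-envelope arithmetic, not figures from our benchmarks): defining

  S(p) = T(1) / T(p)     (speed-up on p processors)
  E(p) = S(p) / p        (parallel efficiency)

a 10x speed-up on 20 processors corresponds to E(20) = 10/20 = 50%, whereas a near-linear result such as George reports on 8 processors would be close to E(8) = 100%.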


Old   September 21, 2000, 02:04
Default Re: parallel PHOENICS
  #3
George Bergantz
Guest
 
Posts: n/a
Steve-

Our nearly linear speed-up may be a result of only having 8 processors. I have also noticed that large jobs give larger relative speed-ups. Of course it always comes down to a combination of processor speed and network latency. I use fast Ethernet, which is slower than Dolphin but much cheaper. Your run times on 8 processors are no doubt faster than mine.
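(A crude way to see why bigger jobs scale better - the textbook argument, not something I have measured on my cluster: if a fraction s of the work per sweep is effectively serial or spent in communication, the classical Amdahl estimate of the speed-up is

  S(p) = 1 / ( s + (1 - s)/p )

and s tends to shrink as the grid grows, because the compute work per node grows faster than the message traffic.)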

Have you tried to do any ground coding yet? Please report on that.

One problem I have is that my system will hang occasionally. I have not been able to resolve the source of this. As a result, large jobs are done with frequent restarts to avoid losses when it hangs. I have heard that older versions of the LAM MPI libraries have problems, and am anxiously waiting for an upgrade. I have also had overheating problems, but this is not a PHOENICS issue.

What kind of cluster management software do you have? What version of Linux are you running?

Are you doing any multiphase work?

By the way, the extensive discussion on variable viscosity that I generated in the main forum was the result of realizing, many years ago, that PHOENICS did not have the full variable viscosity problem formulated correctly. We studied the discretization equations as given in the detailed documentation and decided that it was not fully implemented. Upon talking with many others in mechanical engineering and geophysics, I discovered that it was a common occurrence. I should bring this up with CHAM.

Old   September 25, 2000, 21:12
Default Re: parallel PHOENICS
  #4
Greg Perkins
Guest
 
Posts: n/a
Hi George,

I'm not a user of Phoenics, but am using Fluent and added my own generalised module, in C, for computing heat, mass and species source terms for a chemical reaction between a multi-component solid and fluid within an Eulerian two-phase multi-component model. I am using this for coal combustion/gasification for in-situ underground coal gasification processes. I have parallelised the code with OpenMP.
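In case it helps to see what I mean by "parallelised with OpenMP": the heart of it is just a loop over cells computing the local reaction rate and the corresponding heat and mass sources. The following is a stripped-down, self-contained sketch with made-up names and constants, not my actual Fluent module (the UDF plumbing is left out entirely):

#include <math.h>
#include <stdio.h>
#include <stdlib.h>
#include <omp.h>

#define NCELLS 100000
#define A_PRE   1.0e6   /* pre-exponential factor (made-up)    */
#define E_ACT   1.1e5   /* activation energy, J/mol (made-up)  */
#define R_GAS   8.314   /* gas constant, J/(mol K)             */
#define DH_RXN  2.5e7   /* heat of reaction, J/kg (made-up)    */

int main(void)
{
    double *T    = malloc(NCELLS * sizeof(double)); /* temperature, K       */
    double *rhoY = malloc(NCELLS * sizeof(double)); /* fuel partial density */
    double *smas = malloc(NCELLS * sizeof(double)); /* mass source, kg/m3/s */
    double *sht  = malloc(NCELLS * sizeof(double)); /* heat source, W/m3    */

    for (int i = 0; i < NCELLS; i++) {  /* dummy initial field */
        T[i]    = 800.0 + 400.0 * i / NCELLS;
        rhoY[i] = 50.0;
    }

    /* Each cell's source terms depend only on that cell's state,
       so the loop parallelises trivially across threads. */
    #pragma omp parallel for
    for (int i = 0; i < NCELLS; i++) {
        double rate = A_PRE * exp(-E_ACT / (R_GAS * T[i])) * rhoY[i];
        smas[i] = -rate;          /* solid/fuel consumed            */
        sht[i]  =  rate * DH_RXN; /* heat released to the gas phase */
    }

    printf("cell 0: mass source %g, heat source %g\n", smas[0], sht[0]);
    free(T); free(rhoY); free(smas); free(sht);
    return 0;
}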

It seems to be working well, but I was wondering if it's possible to do something like this in Phoenics, and what the pros and cons might be. In particular, because of the large time and length scales, I'd like to be able to solve for steady gas flow but transient heat and mass transfer at each time-step - can you dynamically turn the solving of the various equations on and off?

Thanks

Greg

Old   September 26, 2000, 02:42
Default Re: parallel PHOENICS
  #5
George Bergantz
Guest
 
Posts: n/a
This sounds like a very complex simulation- I presume that it is turbulent- and including the parallel issues makes for a complicated code. Congratulations on the progress you've made so far!

At this level of model complexity and sophistication, I urge you to contact the PHOENICS team directly for a concrete answer. The generality of the "source term approach" of the finite-volume method would suggest that almost anything can be modeled, but in fact multiphase reaction, especially if the reaction involves products of very different density, takes work to get convergence. Model validation and verification are also difficult, especially if the second phase feeds back into the turbulence. But I don't believe these are especially PHOENICS issues- the differences in the way that turbulence is modelled, as well as the simple form of the phase momentum and heat transfer laws, pretty much produce the same problems for any code. Here is what I know:

A study of the phase distribution and turbulence intensity in non-reacting two-phase pipe flow (Eulerian/Eulerian) using the PHOENICS code leads to discrepancies of 10% to 30% in phase volume fraction near pipe walls and a discrepancy of up to 25% in turbulence intensity. This may be acceptable for your design parameters.

The pros:

1) The common interphase drag and heat transfer laws are already included and easy to implement. In fact there are a number of libraries that treat common instances of industrial multiphase reaction. One then adds one's own geometry and so on to these built-in source terms. I believe that PHOENICS has been used in studies of combustion such as you describe.

2) The feature of a "shadow phase" makes it easier to relax the standard requirement that the dispersed phase have the same diameter everywhere. Hence one can have a different drag as combustion continues with locally different dispersed-phase sizes. The only other way to do this is to go to some kind of Lagrangian method.

3) I believe sources for lift forces have now been added.

Cons:

1) The description of the viscosity of the dispersed phase (the drag of the phase on itself) does not (to my knowledge) represent any of the common constitutive relations that are often invoked in the literature. You have to code your own, which is not hard but is messy in a parallel implementation.

2) The requirement that the phases share the same pressure might be problematic if you have dispersed phase gathering and dispersal.

3) The parallel version has a problem with the multiphase heat transfer that has not been resolved.

4) The phase volume fraction equation is not higher-order, and numerical diffusion leads to smearing of the phase volume fraction that may be a problem in a combustion-type simulation.

One of the features of PHOENICS that savvy users like is the significant degree of control that is available. Are you asking whether one can have a steady dynamic template (velocity field) on which a heat transfer problem is then solved- essentially making the velocity a kinematic prescription? This is commonly done in the LES simulation of multiphase flow.

I've never tried it in PHOENICS as all my applications are tightly coupled and fully time-dependent. Perhaps some other reader can speak to this issue.

Old   September 26, 2000, 04:30
Default Re: parallel PHOENICS
  #6
Greg Perkins
Guest
 
Posts: n/a
Thanks for your response....

Yes, this is an interesting and complex problem. The process has many different time and length scales - on one hand there are the reactions, flow and porous-solid description at small scale (cm or less, ms or less), but I really want to solve for a large domain, 10 m x 20 m x 20 m, over a long time scale (>1 month)! Obviously there's a need for serious simplification, otherwise it's not computable.

To get around this, I have so far solved the problem as a series of steady-state solutions, then used the results to generate a new case from the previous solution. Essentially I assume the system is quasi-steady, which is reasonable but not strictly correct. I am currently attempting to relax this a bit by assuming only that the flow is steady, and then still solving transient heat & mass transfer with the computed flow solution at that time. This prompted my question regarding the capabilities of Phoenics.
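Schematically, the loop I am after looks like the following (a sketch only; solve_steady_flow() and advance_heat_and_mass() are placeholders for whatever the actual solver calls would be, and the time-step logic is simplified):

#include <stdio.h>

/* Placeholder: converge the gas flow field with the current properties. */
static void solve_steady_flow(double t) { (void)t; /* solver call here */ }

/* Placeholder: advance temperature and species over dt with the flow frozen. */
static void advance_heat_and_mass(double t, double dt) { (void)t; (void)dt; }

int main(void)
{
    const double t_end  = 30.0 * 24.0 * 3600.0;  /* ~1 month of process time  */
    const double dt_out = 6.0 * 3600.0;          /* macro step: re-solve flow */
    const double dt_ht  = 60.0;                  /* inner step for heat/mass  */

    for (double t = 0.0; t < t_end; t += dt_out) {
        solve_steady_flow(t);                    /* quasi-steady flow field   */
        for (double s = 0.0; s < dt_out; s += dt_ht)
            advance_heat_and_mass(t + s, dt_ht); /* transient scalars         */
        printf("completed macro step ending at t = %g s\n", t + dt_out);
    }
    return 0;
}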

I should mention that I use an Eulerian method and assume a stationary porous solid phase - it's like a block of coal, not a coal particle entrained in the flow as in most other applications.

Regarding turbulence - it is there, but I am finding I need high resolution to get converged solutions. When I scale up to large models this resolution is likely to be prohibitive, so I am thinking about using a laminar solution (like all previous models for this) and empirically adding the effect on heat and mass transport and reaction...

Validation is coming up - results seem to be order-of-magnitude at the moment, but I need to do more detailed work on this.

Initially I thought I could solve this problem without such simplifications, but even with a 20-processor machine it's actually not possible.

Thanks for your response,

Regards

Greg Perkins


Old   September 26, 2000, 10:29
Default Re: parallel PHOENICS
  #7
Steven Beale
Guest
 
Posts: n/a
George:

We are a little disappointed regarding the speed-up of the Dolphin cards over fast Ethernet (in view of the costs), maybe 2x. Dolphin are supposedly working with CHAM on this. Ron Jerome will be doing more extensive benchmarks for a talk in a few weeks, and we shall report these when available. It would be good to compare the parallel performance of PHOENICS with other codes, e.g. FLUENT. That would be a good benchmark exercise for a CFD conference (interested?). We are also looking at parallelizing pre-existing visualisation codes to run on the Beowulf, and also across the net. Plans are to grow the farm to 120 nodes in due course. Do you know if PHOENICS is installed on any of the really big parallel installations, e.g. in the Top 500?

We have not yet done any GROUND coding, though I have one project with PLANT coding that I should like to attempt to port to the cluster. Any comments?

Our system also has problems: a couple of the nodes keep going down. It's not clear why yet.

We use the SCALI user interface, and Red Hat 6.1 (but with the kernel upgraded to version 2.2.14).

We are not using IPSA or GENTRA at the moment. Wei Dong is modelling fuel cells with PHOENICS on the farm: although there are two "fluids", thus far we just modify the properties of a single phase, i.e. treat them as though they do not intermingle from the momentum point of view. It will be interesting to see if MUSES can be parallelized, and how.

On the variable viscosity issue: we did a multi-phase viscous/plastic (Mohr-Coulomb) flow a year or so ago. One of the conclusions we came to was that the rheology was very complex, i.e. there was no "right" constitutive closure for the system. Even for a plain old Newtonian fluid, Stokes' assumption is pretty dubious! So before you invest a lot of time in coding (or perhaps afterwards) you might want to ask yourself whether the rheology you are proposing for your system is in fact sufficiently accurate for engineering purposes. It is probably best to plot principal stresses (or at least pressure) and see if there are any significant changes as a function of the closure chosen.

Regards

Steven


Old   September 26, 2000, 12:16
Default Re: parallel PHOENICS
  #8
George Bergantz
Guest
 
Posts: n/a
Steve:

Your comments re the Dolphin product are interesting. 2x is not much of a speed-up, all things considered. Your farm sounds like a real operation! Interested in modeling volcanic eruptions? My farm is pretty small for such an effort.

I don't know of any users of parallel PHOENICS other than you- by this I mean distributed computing over a network. I hope CHAM encourages others to share their experiences here.

I have used IPSA on my parallel machine with no problems other than the fact that the phase volume fractions (R1, R2) are not higher-order and suffer numerical diffusion, and that the between-phase scalar transport is not working with the whole-field solver (the default in parallel). I have not heard from CHAM regarding progress toward fixing this, so multiphase problems with heat or mass transfer should not be attempted in parallel until it is fixed.

The use of PLANT will require that the appropriate MPI commands are included. If someone could make that aspect "transparent" to the end user it would no doubt increase interest in the parallel product. One drawback of the parallel implementation is that the end user has to get involved in the parallel coding issues, or at least be mindful of them.

GENTRA is another matter- I suppose that would be tricky, as the parallel domain splitting would require that particles be tracked as they pass from one processor to another. The group here in astronomy that models galaxy collisions tries to do this (not with PHOENICS) and it gives them fits as the Lagrangian particles move between processors within a given time step, etc.
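To make the bookkeeping concrete, here is a toy MPI sketch (entirely my own, one double per particle, nothing to do with how GENTRA is actually coded) of the exchange that would have to happen every time step when the domain is split into z-slabs: each processor works out which slab each of its particles has moved into, and the particle data then have to be shipped to the new owners.

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

#define NP_LOCAL 1000      /* particles initially owned by each rank   */
#define ZMAX     1.0       /* domain extent in z (the split direction) */

int main(int argc, char **argv)
{
    int rank, nprocs;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    double slab = ZMAX / nprocs;            /* each rank owns one z-slab */
    double *z = malloc(NP_LOCAL * sizeof(double));
    srand(rank + 1);
    for (int i = 0; i < NP_LOCAL; i++)      /* stand-in for advection: random z */
        z[i] = ZMAX * rand() / (double)RAND_MAX;

    /* Count how many particles now belong to each rank's slab. */
    int *sendcnt = calloc(nprocs, sizeof(int));
    for (int i = 0; i < NP_LOCAL; i++) {
        int dest = (int)(z[i] / slab);
        if (dest >= nprocs) dest = nprocs - 1;
        sendcnt[dest]++;
    }

    /* Pack particles grouped by destination rank. */
    int *sdispl = calloc(nprocs, sizeof(int));
    for (int r = 1; r < nprocs; r++)
        sdispl[r] = sdispl[r - 1] + sendcnt[r - 1];
    double *sendbuf = malloc(NP_LOCAL * sizeof(double));
    int *fill = calloc(nprocs, sizeof(int));
    for (int i = 0; i < NP_LOCAL; i++) {
        int dest = (int)(z[i] / slab);
        if (dest >= nprocs) dest = nprocs - 1;
        sendbuf[sdispl[dest] + fill[dest]++] = z[i];
    }

    /* Every rank tells every other rank how many particles are coming. */
    int *recvcnt = calloc(nprocs, sizeof(int));
    MPI_Alltoall(sendcnt, 1, MPI_INT, recvcnt, 1, MPI_INT, MPI_COMM_WORLD);

    int *rdispl = calloc(nprocs, sizeof(int));
    int ntotal = recvcnt[0];
    for (int r = 1; r < nprocs; r++) {
        rdispl[r] = rdispl[r - 1] + recvcnt[r - 1];
        ntotal += recvcnt[r];
    }

    /* Ship the particle data to the owning slabs. */
    double *recvbuf = malloc((ntotal > 0 ? ntotal : 1) * sizeof(double));
    MPI_Alltoallv(sendbuf, sendcnt, sdispl, MPI_DOUBLE,
                  recvbuf, recvcnt, rdispl, MPI_DOUBLE, MPI_COMM_WORLD);

    printf("rank %d now owns %d particles\n", rank, ntotal);

    free(z); free(sendbuf); free(recvbuf);
    free(sendcnt); free(sdispl); free(fill); free(recvcnt); free(rdispl);
    MPI_Finalize();
    return 0;
}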

I am embarrassed to admit that I'm still running Red Hat 5.1. I am way overdue for an upgrade but wanted to do that when the new & improved (with fixed multiphase heat transfer) PHOENICS is released. I'll upgrade my LAM MPI libraries at that time as well; maybe that will fix the hanging problem.

Your comments on constitutive equations are wise indeed. One of the most vexing things about geological materials is that their viscosity can vary over many orders of magnitude in a single problem, as a function of both temperature and concentration, and can also be non-Newtonian. Ugh. PHOENICS performs best at moderate Reynolds number, and in time-dependent simulations where the Reynolds number starts at zero, reaches some maximum and then returns to zero, convergence is a moving target. Lots of trial and error is required.

Thanks for the update and I hope to hear more,

gb

Old   November 16, 2000, 09:29
Default Re: parallel PHOENICS
  #9
Steven Beale
Guest
 
Posts: n/a
I've posted a jpg file of our latest experience with parallel PHOENICS at http://www.icpet.nrc.ca/beale/pllelph.jpg. Basically we see reasonably linear speed-up using both fast Ethernet and Dolphin/Scali interconnects up to about 16 processors. Beyond that, the additional investment in the Scali cards appears to be paying off, with a significant drop-off in performance observed for the fast Ethernet interconnects. The main point, though, is that we can now do problems of sizes not possible on conventional machines, because the problem is broken up into smaller problems of manageable size (i.e. memory).
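As illustrative arithmetic only (not our actual storage figures): a 2-million-cell problem storing 50 double-precision variables per cell needs roughly 2e6 x 50 x 8 bytes, about 800 MB, which does not fit on a single one of our nodes but is only about 50 MB per node when split across 16.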

Old   November 16, 2000, 15:57
Default Re: parallel PHOENICS
  #10
George Bergantz
Guest
 
Posts: n/a
Thanks Steve, that is interesting and very useful, as many users will probably build systems with 16 or fewer nodes.

Please share more of your experience and expertise as time allows; I sure benefit from it.
