CFD Online (www.cfd-online.com)
Home > Forums > General Forums > Main CFD Forum

March 8, 2000, 22:18   #1
Help
Quelos (Guest)
I am using CFD software to run a simulation: a steady laminar model. I found it very strange that the software gives a slightly different solution every time the same case is repeated. Shouldn't the solution be identical for a steady-state run? Is this due to the PC's processor?

March 9, 2000, 09:52   #2
Re: Help
John C. Chien (Guest)
(1). The only time the results from a program are not exactly repeatable is when the code has some undefined variables, or it runs into a serious accuracy problem. (2). Run the code in double precision and check the results again. This should eliminate the accuracy problem. If the problem is still there, then you have bugs in the program. (3). Unless a random number function is used somewhere in the program, a good program should give you identical results each time you run it.
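John's double-precision check can be sketched in a few lines of Python. This is a hypothetical stand-in (a simple fixed-point iteration, with single precision emulated via the stdlib `struct` module), not anyone's actual solver: it shows that each precision is perfectly repeatable on its own, while the two precisions give slightly different answers.

```python
import struct

def to_f32(x):
    # Round a Python float (64-bit) to the nearest IEEE single-precision value.
    return struct.unpack('f', struct.pack('f', x))[0]

def relax(precision_fn, n=1000):
    # Toy fixed-point iteration x <- 0.5*(x + 2/x), converging to sqrt(2);
    # a stand-in for a steady-state solver's inner loop.
    x = precision_fn(1.0)
    for _ in range(n):
        x = precision_fn(0.5 * (x + 2.0 / x))
    return x

single = relax(to_f32)          # rounded to single precision each iteration
double = relax(lambda v: v)     # full double precision

# Both runs are bit-for-bit repeatable; only the precision differs.
assert relax(to_f32) == single
assert relax(lambda v: v) == double
assert single != double
```

The point of the sketch: lower precision changes the answer slightly, but it never makes the same run give two different answers.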

March 9, 2000, 11:44   #3
Re: Help
clifford bradford (Guest)
if you run the code twice on the same problem with the same grid, same boundary conditions, and same flow conditions, the answer should be the same. the only reason i can think of why your answer should vary is single precision, the presence of a random number generator in the code, or that you stopped the code at different levels of convergence (or continued a simulation). try double precision, restarting completely from scratch each time, using precisely the same bc, and stopping at exactly the same number of iterations.

March 9, 2000, 18:42   #4
Re: Help
Adrin Gharakhani (Guest)
I fail to understand why a single precision or half precision or no precision code would not be repeatable! It is impossible. A code will be repeatable whether there is a bug in it or not; you will just get the same bug over and over. This assumes that you operate the code under the same conditions (ALL conditions have to be the same, or all bets are off).

I agree with John & Clifford, though, that IF (for some reason) the program uses a random number (and even this needs a qualifier: IF the initial seed value is NOT fixed, for example, it is based on the CPU clock), then you will get different results on different tries, even if the code is bug-free.
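Adrin's qualifier about the seed is easy to illustrate with a minimal Python sketch (the `noisy_value` function is made up for illustration, not taken from any CFD code):

```python
import random

def noisy_value(seed=None):
    # Stand-in for a solver that perturbs its initial field with random noise.
    # seed=None means Python seeds from system entropy (effectively the clock),
    # so two such runs will almost certainly differ.
    rng = random.Random(seed)
    return [rng.uniform(-1e-6, 1e-6) for _ in range(5)]

# With a fixed seed, the "random" perturbation is identical on every run:
assert noisy_value(seed=42) == noisy_value(seed=42)
```

So a random number generator only breaks repeatability when the seed itself varies between runs.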

If you dump the data and wish to continue (restart) the run, you have to make sure that you dump the data in binary, so that the input/output is consistent with the precision of the code. Again, IF you dump and read all the relevant data, the code WILL be repeatable irrespective of machine precision.
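The binary-versus-text restart point can be shown with a toy Python example (variable names are invented for illustration): a formatted text dump with a fixed number of digits does not round-trip the value exactly, while a binary dump of the raw bytes does.

```python
import struct

state = 2.0 ** 0.5  # some solver variable at restart time

# Text dump with a fixed number of digits loses the low-order bits:
text_restart = float("%.6f" % state)

# Binary dump preserves the value bit-for-bit:
binary_restart = struct.unpack('d', struct.pack('d', state))[0]

assert binary_restart == state   # exact round-trip
assert text_restart != state     # restart from a slightly different state
```

(To be fair, a text dump with enough digits, e.g. Python's `repr`, can also round-trip a double; the hazard is fixed-width formatted output, which is what most Fortran-era restart files used.)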

My suggestion: contrary to my colleagues' recommendation, I would strongly suspect that there actually is a bug in the code if just going from single precision to double precision ends up "solving" this repeatability issue!

Best of luck

Adrin Gharakhani

March 10, 2000, 12:11   #5
Re: Help
clifford bradford (Guest)
i think the reason why single precision can affect repeatability is that the last few digits in most floating point numbers contain some 'junk' digits which are essentially random. for example if you do some simple operations on a float and then print out the full number you may see some weird digits at the end. i say may because some compilers/operating systems are better at cleaning them up than others. so if you use single precision the 'junk' digits sit in earlier decimal places and so have a greater effect on your results. this is often why it is difficult to converge residuals to the theoretical level of machine accuracy. perhaps john can give a better explanation but this is my understanding.
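The "junk digits" here are just rounding error, and they sit in earlier decimal places in single precision, as a small stdlib-only Python sketch shows (single precision emulated via `struct`; this is an illustration, not the poster's code):

```python
import struct

def to_f32(x):
    # Round a Python float (64-bit) to the nearest single-precision value.
    return struct.unpack('f', struct.pack('f', x))[0]

# The "weird digits on the end" are deterministic rounding error:
print(repr(0.1 + 0.2))        # 0.30000000000000004 -- same every run
assert 0.1 + 0.2 != 0.3

# Accumulate 0.1 ten times in double vs. emulated single precision:
x64 = sum(0.1 for _ in range(10))
x32 = to_f32(0.0)
for _ in range(10):
    x32 = to_f32(x32 + to_f32(0.1))

# The single-precision error (~1e-7) dwarfs the double-precision one (~1e-16),
# which is why residuals stall earlier in single precision.
print(abs(x64 - 1.0), abs(x32 - 1.0))
```

Note the digits are "weird" but not random: rerunning the script prints exactly the same values.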

March 10, 2000, 15:41   #6
Re: Help
Adrin Gharakhani (Guest)
The junk exists whether it is single or double precision. But the junk cannot be random _unless_ the software (at the hardware level) uses time-dependent random processes for essential computations such as addition, subtraction, and the other operations. I know this is not the case. The algorithms used are all deterministic, and what's more, they are tested for repeatability. If a problem like that of the Pentium creeps in, the world knows about it right away.

The point is that even the "junk" will be repeatable. No doubt about it.
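This claim is easy to demonstrate in Python by comparing the exact bit patterns of two identical runs (a sketch, with arithmetic chosen just to generate plenty of rounding error):

```python
import struct

def run_once():
    # A little arithmetic with plenty of rounding "junk" in it.
    x = 0.0
    for i in range(1, 1001):
        x += 1.0 / i
    return struct.pack('d', x)   # the exact 64 bits of the result

# Same machine, same code, same inputs -> bit-identical output, junk and all:
assert run_once() == run_once()
```

The low-order bits are meaningless as physics, but they come out the same every single time.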

More than anything else, I believe Quelos has not told us the entire story, for example, whether the same problem with all the same conditions was run on the same machine, etc.

Adrin Gharakhani

March 10, 2000, 16:48   #7
Re: Help
clifford bradford (Guest)
you're probably right. if he's using a commercial code, which he might be, this isn't even an issue, so he's probably changing something. my guess is either the grid or the number of timesteps/iterations.


March 10, 2000, 16:54   #8
Re: Help
John C. Chien (Guest)
(1). When someone is having trouble getting repeatable solutions, the standard procedure is to first run everything in double precision, then try to locate the undefined variables, such as non-initialized variables, variables with subscripts outside the declared range, etc. (2). It is hard to know the origin of the problem, because a program can be doing all of the possible operations. Or it could be a system problem, who knows. This is a general answer to a general question. To go beyond this point, more information must be available in addition to just "a program". We had a similar problem before: an in-house-modified, commercially-available code was not producing consistent results, and I had to check it out by changing to double precision, running on a different machine and a different operating system, varying the number of iterations between save and restart operations, and so on. (3). The program operation can be very complicated, including job control from scripts which handle intermediate saves and job restarts, graphics file output, temporary files, formatted input files, related default values, etc. It may not be obvious from the user's side. (4). I don't have the time right now to answer the question "whether junk is repeatable or not". The point of my original answer was how to isolate bugs in the program. It is possible that there is nothing wrong with the program in the first place.
