On Tue, 8 Jan 2002, Richard B. Kreckel wrote:
> No, no, no! This is not `meaningless garbage', although the human eye might think it is. This `garbage' is absolutely correct from the computer's point of view. See below...
You are, of course, correct about the mathematics. My point was more that the human eye is the intended audience of an _output_ routine.
> Option 2) I am not entirely sure but to me this looks purely cosmetic and would only lull the user into a false sense of security.
Well, apparently it's another quantization error from a different source -- humans can have either a false sense of security or a false sense of panic. ;-)
> I still fail to see what's so wrong about 0.33333333333333333334. It gives you some valuable information, doesn't it?
No, actually. I do recall the convention from Numerical Analysis of writing one more place than is accurate, so the fix is probably just an explicit footnote in the docs saying that this is the convention being followed. But that last digit is to be read as "ignore this digit, it really could be anything."
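Just to make that concrete with something self-contained (plain C++ doubles here rather than GiNaC/CLN arbitrary precision -- it's only a sketch of the same effect, not how the library does its printing):

#include <iostream>
#include <iomanip>

int main()
{
    // A double carries only about 16 accurate decimal digits of 1/3;
    // asking for more just prints the decimal expansion of the nearest
    // representable binary double, which is where the "garbage" tail
    // comes from.
    double third = 1.0 / 3.0;
    std::cout << std::setprecision(16) << third << std::endl;  // 0.3333333333333333
    std::cout << std::setprecision(21) << third << std::endl;  // something like 0.333333333333333314830
    return 0;
}

Everything past roughly the 16th place is an artifact of the binary representation, not information about 1/3 -- which is exactly why the last printed digit should be read as "could be anything."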
> After all, when you throw away the last digit, we could interpret the number 0.3333333333333333333 as 0.33333333333333333327, but the original is much closer to reality.
Again, I think the feeling is that it isn't a matter of interpretation. If you specify 30 digits of accuracy, you want 30 accurate digits. In your example, if you knew you had an answer accurate to <x> digits, you'd be creating the error by interpreting an accurate output. (If I've counted your decimal places correctly.)

It all comes down to how many places are *guaranteed*. Sure, it's cosmetic. But beauty is in the eye of the beholder.

I promise not to say anything more on the subject. ;-)

Cheers,
Phil

--
"Trying to do something with your life is like sitting down to eat a moose."
                                                        --Douglas Wood