Here is an example to explain what I'm on about:

from decimal import Decimal

c = Decimal(10) / Decimal(3)  # inexact: raises the Inexact and Rounded flags
c = Decimal(10) / Decimal(2)  # exact, but the flags raised above stay set on the context

If I do this and then print c, the Inexact and Rounded flags are raised, even though the final result is exact. Shouldn't flags therefore be attributes of numbers rather than of the context? The problem is especially apparent when I program a lengthy calculation using a stack.

To pose a more meaningful question: how do I deal with this properly? It seems I have to keep track of everything manually. If I clear the flags before a calculation, I lose information about numbers computed earlier, which may now appear exact. This is especially annoying when working with numbers like 1.0000000154342.
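For illustration, the manual bookkeeping I have in mind looks roughly like this (divide_tracked is just a name I made up; it clears the flags around a single operation and then re-raises whatever was set before, so earlier information isn't lost):

from decimal import Decimal, getcontext, Inexact

ctx = getcontext()

def divide_tracked(a, b):
    # Hypothetical helper: perform one division and report whether it was exact,
    # without discarding flags raised by earlier operations.
    saved = dict(ctx.flags)          # remember which flags were already raised
    ctx.clear_flags()
    result = a / b                   # uses the current thread's context
    exact = not ctx.flags[Inexact]
    for signal, raised in saved.items():
        if raised:
            ctx.flags[signal] = raised   # restore the earlier flags
    return result, exact

print(divide_tracked(Decimal(10), Decimal(3)))  # (..., False) - inexact
print(divide_tracked(Decimal(10), Decimal(2)))  # (Decimal('5'), True) - exact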

For a critical application, it would be really nice to be certain about which results are exact and which aren't. It would also be nice not to have the wrong flags raised, if only because it looks bad.

I still assume there is a good rationale behind this that I haven't understood. I'd be thankful for an explanation.

A: 

Read this: http://docs.python.org/library/decimal.html#context-objects

You have context objects that you can save and restore so that you don't lose flag values.
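For example, something along these lines (a sketch using decimal.localcontext(); the with-block works on a copy of the current context, so clearing and checking flags there doesn't disturb the flags you have accumulated outside it):

from decimal import Decimal, localcontext, Inexact

with localcontext() as ctx:
    ctx.clear_flags()
    c = Decimal(10) / Decimal(3)
    print(bool(ctx.flags[Inexact]))   # True: this division was inexact
    ctx.clear_flags()
    c = Decimal(10) / Decimal(2)
    print(bool(ctx.flags[Inexact]))   # False: this division was exact
# the outer context's flags are unchanged at this point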

S.Lott