I've barely (if at all) used Real or Decimal fields in a long time, so I
don't recall whether this ever worked the way I would expect, bearing in
mind the true mathematical meaning of the term "precision" as applied to
decimal or real numbers.

 

How does the AR System store Real Number fields when a certain precision
level is selected as a property? For example, if I give a Real Number field
a precision of 10 and store the number 2341.1234567891, I would expect
Remedy to display 2341.1234567891 as the stored value after the commit.
Instead it displays 2341.1234570000 after rounding, which I find strange.
And in the database, it stores only 2341.123457.

 

Precision, I have thus noticed, is the total count of digits BOTH before
and after the decimal point in Remedy, NOT the count of digits after the
decimal point, which to my mind would be the traditional definition. Is
this a bug?
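For what it's worth, the value you observed is exactly what you get by rounding to 10 *significant* digits (total digit count), which is also how SQL databases define "precision" for numeric types (as opposed to "scale", the count of digits after the decimal point). A minimal Python sketch of that interpretation, not anything AR System actually runs internally:

```python
from decimal import Decimal, Context, ROUND_HALF_UP

def round_sig(value: str, sig_digits: int) -> Decimal:
    """Round a number to `sig_digits` significant digits (total count,
    before and after the decimal point), matching the SQL notion of
    precision rather than scale."""
    return Context(prec=sig_digits, rounding=ROUND_HALF_UP).create_decimal(value)

print(round_sig("2341.1234567891", 10))  # -> 2341.123457
```

Under that reading, 2341.123457 stored in the database (ten significant digits) is consistent behavior rather than a truncation bug, though whether the property should have been called "precision" is another question.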

 

For those who want to know my environment: I am messing around with AR
System 7.6.04 Patch 003 installed on Windows 2008, with MS SQL Server
2008 R2 as the underlying database.

 

Cheers

 

Joe


_______________________________________________________________________________
UNSUBSCRIBE or access ARSlist Archives at www.arslist.org
"Where the Answers Are, and have been for 20 years"
