https://bugzilla.wikimedia.org/show_bug.cgi?id=235

--- Comment #38 from tahrey <[email protected]> 2011-10-17 13:23:02 UTC ---
I'm sorry... what?

The converter, because I happened to feed it a number that happens to be a
multiple of ten units, is arbitrarily assuming that it's being asked to
convert to an accuracy of +/- 5 units rather than +/- 0.5, even though I'm
writing the figure the same way as I did the ones either side? So if I have a
series of numbers from 1 to 100, I will have to make the required accuracy
explicit for 10% of them? I'm having a lot of trouble seeing how this can be
considered correct, or even acceptably wrong.

So if I were entering some data on a set of vintage cars, one with a recorded
top speed of 68 mph and another with a recorded top speed of 70 mph, and came
to edit the wiki not knowing of this aberrant behaviour, or of what the actual
km/h equivalents were, but having seen the Convert tool used elsewhere, I may
well copy and paste said thing into my article and thereby tell metric-using
readers that these cars achieved the same top speed?

No. This is dumb. The sensible way of approaching these things is that if a
number has been entered with no decimal point, you convert to the nearest
whole-unit equivalent in the target units, unless there is a big enough
difference in scale that there would be a serious loss of accuracy (e.g. one
set of units is 200x smaller than the other, in which case you insert a couple
of decimals). If it is entered with a certain number of decimals - including
70.00, for example - you convert to match that number (again, adjusting if the
target is hugely different). For large non-decimal numbers where standard form
is not used, the user should have to specify the accuracy they wish to
display. Reduced accuracy should not be the default, because then you have a
problem - as in this case - of the unwitting user having written a
multiple-of-ten (or multiple-of-100!) figure without a decimal, but meaning it
to be unit-accurate... as is the norm with the Arabic numbering system... and
not having a way of making that explicit... but your system goes and assumes
that it's one significant figure instead of two or three.
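To be concrete, the rule I'm proposing could be sketched like this (a hypothetical helper, not the actual Convert code; the function and variable names are mine, and 1.609344 is the exact mph-to-km/h factor):

```python
MPH_TO_KMH = 1.609344  # exact mph -> km/h conversion factor

def infer_decimals(value_str):
    """How many decimal places the editor actually typed.
    '70' means 0 decimals; '70.00' means 2 - they are not the same."""
    if '.' in value_str:
        return len(value_str.split('.')[1])
    return 0

def convert_mph(value_str):
    """Convert mph to km/h, matching the precision the input was
    written with, instead of guessing from trailing zeroes."""
    decimals = infer_decimals(value_str)
    result = float(value_str) * MPH_TO_KMH
    return round(result, decimals)

# '68' and '70' both convert at whole-unit precision,
# so the 70 mph car is no longer collapsed onto the 68 mph one:
# convert_mph('68') -> 109.0, convert_mph('70') -> 113.0
```

Under this rule a trailing zero carries no special meaning: only an explicit decimal point (or an explicit accuracy parameter) changes the output precision.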

There's a place for assuming how many sig figs are intended depending on the
number given, maybe even for assuming ludicrously low ones (like... one! which
is your suggestion... I'm pretty sure my own maths and science teachers
recommended never using fewer than 3 if it could at all be helped; so 70 becomes
113, 55 becomes 88.5... that's not OTT, now, is it?), but this ain't it. We're
not in a science lab or using some degree-level maths tool, but putting out a
relatively simple and easy-to-understand webpage editing interface for the
world's laypeople to update a general knowledge repository with.
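For what it's worth, the three-sig-fig floor I'm describing is a one-liner; this is just an illustrative sketch (the helper name is mine), showing it reproduces the 70 -> 113 and 55 -> 88.5 figures above:

```python
import math

MPH_TO_KMH = 1.609344  # exact mph -> km/h conversion factor

def round_sig(x, sig=3):
    """Round x to `sig` significant figures - 3 by default,
    the minimum my teachers would have accepted."""
    if x == 0:
        return 0.0
    exponent = math.floor(math.log10(abs(x)))
    return round(x, sig - 1 - exponent)

# 70 mph -> 112.65408 km/h -> 113 at 3 sig figs
# 55 mph ->  88.51392 km/h -> 88.5 at 3 sig figs
```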

The option should be left there, but the default behaviour should be different.
Alternatively, explicit definition of accuracy within the convert tag should be
made mandatory for all conversions, and a sensible default retroactively
applied by bot to all the existing ones where it isn't.

Please reconsider.
