I have an interesting issue with a mini charting application I am
trying to create.

You can see my current attempt here:
http://ibiseye.com/?stormID=AT200403&active=1&season=2004&wind=145&name=Charley&zoom=5&lat=26.3&lng=-71.05000000000001

and click "Charley Synopsis"

Basically, we are using hurricane wind-fields and a giant database of
all the property parcels in the state of Florida to calculate how much
property value was "at risk" during a given hurricane (this runs live
during the season if a storm is forecast to pass over, or does pass
over, Florida).

The problem is that the numbers span an extremely wide range. If
you look at the bars on that Flex chart you'll notice we are using a
logarithmic scale. The reason we had to do this is that the "all"
data (or even the data within the different categories if the 'all'
data is removed) usually varies hugely in value. This difference is
exacerbated by differences in property types. What happens is you
end up with a chart where the bars are barely visible for a number of
the property types (and they are even less visible if the chart is not
scaled logarithmically for the "all" category).

If you mouse over the current chart values, you'll get the actual
dollar value of each bar. It's easy to see (especially in the
Residential type) just how misleading the chart could be. The bars look
almost the same, but the blue bar actually represents five times the
dollar value of the red one.
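To make the distortion concrete, here is a quick numeric sketch (with made-up dollar figures, not the site's real data) of why a log scale compresses a 5x difference into bars of nearly equal height, while a linear scale makes the smallest categories vanish:

```python
import math

# Hypothetical "at risk" totals spanning several orders of magnitude,
# roughly like the property-type categories described above.
values = {
    "Residential": 5_000_000_000,
    "Commercial": 1_000_000_000,
    "Agricultural": 2_000_000,
}

# Linear bar lengths, normalized to the tallest bar: the smallest
# category renders at a fraction of a pixel and is effectively invisible.
max_v = max(values.values())
linear = {k: v / max_v for k, v in values.items()}

# Log-scaled bar lengths, normalized the same way: now every bar is
# visible, but ratios between bars are badly compressed.
log_v = {k: math.log10(v) for k, v in values.items()}
max_log = max(log_v.values())
log_bars = {k: v / max_log for k, v in log_v.items()}

# Residential is 5x Commercial in dollars, yet on the log scale the
# two bars differ by only about 7% of the chart height.
gap = log_bars["Residential"] - log_bars["Commercial"]
```

With these numbers, `linear["Agricultural"]` is 0.0004 (invisible on a linear chart), while `gap` comes out near 0.07 -- which is exactly the "they look almost the same" effect on the live chart.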

Has anyone run into this issue before, and do you have any good
ideas for how to get around it? Right now I'm not really looking for
code so much as creative ways to display that information that aren't
misleading (which the current scale may be -- you have to remember how
many old people live in Florida).

Thanks for any insight,

Charlie
