I entered an issue about this last week, but since I haven't gotten any feedback on it, I thought I might get a better response here.
Here is the link to the issue, which goes into more detail with an example: http://code.google.com/p/google-visualization-api-issues/issues/detail?id=824&sort=-id&colspec=ID%20Type%20Status%20Priority%20Milestone%20Owner%20Summary%20Stars

Basically, the issue shows up whenever you have a treemap with more than one level of drill-down. It appears that when the color value is calculated, it is an average based on the children directly below the node. This works fine as long as you are only drilling down one level, but with more levels you end up calculating the color value from averages of averages. That leaves me with data whose color can vary drastically depending on the data included in the middle tiers. The issue I linked to contains the simplest dataset I could come up with that shows the problem. I have seen other posts discussing how the color values are calculated, but nothing discussing the legitimacy of those calculations. Has anyone else noticed this?

Personally, I would like the ability to specify my own color values. I am calculating them anyway to display to the user, so why not just use the values I put in there? It would also make it possible to use values other than numbers that can be meaningfully averaged. For instance, I would like to build a treemap where the color represents percent utilization. Averaging that value with other percentages at the higher levels of the treemap would not make sense; it would have to be calculated independently at each level.
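To show what I mean without the full dataset from the issue, here is a quick sketch with made-up numbers. It implements the averaging rule as I understand it (plain average of the direct children, which is my assumption about the chart's behavior, not the actual TreeMap source), and compares it to averaging over all the leaves directly:

```python
# Hypothetical tree: two branches with very different numbers of children.
# All data here is invented for illustration, not from the linked issue.
tree = {
    "root": ["a", "b"],
    "a": ["a1"],              # middle tier with one child
    "b": ["b1", "b2", "b3"],  # middle tier with three children
}
leaf_color = {"a1": 100.0, "b1": 0.0, "b2": 0.0, "b3": 0.0}

def children_average(node):
    """Color as the average of the direct children's colors.
    This is my reading of what the treemap does; at depth > 1 it
    becomes an average of averages."""
    if node in leaf_color:
        return leaf_color[node]
    children = tree[node]
    return sum(children_average(c) for c in children) / len(children)

def collect_leaves(node):
    """All descendant leaf colors, ignoring the middle tiers."""
    if node in leaf_color:
        return [leaf_color[node]]
    return [v for c in tree[node] for v in collect_leaves(c)]

print(children_average("root"))   # (100 + 0) / 2 = 50.0
leaves = collect_leaves("root")
print(sum(leaves) / len(leaves))  # 100 / 4 = 25.0
```

Same four leaves, but the root's color value comes out as 50 under the children-average scheme and 25 if every leaf counts equally, purely because of how the middle tiers group the children. That is the drastic variation I am seeing.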
