On Jan 12, 8:33 pm, Dianne Hackborn <[email protected]> wrote:
> Unfortunately, I don't have a good solution if you want to get the real,
> exact screen dots per inch. One thing you could do is compare xdpi/ydpi
> with densityDpi, and if they are significantly far apart, assume the values
> are bad and just fall back on densityDpi as an approximation. Be careful
> with this, because a correctly working device may have a densityDpi fairly
> different from the real dpi -- for example, the Samsung Tab uses high density
> even though its screen's real density is a fair amount lower than 240.
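The fallback heuristic suggested above could be sketched roughly like this. Note this is only an illustration: the class name, method name, and the 50% tolerance are my own assumptions, not anything from the Android API -- in a real app the inputs would come from DisplayMetrics.xdpi and DisplayMetrics.densityDpi.

```java
// Sketch of the suggested heuristic: trust the reported xdpi only when it
// is reasonably close to densityDpi; otherwise fall back to densityDpi.
// The 50% tolerance is an arbitrary assumption, tune to taste.
public class DpiHeuristic {
    static final float TOLERANCE = 0.5f;

    // xdpi would come from DisplayMetrics.xdpi,
    // densityDpi from DisplayMetrics.densityDpi.
    static float approximateDpi(float xdpi, int densityDpi) {
        if (Math.abs(xdpi - densityDpi) > densityDpi * TOLERANCE) {
            return densityDpi;   // reported value looks bogus, fall back
        }
        return xdpi;             // reported value looks plausible
    }

    public static void main(String[] args) {
        // Tab-like case: xdpi ~170 vs densityDpi 240 is within 50%, keep 170
        System.out.println(approximateDpi(170f, 240));
        // Wildly wrong report: xdpi 30 vs densityDpi 240, fall back to 240
        System.out.println(approximateDpi(30f, 240));
    }
}
```

As the discussion below shows, even this heuristic is shaky, since some devices report values that are wrong but not wildly so.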
Thanks Dianne. What's going on with the Tab? Its true dpi, which it does accurately report in the xdpi and ydpi fields, is about 170. I could imagine that a tablet is typically held further from the face than a phone, so the dpi used to control the size of graphic elements etc. might be set to a different value -- but surely it should be set to a lower value, shouldn't it? Yet you're saying it reports a higher one. Huh? What were they thinking?

I've also just looked at my AC100 (Tegra 250): it reports xdpi = 160, yet the correct value should be about 120. As you say, this seems to be completely broken, and the Tegra example shows that 96 cannot be used as a sentinel for a wrong value. Oh well.

--
You received this message because you are subscribed to the Google Groups "Android Developers" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [email protected]
For more options, visit this group at http://groups.google.com/group/android-developers?hl=en

