Here's my situation: I have an app which does a bunch of image
manipulations, and those images are ideally full-screen sized. It
works on a G1 (or HVGA emulator), but runs out of memory on a WVGA
emulator instance, because full-screen images use twice as many
pixels. Fine, I can work around it by manipulating smaller images,
then scaling up to WVGA at the end. There's some loss of image
quality, but this is unavoidable on a WVGA device with a 16MB heap
limit, so I'll live with that.
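To make the workaround concrete: I decode at a reduced size (e.g. via BitmapFactory.Options.inSampleSize, where powers of two are recommended) and scale up at the end. A sketch of how I pick the sample size, using only plain Java so it stands alone; the pixel budget here is invented for illustration, and on a real device it would be derived from available heap:

```java
// Sketch: choose a power-of-two sample size so the decoded image fits
// a pixel budget. The budget is hypothetical; inSampleSize values that
// are powers of two are the recommended case for BitmapFactory.
public class SampleSizeChooser {
    // Smallest power-of-two sample such that
    // (width/sample) * (height/sample) <= maxPixels.
    public static int chooseInSampleSize(int width, int height, long maxPixels) {
        int sample = 1;
        while ((long) (width / sample) * (height / sample) > maxPixels) {
            sample *= 2;
        }
        return sample;
    }

    public static void main(String[] args) {
        // WVGA-sized source, budget of roughly HVGA pixels (320*480 = 153600).
        System.out.println(chooseInSampleSize(480, 800, 153600)); // prints 2
    }
}
```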

When a real WVGA device hits the streets in the next couple of months,
though, it's likely to have more than 16MB heap per app, for just this
kind of reason. So for best image quality, I'd like my app to adapt to
this situation, and use full-screen-sized images on such a device.
In other words, I'd like to implement a heuristic which sets the image size based
on heap size.
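The heuristic I have in mind looks something like the sketch below. The thresholds and dimensions are invented for illustration, and the key assumption — that Dalvik actually reports the per-app heap limit through the standard Runtime.maxMemory() call — is exactly what I'm not sure about:

```java
// Sketch of the heuristic: pick working-image dimensions from the heap
// ceiling. Runtime.maxMemory() is standard Java; whether Dalvik reports
// the real per-app cap (e.g. 16MB) through it is an assumption here.
// The 16MB threshold and the WVGA/HVGA sizes are illustrative.
public class ImageSizeHeuristic {
    // Returns { width, height } of the working image.
    public static int[] chooseImageSize(long maxHeapBytes) {
        if (maxHeapBytes > 16L * 1024 * 1024) {
            return new int[] { 480, 800 };  // full WVGA working images
        } else {
            return new int[] { 320, 480 };  // HVGA working size, scale up at the end
        }
    }

    public static void main(String[] args) {
        long maxHeap = Runtime.getRuntime().maxMemory();
        int[] size = chooseImageSize(maxHeap);
        System.out.println(size[0] + "x" + size[1]);
    }
}
```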

In order to do so, however, the app needs to know what the maximum
heap size is, and I haven't yet found an SDK call which will return
this information. The various Debug.get* memory calls all seem to
report how much heap you have *allocated*, not how much you
theoretically *can* allocate. I understand that this isn't necessarily
a hard number; fragmentation and GC mean you may not actually be able
to allocate every last byte. But a theoretical ceiling would still be
useful.
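For reference, here's the distinction I'm drawing, in plain Java terms: totalMemory() and freeMemory() describe the heap the VM has claimed so far, while maxMemory() is supposed to be the theoretical ceiling. What I can't confirm is whether Dalvik's maxMemory() reflects the real per-app limit (on some VMs it just returns Long.MAX_VALUE when no cap is set):

```java
// Sketch: allocated heap vs. the theoretical cap.
// totalMemory()/freeMemory() = what the VM has claimed so far;
// maxMemory() = the ceiling it will ever grow to (possibly
// Long.MAX_VALUE on VMs with no configured limit).
public class HeapReport {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory(); // currently allocated
        long ceiling = rt.maxMemory();                  // theoretical maximum
        System.out.println("used bytes:    " + used);
        System.out.println("ceiling bytes: " + ceiling);
    }
}
```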

Can anyone point me to an SDK call I've missed?

Thanks,

String
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google
Groups "Android Developers" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~----------~----~----~----~------~----~------~--~---
