markusn82 wrote:
>> -- On a factory-reset G1 with Android 1.6: CPU and GC from the
>> background process steals 5-10% of throughput from the foreground
>> compared to no service work at all, which would fit if the background
>> logic is in the low-priority class that Ms. Hackborn described.
>
> Where did you get this 5-10% value from? Is this overall CPU
> consumption over a given period of time? Is the CPU usage from the
> background thread uniform over that period of time?
The test activity has a TimerTask firing every millisecond for one minute. During each millisecond, it spends ~900,000 nanoseconds calculating the square root of pi, today being Pi Day and all. Every second, it records how many times the millisecond timer actually was invoked during that second, as an entry in a pre-allocated long[] (so as not to generate any garbage during the run). So, we get two indicators of the load: how many missed timer ticks there were in each second, and how many square roots actually got computed overall, while the activity attempts to keep the CPU about 90% busy. It was as close as I could come to simulating something frame-rate-esque in the time I could allot to the problem.

About 15 seconds into the activity's run, the background process (Service) kicks in to exercise something. One test just does its own square-root-calculating busy loop. One test allocates a handful of 1MB buffers, then does a System.gc() and stopSelf(). One test allocates about a thousand 10KB buffers (then GC/stopSelf()). One test attempts to allocate 200,000 10-byte buffers, though on hardware it can't get through all of that before the rest of the activity's minute-long run ends. Each of those tests is run separately, with full restarts of each process in between runs.

When running without background load, the timer counts are 1,000 (+/- about 2 for timing variances) for all but a couple of seconds scattered during the run, where they drop to around 890 on the G1 and the Ion -- the emulator does not demonstrate this effect. For the moment, I am assuming these are system events, triggered by something in hardware. It wouldn't appear to be tied to the radios, though, as the G1 was in normal GSM mode (no WiFi or Bluetooth) and the Ion was in airplane mode. The G1, in particular, was factory reset before running this test, so there are no third-party apps or anything on there other than my own stuff, and I disabled sync.

When running with background load, during the period when the background process is doing work, the G1 and Ion both experienced degraded results. The G1 (1.6) had most of the results in the 900-950 range during the active GC period, with a couple dropping to more like 850. The Ion (1.5) was all over the map, falling to as low as ~250, but mostly hovering around 500 (plus or minus a whole bunch). The total square root calculations were similarly affected.

So, to answer your questions:

-- The 5-10% and 50% degradation figures are based on the per-second timer tick counts and the total throughput of square-root calculations by the activity.

-- The CPU usage of the background thread will vary widely. In the test where it uses the CPU directly, it will be fairly uniform, just by construction. Similarly, the 200,000 10-byte allocation test was probably mostly CPU load from the test proper rather than from the GC logic. The larger allocations would be mostly the GC process at work, but the actual CPU utilization there is less controlled, since I don't have control over the GC algorithm itself.

Clearly, these tests could use more work, including a better recording of the output besides the end-of-run dump to logcat. But they're a starting point. In particular, I was looking for signs that I could get a background operation to steal big chunks of CPU time from the foreground, by any means (GC, direct CPU usage, even stopping a service). That clearly occurred in 1.5 and did not seem to occur on 1.6.
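In case it helps, this is roughly the shape of the foreground activity -- a from-memory sketch rather than the actual test code, so the class name, the tick bookkeeping, and the end-of-run logging are approximations:

import java.util.Timer;
import java.util.TimerTask;

import android.app.Activity;
import android.os.Bundle;
import android.os.SystemClock;
import android.util.Log;

public class BusyActivity extends Activity {
  private static final int RUN_SECONDS = 60;
  private static final long WORK_NANOS = 900000L; // ~90% of each millisecond

  private final long[] ticksPerSecond = new long[RUN_SECONDS]; // pre-allocated, so no garbage during the run
  private final Timer timer = new Timer();
  private long sqrtCount = 0;
  private long startMillis;

  @Override
  public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    startMillis = SystemClock.uptimeMillis();

    timer.scheduleAtFixedRate(new TimerTask() {
      @Override
      public void run() {
        long elapsedSec = (SystemClock.uptimeMillis() - startMillis) / 1000;

        if (elapsedSec >= RUN_SECONDS) {
          timer.cancel();
          dumpResults();
          return;
        }

        // count how many times the 1ms timer actually fired this second
        ticksPerSecond[(int) elapsedSec]++;

        // burn ~900,000ns computing sqrt(pi), to approximate per-frame work
        long stop = System.nanoTime() + WORK_NANOS;

        while (System.nanoTime() < stop) {
          Math.sqrt(Math.PI);
          sqrtCount++;
        }
      }
    }, 0, 1);
  }

  private void dumpResults() {
    for (int i = 0; i < RUN_SECONDS; i++) {
      Log.d("BusyActivity", "second " + i + ": " + ticksPerSecond[i] + " ticks");
    }

    Log.d("BusyActivity", "total sqrt calls: " + sqrtCount);
  }
}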
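And the background side, in the style of the 1MB-buffer variant -- again a sketch, with the buffer count and the threading details being guesses on my part (the other variants just swap in different allocation sizes/counts, or a plain busy loop). It uses onStart(), since the targets were 1.5/1.6:

import android.app.Service;
import android.content.Intent;
import android.os.IBinder;

public class AllocatingService extends Service {
  @Override
  public IBinder onBind(Intent intent) {
    return null; // started via startService(), never bound
  }

  @Override
  public void onStart(Intent intent, int startId) { // pre-2.0 start callback
    super.onStart(intent, startId);

    new Thread(new Runnable() {
      public void run() {
        // allocate a handful of 1MB buffers...
        byte[][] buffers = new byte[4][];

        for (int i = 0; i < buffers.length; i++) {
          buffers[i] = new byte[1024 * 1024];
        }

        // ...drop them, ask for a collection, then shut down
        buffers = null;
        System.gc();
        stopSelf();
      }
    }).start();
  }
}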
Since the G1 and Ion have similar CPUs (IIRC), if the process/thread class priority were busted, I'd have expected 1.6 to fare closer to how 1.5 did.

One scenario my tests do not cover is allocations outside the standard Java heap. Perhaps the contention isn't from normal GC, but from GC of Bitmap data or something, meaning only certain types of other apps will cause the frame rate hiccup.

However, while I do intend to devote a bit more time to experimentation on this, I'm not going to invest tons of time. With my results appearing to corroborate Mr. Nanek's, and both lining up with what's supposed to happen, I can't afford to just blindly try stuff in hopes of capturing the effect. That's why everybody in tech support longs for a reproducible test case, so that we can start from the failure and work to contain and resolve it.

BTW, out of curiosity, when you get this effect with your game on your Milestone, are you using Bluetooth for anything? I ask because there was a recent question on StackOverflow from somebody who -- for reasons I'm not quite sure of -- is trying to do active Bluetooth pairing and WiFi SOAP requests while simultaneously playing a video stream. While Bluetooth discovery is active, video playback suffers.

http://stackoverflow.com/questions/2361736/streaming-video-playback-performance-issues-with-background-tasks-on-android

--
Mark Murphy (a Commons Guy)
http://commonsware.com | http://twitter.com/commonsguy
_The Busy Coder's Guide to *Advanced* Android Development_ Version 1.3 Available!