Hi guys,
I am running some benchmarks on iOS to put AIR 2.7 through its paces.
I used BunnyMark (and the blitting counterpart) for context:
http://www.iainlobb.com/bunnies/BitmapTest.html
iPhone 3GS results:
Display List CPU: ~18 fps
Display List GPU: 24 fps (constant frame rate)
Blitting CPU: ~19 fps
Blitting GPU: I have to retest, my build was screwed, and it got late. :-)
I'll test iPhone 4 next.
There is a problem, though - I don't trust the CPU numbers. Rendering
looks choppier than the frame rate would suggest, with some noticeable
hiccups that aren't reflected in the FPS number.
I suspect the problem is that Flash is skipping frames during rendering,
but the ENTER_FRAME event is still ticking the FPS meter.
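For reference, by "FPS meter" I mean a plain ENTER_FRAME counter along
these lines (a sketch, not my exact code) - it keeps ticking every frame
whether or not Flash actually renders:

```actionscript
import flash.events.Event;
import flash.utils.getTimer;

var ticks:int = 0;
var last:int = getTimer();

stage.addEventListener(Event.ENTER_FRAME, function(e:Event):void {
    ticks++; // fires every frame tick, even if the render was skipped
    var now:int = getTimer();
    if (now - last >= 1000) {
        trace("fps: " + ticks);
        ticks = 0;
        last = now;
    }
});
```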
Is there a way to ensure an FPS measure only ticks if there is a RENDER
event? Would moving it from ENTER_FRAME to RENDER actually do that?
I've only ever used RENDER with manual stage.invalidate() calls in
ENTER_FRAME, doing all my display list access in the RENDER handler
(which might just be incorrect usage). I suspect that by calling
invalidate() on every ENTER_FRAME I'm actually disabling the built-in
frame skipping, because when the frame rate slows, the game logic slows
with it. I'm not really sure what the right way to use RENDER is.
So I guess that's the question: if I don't bother with
stage.invalidate(), keep doing my display list work in ENTER_FRAME, and
only measure FPS in RENDER, will I get a more accurate framerate value?
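In other words, something like this side-by-side comparison is what I'm
picturing - a sketch only, and note my assumption that RENDER fires just
before an actual render, and only on frames where stage.invalidate() was
called:

```actionscript
import flash.events.Event;
import flash.utils.getTimer;

var frames:int = 0;   // ENTER_FRAME ticks (logic rate)
var renders:int = 0;  // RENDER ticks (actual render rate?)
var last:int = getTimer();

stage.addEventListener(Event.ENTER_FRAME, function(e:Event):void {
    frames++;
    stage.invalidate(); // without this, RENDER never dispatches at all
    var now:int = getTimer();
    if (now - last >= 1000) {
        trace("logic fps: " + frames + "  render fps: " + renders);
        frames = renders = 0;
        last = now;
    }
});

stage.addEventListener(Event.RENDER, function(e:Event):void {
    renders++; // should only tick when Flash actually renders
});
```

If the two counters diverge under load, that would confirm the frames
are being skipped while ENTER_FRAME keeps firing.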
Thanks,
Kevin N.
_______________________________________________
Flashcoders mailing list
Flashcoders@chattyfig.figleaf.com
http://chattyfig.figleaf.com/mailman/listinfo/flashcoders