Hi all, I've developed a remote desktop application, similar to VNC.

The remote desktop is encoded as H.264 and streamed to an Android handset, while interaction on the handset (touch screen, accelerometer changes, etc.) is sent back to the server. The application works fine, except that the decoding thread sometimes lags badly. For instance, decoding one frame costs about 20 ms at the beginning, but a few minutes later the time increases to over 100 ms. That lasts for several seconds, then the time goes back to 20 ms.

I substituted a plain do-while loop for the decoding function and the same thing happened, so it shouldn't be the decoder's fault. I then found that if I do not send the interaction information to the server, there is no lag at all. (The interaction information is still gathered, just not sent.)

The upload of interaction messages is triggered in the sensor listener, and the H.264 decoding is done in another thread. Since each message is tiny (<20 bytes, one per ~30 ms), I don't understand why it causes the lag. CPU occupancy is about 60%.
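To make the setup concrete, here is a minimal desktop-Java sketch of the kind of send path I mean (class and method names are illustrative, not my actual code; the real app hooks this up to Android's SensorEventListener and a socket): the sensor callback only enqueues the packet, and a dedicated thread performs the blocking write.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Illustrative sketch: decouple the sensor callback from the network write.
// The callback only enqueues; a dedicated worker thread owns the stream.
class InteractionSender {
    private final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>();
    private final Thread worker;
    private volatile boolean running = true;

    InteractionSender(OutputStream out) {
        worker = new Thread(() -> {
            try {
                // Keep draining until shutdown AND the queue is empty.
                while (running || !queue.isEmpty()) {
                    byte[] msg = queue.poll(50, TimeUnit.MILLISECONDS);
                    if (msg != null) {
                        out.write(msg); // blocking write happens off the sensor thread
                        out.flush();
                    }
                }
            } catch (InterruptedException | IOException ignored) {
                // sketch only: real code should report the error
            }
        });
        worker.start();
    }

    // Called from onSensorChanged()/onTouchEvent(); never blocks on the socket.
    void send(byte[] msg) {
        queue.offer(msg);
    }

    void shutdown() {
        running = false;
        try {
            worker.join();
        } catch (InterruptedException ignored) {
        }
    }
}
```

The point of the queue is that the sensor callback returns immediately even if the socket stalls, so any backpressure on the uplink can't show up as jitter in whichever thread registered the listener.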

