On Monday 17 November 2014 00:34:48 Nichols Andy wrote:
> This is a bit of a side issue from the topic at hand, but I am the correct
> person to answer this question, so I will.
>
> Regarding VideoOutput on iOS, there is currently a serious limitation that
> we have been unable to overcome. If you are using the QWidget-based API on
> iOS (and you probably shouldn't be), then everything should work fine. That
> is because it is easy to embed "native" controls in QWidget hierarchies,
> and that's what we do for QtMultimedia video output (overlay an
> AVPlayerLayer where the video output should go).
>
> However, when we would like to render video in a Qt Quick 2 scene, we need
> to be able to render the video to a texture. On OS X it is possible to
> render video to an OpenGL texture from a hidden AVPlayerLayer window, and
> then render that in the Qt Quick 2 scene, but that API is not available on
> iOS. There is, AFAIK, currently no high-level API to render video from an
> AVPlayer to an OpenGL texture on iOS.
>
> The workaround to provide any video at all in Qt Quick 2 is to instead
> render to a "Window" control, which falls back to overlaying the native
> video window surface on top of the QQuickWindow. Yes, it is less than
> ideal, but we are not the only framework with this limitation.
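(For context, the Qt Quick 2 case being discussed is the usual QtMultimedia QML pattern, roughly as below. This is a minimal sketch: the media file name is a placeholder and the import versions vary by Qt release. On iOS, this is the path that falls back to the native overlay described above.)

```qml
import QtQuick 2.0
import QtMultimedia 5.0

Item {
    // Plays a clip and routes its frames to the VideoOutput item.
    MediaPlayer {
        id: player
        source: "movie.mp4"   // placeholder media file
        autoPlay: true
    }

    // On most platforms this renders the video as a texture inside the
    // scene graph; on iOS it is instead overlaid on top of the QQuickWindow.
    VideoOutput {
        anchors.fill: parent
        source: player
    }
}
```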
Why do you want to render video non-fullscreen anyway on a device with a small screen? Once the user clicks the play button, go to full screen with rotation support.

-- 
Thiago Macieira - thiago.macieira (AT) intel.com
  Software Architect - Intel Open Source Technology Center

_______________________________________________
Development mailing list
[email protected]
http://lists.qt-project.org/mailman/listinfo/development
