On Sat, May 05, 2012 at 09:08:52PM +0100, Chris Harding wrote:
> Hi guys,
> 
> I'm interested in compiling Schrödinger for iOS. I was wondering if anyone 
> has experienced building on arm, or on embedded devices at all, and can offer 
> any tips or advice.
> 
> Ultimately I'd like to stream medium-quality video from the camera in real 
> time, preferably with very low latency. Does anyone know if this is likely 
> to be possible on, say, the iPhone 4S, which I believe has a dual-core 
> 1 GHz CPU?
> 
> I've not seen much about streaming Dirac video. Can it be done? How easy is 
> it compared to something like H.264, where the byte stream is already 
> broken up into NAL units and can easily be packetised?
> 

I see two difficulties.  First of all, the iOS camera delivers images
a frame at a time, as opposed to one line (or a chunk of lines) at a
time.  This means that any streaming of this video automatically has
about 33 ms (one frame period at 30 fps) of latency.  The low-latency
profile in Dirac is meant for much lower latency than this, i.e., 16
scan lines or less.  As long as the camera forces a full frame period
of latency on you, you can use any frame-based encoder (i.e., all of
them) with little added latency.
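
To put rough numbers on that (720 visible lines at 30 fps are assumed
here purely as an example; they are not taken from the mail above):

    /* Illustration only: latency from delivering a whole frame versus
     * a 16-scan-line slice, assuming 720 lines at 30 fps. */
    #include <stdio.h>

    int main(void)
    {
        const double fps = 30.0;
        const int lines_per_frame = 720;  /* assumed example format */
        const int slice_lines = 16;       /* granularity mentioned above */
        double frame_ms = 1000.0 / fps;
        double slice_ms = frame_ms * slice_lines / lines_per_frame;

        printf("whole frame  : %5.1f ms\n", frame_ms);  /* ~33.3 ms */
        printf("16-line slice: %5.2f ms\n", slice_ms);  /* ~0.74 ms */
        return 0;
    }

So the low-latency machinery only pays off when the capture side can
hand over partial frames, which, as noted above, the iOS camera does
not do.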

The second difficulty is that the Schroedinger encoder uses a lot of
memory bandwidth and relies on large caches, a design choice that
makes sense on high-end x86 CPUs but much less so on a typical ARM
mobile device.
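
If you do want to try it, the encoder side would look roughly like
the sketch below.  The function names are recalled from the
libschroedinger API as used by the FFmpeg and GStreamer wrappers, so
treat this as a sketch and check everything against
schroedinger/schro.h:

    /* Sketch: push one captured I420 frame into a Schroedinger
     * encoder and drain any encoded output.  The encoder is assumed
     * to have been set up elsewhere with schro_init(),
     * schro_encoder_new(), schro_encoder_set_video_format() and
     * schro_encoder_start(). */
    #include <schroedinger/schro.h>

    static void encode_one_frame(SchroEncoder *encoder,
                                 void *i420_data, int width, int height)
    {
        SchroFrame *frame = schro_frame_new_from_data_I420(i420_data,
                                                           width, height);
        schro_encoder_push_frame(encoder, frame);

        for (;;) {
            SchroStateEnum state = schro_encoder_wait(encoder);

            if (state == SCHRO_STATE_HAVE_BUFFER ||
                state == SCHRO_STATE_END_OF_STREAM) {
                int presentation_frame;
                SchroBuffer *buf = schro_encoder_pull(encoder,
                                                      &presentation_frame);
                /* buf->data / buf->length hold one encoded unit;
                 * hand it to your packetiser here. */
                schro_buffer_unref(buf);
                if (state == SCHRO_STATE_END_OF_STREAM)
                    return;
            } else if (state == SCHRO_STATE_NEED_FRAME) {
                return;  /* feed the next camera frame first */
            }
            /* SCHRO_STATE_AGAIN: the encoder did some work, wait again. */
        }
    }

As for the packetisation question in the original mail: as far as I
recall, every unit in a Dirac stream begins with a 13-byte parse-info
header (the four ASCII bytes "BBCD", a parse code, and 32-bit
next/previous parse offsets), so splitting the encoded byte stream
into packets is not much harder than splitting H.264 into NAL units.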



David



