On 3/15/11, Robert O'Callahan <[email protected]> wrote:
> Instead of creating new state signalling and control API for streams, what
> about the alternative approach of letting <video> and <audio> use sensors as
> sources, and a way to connect the output of <video> and <audio> to encoders?
> Then we'd get all the existing state machinery for free. We'd also get
> sensor input for audio processing (e.g. Mozilla or Chrome's audio APIs), and
> in-page video preview, and using <canvas> to take snapshots, and more...

That would make sense, assuming you are suggesting reusing objects representing media elements, rather than using HTML elements.
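To make the proposal concrete, a rough sketch of the kind of wiring being discussed might look like the following. Note that `SensorSource`, `MediaElement` and `Encoder` here are purely illustrative names, not part of any specification; the point is only to show a media-element-like object taking a sensor as its source (reusing the element's existing state machinery) and handing frames to an encoder:

```javascript
// Hypothetical sketch only: none of these classes exist in any spec.
class SensorSource {
  constructor(kind) { this.kind = kind; }                  // e.g. "camera", "microphone"
  read() { return { kind: this.kind, data: [0, 0, 0] }; }  // fake sample frame
}

class MediaElement {                                       // stands in for <video>/<audio>
  constructor() { this.readyState = "HAVE_NOTHING"; this.source = null; }
  setSource(sensor) {                                      // a sensor as source, not a URL
    this.source = sensor;
    this.readyState = "HAVE_ENOUGH_DATA";                  // existing state machinery reused
  }
  captureFrame() { return this.source.read(); }            // e.g. for <canvas> snapshots
}

class Encoder {
  constructor() { this.frames = []; }
  encode(frame) { this.frames.push(frame); }               // element output -> encoder
}

const video = new MediaElement();
video.setSource(new SensorSource("camera"));               // in-page preview would render this
const encoder = new Encoder();
encoder.encode(video.captureFrame());
```

The attraction of this shape is that the page gets the element's readiness/error states, preview rendering and snapshotting "for free", rather than duplicating that signalling in a new stream API.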
