Yes, but that isn't fully spec'ed or implemented yet. That is the goal, though. :)
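In the meantime, a rough workaround is to estimate the offset between the
two clocks yourself, by reading them back to back and keeping the pair
with the least sampling skew. A minimal sketch in that spirit follows --
the estimateClockOffset helper and its numbers are illustrative, not
anything from the spec; currentTime advances in render quanta and this
says nothing about output latency, so the estimate is inherently
approximate:

    // Estimate offset such that:
    //   performance.now() ≈ context.currentTime * 1000 + offset
    // Illustrative workaround only -- not a spec'ed API.
    function estimateClockOffset(context, samples = 10) {
      let best = { skew: Infinity, offset: 0 };
      for (let i = 0; i < samples; i++) {
        const t0 = performance.now();            // ms
        const audioTime = context.currentTime;   // seconds
        const t1 = performance.now();            // ms
        const skew = t1 - t0;                    // sampling uncertainty
        if (skew < best.skew) {
          // Pair the audio clock with the midpoint of the two reads.
          best = { skew, offset: (t0 + t1) / 2 - audioTime * 1000 };
        }
      }
      // currentTime is quantized to the audio block size (128 frames,
      // ~2.9 ms at 44.1 kHz), so expect at least that much error.
      return best.offset;
    }

    // Usage: position a visual at (roughly) the DOM timestamp of a
    // sound scheduled half a second ahead -- ignoring output latency.
    const context = new AudioContext();
    const offset = estimateClockOffset(context);
    const audioStartTime = context.currentTime + 0.5;
    const domTimeForVisual = audioStartTime * 1000 + offset;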
On Sun, Sep 13, 2015 at 7:18 AM, Russell McClellan
<russell.mcclel...@gmail.com> wrote:
> It looks like the latest suggestion from the GitHub thread tracking
> this issue (https://github.com/WebAudio/web-audio-api/issues/12)
> would cover your use case, right?
>
> Thanks,
> -Russell
>
> On Sat, Sep 12, 2015 at 11:15 PM, Kumar <srikuma...@gmail.com> wrote:
> > Hi all,
> >
> > This is to recap an earlier conversation on synchronizing audio
> > precisely with MIDI events and visuals when we're scheduling these a
> > little into the future.
> >
> > http://lists.w3.org/Archives/Public/public-audio/2013AprJun/0456.html
> >
> > Am I right in noticing that we don't yet have a solution that
> > reliably maps between AudioContext.currentTime and the
> > DOMHighResTimeStamp value gotten through performance.now() or
> > requestAnimationFrame? (This doesn't, of course, apply to the
> > OfflineAudioContext.)
> >
> > Given that the spec says currentTime "increases in realtime", the
> > inability to connect it with a DOMHighResTimeStamp makes precision
> > scheduling of MIDI and visuals hard or impossible to do reliably.
> >
> > This issue is only somewhat related to getting latency info. It
> > would be possible to construct an API for this that includes the
> > latency info (for example, context.currentRealTime -
> > performance.now()) or to have the latency be provided separately.
> > Either way, a mapping is indispensable, I think.
> >
> > Best,
> > -Kumar