I've downloaded Unity and 3Dception's existing stuff to play with and my
first comments are: Unity is fun!  Everyone I know who has tried it thinks
it's a great environment to work in.  And 3Dception is well integrated into
the Unity world.

This new VR audio workflow looks to me to be about monitoring in the target
environment, i.e. VR, while you are creating your audio elements.  This is
probably important in this new world that everyone is still learning about.
 (Though I might counter that ambisonics probably scales from small to
large monitoring systems more gracefully than most.)  It looks like a
plugin for the DAW that does the rotation and rendering, plus a
synchronized video player to feed the VR headset and send the
head-tracking data back to the plugin.

I believe the target runtime is the same as their current stuff.  You can
place individual audio objects in the scene and/or a B-format player.
Connect a widget to the camera for head position and the audio engine takes
care of mixing, rotating and rendering.  Note that the render code ships
as a platform-specific binary, so they could potentially be using a
different method on phones, where cycles are limited.  Even with unlimited
power, the required low latency means that even 512-sample blocks are too
long at the highest frame rates, and some optimization is required.
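For what it's worth, the yaw part of that rotate step, plus the latency
arithmetic, looks roughly like this.  This is a sketch of mine, not their
code: I'm assuming a first-order FuMa W/X/Y/Z stream, and the channel
ordering and sign convention are my guesses, not anything from their
documentation.

```python
import numpy as np

def rotate_bformat_yaw(w, x, y, z, yaw_rad):
    """Rotate one block of first-order (FuMa W/X/Y/Z) B-format audio
    about the vertical axis.  A pure yaw rotation leaves W and Z
    untouched and mixes X and Y as an ordinary 2-D rotation."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    x_rot = c * x + s * y
    y_rot = -s * x + c * y
    return w, x_rot, y_rot, z

# The latency arithmetic behind the 512-sample remark: at 48 kHz a
# 512-sample block lasts ~10.7 ms, while one video frame at 90 fps
# is only ~11.1 ms, so a single audio block nearly spans a frame.
block_ms = 512 / 48000 * 1000   # ~10.67 ms
frame_ms = 1000 / 90            # ~11.11 ms
```

That 2-by-2 rotation per sample pair is trivially cheap; it's the binaural
render after it that eats the cycles.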

Note that native Unity audio can do most of this, but I believe 3Dception
has a proprietary binaural system they think is better, and of course they
are one of the very few applications that take B-format directly, even in
this enlightened age.

David
VVAudio



On Tue, Oct 27, 2015 at 7:27 AM, Dave Hunt <[email protected]>
wrote:

> Hi,
>
> This doesn't yet seem to be available.
>
> Part of it is not too radical: the use of binaural plug-ins for a DAW.
> These have been around in various guises for quite some time, but have not
> been widely taken up despite the huge number of listeners using headphones
> (especially on mobile devices) and the widespread availability of stereo
> (as opposed to truly multi-channel other than 5.1/7.1) audio DAWs.
> Two-channel plug-ins are generally compatible with all of them;
> multi-channel plug-ins are not.
>
> They do seem to offer some sort of binaural surface reflection algorithm,
> which must be on a per-source basis. A binaural global reverb algorithm is
> harder to implement, and the binaural plug-ins I have seen that include
> reverb implement it per instance. This can get DSP intensive, and makes
> it very tedious to make global changes.
>
> They do mention incorporation of head tracking, but details are vague.
>
> The whole scheme seems to be a re-working of their games engine plug-ins,
> where sound sources and the listener can move freely, to standard DAW
> plug-in formats. Using a DAW would be, at best, a simulation, with audio
> file delivery to games developers being discrete mono, stereo or surround
> stems as at present. The final binaural coding would be on an audio object
> or stem basis in the game engine.
>
> As far as I know you cannot undo binaural coding or spatially manipulate
> it (at least not easily and only with static sources and listener).
>
> Could be useful for binaural audio production (not exactly a huge market)
> and producing demonstrations for games audio using their existing products,
> but is not an answer to every problem. The prospect of communication with
> more interactive and non-linear audio software (e.g. Max, pd,
> SuperCollider, Live ??) would be much more interesting.
>
> Ciao,
>
> Dave Hunt
>
>
>   1. "Spatial Workstation" for 360° (VR) audio mixing/edition
>>       (Stefan Schreiber)
>>
>> From: Stefan Schreiber <[email protected]>
>> Date: 25 October 2015 19:40:41 GMT
>> To: Surround Sound discussion group <[email protected]>
>> Subject: [Sursound] "Spatial Workstation" for 360° (VR) audio
>> mixing/edition
>>
>>
>> FYI...
>>
>>
>> http://www.roadtovr.com/two-big-ears-spatial-workstation-delivers-realtime-cross-platform-3d-vr-audio-mixing/
>>
>>
>> 3D spatial audio specialists Two Big Ears have launched a new 'Spatial
>>> Workstation', a platform for audio engineers to mix and edit immersive
>>> audio with realtime feedback leveraging VR headset head-tracking
>>> information.
>>>
>>
>>
>> See "Workflow" picture...
>>
>>
>> The API is designed to provide an interface for developers to deliver
>>> accurate and compelling spatial audio, that is - audio which recreates a
>>> realistic sound-stage akin to the real world. This is particularly
>>> important for virtual reality as what you hear is a key trigger point for
>>> presence
>>>
>>
>>
>> Best,
>>
>> Stefan
>>
>
> _______________________________________________
> Sursound mailing list
> [email protected]
> https://mail.music.vt.edu/mailman/listinfo/sursound - unsubscribe here,
> edit account or options, view archives and so on.
>