Nodes represent the logical routing endpoints. Sinks and similar objects
can expose a node, but not all of them do. For example, we want to allow
routing one input node to multiple output nodes, and this might require
a combine sink to be created, but this automatically created combine
sink shouldn't appear as a node.
It seems to me that 'nodes' could also mean a change in audio profiles;
e.g. the headset is really an audio codec attribute, and routing to the
headset isn't necessarily a change of routing within PulseAudio.
The headset would have a port in PulseAudio, and that port would be
exposed as a node. Routing to that node causes the port to be activated.
So if I have a virtual sink with some sort of processing and I want to
hear the result on the headset, how does it work? It's not clear to me
that you can always think in terms of endpoints for routing. Likewise,
for something like acoustic echo cancellation, how would you trap the
echo reference?
There are also cases where an output cannot be used because of the setup
of another output, e.g. when it's physically muxed with something else
or it depends on a clock defined elsewhere. How are physical constraints
reported to the user?
That's a good question. Currently the constraints aren't visible to
clients. If a UI developer shows up and says that they need the
conflict information, then we'll have to come up with something. It
would be simple to just attach a list of conflicting nodes to the node
structure, but I'm afraid the conflicts aren't always that simple.
You have one right there for local outputs: most hardware provides
access to the headset or to the speaker, not to both. If this simple
conflict between 'nodes' can't be represented, then I wonder how
practical this solution is.
_______________________________________________
pulseaudio-discuss mailing list
[email protected]
http://lists.freedesktop.org/mailman/listinfo/pulseaudio-discuss