On Fri, 2018-04-06 at 14:05 +0200, Pali Rohár wrote:
> This opens a question about another problem: who, how and when should
> handle events from input devices? The integrated PS/2 keyboard on a
> laptop is an input device; an external USB (or PS/2) keyboard on a
> desktop is also an input device. And they could also have MIC_MUTE or
> MUTE buttons. Should these buttons mute all sound cards, or only those
> which are "integrated" (i.e. not Bluetooth headsets)? And who should
> handle these mute buttons? PulseAudio? A desktop hotkey daemon (e.g.
> KDE has its own)? Or some new daemon for these actions? Because
> currently KDE, for example, already handles mute buttons/keys on its
> own.
I think those (generic input devices which aren't associated with a
specific audio device) are fine as they are, handled by the desktop as
you say. But device-specific controls are different, and should probably
be handled through PulseAudio. Right now I have a Pidgin plugin¹ which
opens any USB HID it can find and then treats that as "the" headset for
a call, regardless of which audio device is actually in use. That is
clearly wrong, and is the underlying reason I started looking into how
to do this "properly".

> I agree with you, that mic mute button on USB/bluetooth headset should
> mute just microphone on that headset.

It's actually a bit more than that; muting was the main reason I started
looking into this in the first place. Conferencing protocols have a
network-level 'mute', where the software actually stops sending audio
frames. Previously, if a user muted the headset they wouldn't appear as
"muted" on the call; they'd just be sending audio frames which happened
to contain a lot of zeroes. And if the user was muted in software (by
the conference organiser or the client UI), pressing the headset mute
button didn't unmute them on the wire. Now it all works correctly.

For Bluetooth HSP devices, it looks like this might *already* work
through PA+GStreamer. There's a "mute" property on the gstpulsesrc
element, which I ought to be able to monitor and set within the Pidgin
GStreamer pipeline to coordinate with the protocol-level mute status.

Given that, it seems to make a lot of sense to do it that way for the
HID/jack devices too. It means the application doesn't have to attempt
to match the discovered HID/input devices to the audio device in use in
the pipeline, and it also makes the permissions on those device nodes
easier to handle.

¹ http://git.infradead.org/users/dwmw2/pidgin-headset.git
_______________________________________________
pulseaudio-discuss mailing list
firstname.lastname@example.org
https://lists.freedesktop.org/mailman/listinfo/pulseaudio-discuss