Hi,

Maarten Lankhorst wrote:
>>> - Align buffer size to a multiple of period size
>> How can you pass the tests with that? It's wrong with both capture 
>> (annoyingly IMHO) and playback.
>I only really need it for capture, rendering needs it too since the tests
>show that this is the case,

>and seems to be against the tests too that everything is a multiple of period.
What is the case? Where did you see that?

Native mmdevapi does not align GetBufferSize on period boundaries, neither for
render nor for capture. The current tests may not reveal it, because 500ms
happens to be a multiple of the 10ms period, but I have enough additional tests
in my git tree and log data from testbot to be sure about that.

I consider it unsafe to diverge from native when rendering.

>but there's nothing in the code that depends on it.
Good.

>For capture it's different, you need to keep track of packets somehow.
>[...] Having one packet that's not a period is a pain,

I felt the same pain with winecoreaudio. I think I'm going to agree (with 
myself) to disagree
(with mmdevapi), as standards would say, and align capture GetBufferSize on 
period boundaries.
This will considerably simplify the code. I've not changed winecoreaudio 
capture yet.

So I find it OK if winepulse does the same:
- capture buffer: a multiple of periods, if it simplifies packet handling;
- render buffer: exactly like native, not a multiple of the period (cf. MulDiv
in the tests).

And I'll change the capture tests to ensure they ask for buffer sizes that are
multiples of the period, so the divergence won't show up :-)

BTW, please use the MulDiv computations so as to minimize differences among the 
3-4 drivers.
Or exhibit tests that prove them wrong.

>I'd have to recheck on my windows pc
I forgot: don't be wary of returning a GetMixFormat with 6 channels if PA's
device is 5.1. Native does that too (we had a few logs on test.winehq with
that result). (It may be too early if dsound chokes on that, but that would be
a bug in dsound.)

>Is it really IsFormatSupported's job to deal with a WAVEFORMATEX struct
>with only cbSize and wFormatTag and it will get out something sane all the
>time, no matter how stupid the input?
I've never seen IsFormatSupported return anything other than GetMixFormat, and
that is EXTENSIBLE. For months I thought it would always return S_FALSE plus
GetMixFormat, no matter how stupid the input, but:
a) I've not tested this with ADPCM or other such stuff;
b) I've not tested channel variations with >2 channels, lacking a 5.1 or
similar card;
c) recent test.winehq results have one Vista machine return E_UNSUPP_FORMAT
    when the rate differed from GetMixFormat's. I have a patch in my queue to
have the render test accept that variation.

What I believe is:
1. During mmdevapi init, MS queries GetMixFormat from each working device and
caches the result.
2. Later, IsFormatSupported returning GetMixFormat is a cheap operation:
clone_format(cached_mix_format).
None of our drivers works this way.  Doing it would need some thinking about
dynamic reconfiguration and plug-in devices.

Regards,
 Jörg
