On Tue, Mar 12, 2002 at 09:08:12AM +0100, dadou wrote:
| 
| SetCodecAttr() seems to change values in the .avirc.
| Try changing the value of BitRate (hexadecimal)
| in your .avirc and try again.
| 
| If you don't want to edit your .avirc by hand,
| you can do it this way: run avicap,
| open the Configure menu,
| change the BitRate attribute of your codec, and then rerun your
| utility.
| 
| Normally, it will work.

That would mean I could no longer change the bitrate programmatically at
runtime (i.e., when running my utility I specify the bitrate as a
command-line option, rather than having to muck with all these extra
steps)!  That would be a major loss of functionality that existed
previously.
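
For reference, this is roughly what my utility does now.  A minimal
sketch only: the exact SetCodecAttr() signature, the CodecInfo::match()
lookup, and fccDIV3 are my assumptions about the avifile API from
memory, not verified against the tree:

/* Hypothetical: set the bitrate from a command-line option before
 * encoding.  SetCodecAttr() is the call under discussion; everything
 * else here is an assumption, not verified API. */
#include <cstdlib>
#include <avifile.h>
#include <videoencoder.h>

int main(int argc, char** argv)
{
    int bitrate = (argc > 1) ? atoi(argv[1]) : 800;   /* kbps */

    /* look up the codec to encode with (DivX low-motion here,
     * purely as an example) */
    const CodecInfo* ci = CodecInfo::match(fccDIV3);
    if (!ci)
        return 1;

    /* same attribute name the encoder later reads back via
     * RegQueryValueExA("BitRate", ...) */
    SetCodecAttr(*ci, "BitRate", bitrate);

    /* ... create the encoder and encode as usual ... */
    return 0;
}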

I don't think your statement is correct, though.  Processing _is_ reaching
this code

...
/* read the BitRate value back from avifile's emulated registry */
if (RegQueryValueExA(newkey, "BitRate", 0, 0, &bitrate, &count) == 0)
{
        double d = *(double*)(hnd+0x14c0);   /* old value stored in the codec handle */
        *(double*)(hnd+0x14c0) = bitrate;    /* poke the new bitrate into the handle */
        AVM_WRITE("Win32 video encoder", "BitRate %d  (%f)\n", bitrate, d);
        m_iBitrate = bitrate * 1000;         /* encoder's own copy (kbps -> bps, presumably) */
}
...

which sets the value in the codec (supposedly).  The output from the
AVM_WRITE does contain the new bitrate I originally set via SetCodecAttr
(that was part of the excerpt in my previous post).  I'm working under
the assumption that this is the correct way to set the bitrate and that
it isn't being unset somewhere else that I'm not seeing.
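
If that assumption holds, SetCodecAttr presumably just writes the value
into the same emulated registry key that the excerpt above reads back.
A rough sketch of that write path, mirroring the excerpt's style; the
key handle and the REG_DWORD type are my guesses, not code from the
tree:

...
/* hypothetical write side: what SetCodecAttr("BitRate", ...) would
 * have to do for the RegQueryValueExA() above to see the new value */
DWORD bitrate = 900;  /* kbps, e.g. taken from the command line */
RegSetValueExA(newkey, "BitRate", 0, REG_DWORD,
               (const BYTE*)&bitrate, sizeof(bitrate));
...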
