RE: [Flac-dev] RE: basic encoder help
I'm currently facing the same problem. I added the libFLAC++ libraries to my MSVC application. I implemented the same quality levels (0-8) as used in the FLAC frontend application, but the resulting files are remarkably different between my application and the FLAC frontend (although using the same settings). It did turn out to be something in my byte ordering in the end, as suggested. I tested the decompressed file against the original wav file and it checked out and played fine, but when I rearranged the byte ordering to use signed char instead of the unsigned chars I was feeding it, the compression ratio was greatly improved. As a quick hack I ended up with:

for (int i=0; i<2352; i+=4) { // 2352 because number of bytes in frame of CDDA
    sample[j++] = (((char)buffer[i+1] << 8) | (0x00ff & (int)(char)buffer[i]));
    sample[j++] = (((char)buffer[i+3] << 8) | (0x00ff & (int)(char)buffer[i+2]));
}

The (char) casts made all the difference. I'll change the software to use all signed chars, but this works for now. HTH.

Andrew

___
Flac-dev mailing list
[EMAIL PROTECTED]
http://lists.xiph.org/mailman/listinfo/flac-dev
Re: [Flac-dev] RE: basic encoder help
--- [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:
> I'm currently facing the same problem. I added the libFLAC++ libraries to my MSVC application. I implemented the same quality levels (0-8) as used in the FLAC frontend application, but the resulting files are remarkably different between my application and the FLAC frontend (although using the same settings).

This is hard to debug without having all the code... what if, for each of the FLAC::Encoder::*::set_*() functions, after calling init(), you call the equivalent get_*() function and see if you get back what you set?

Josh
RE: [Flac-dev] RE: basic encoder help
I was also playing with the signed/unsigned char thing, and it solved my problem too, although I don't understand why this makes a difference.

-----Original Message-----
From: Andrew Gatt [mailto:[EMAIL PROTECTED]]
Sent: Friday 5 November 2004 11:24
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Subject: RE: [Flac-dev] RE: basic encoder help

I'm currently facing the same problem. I added the libFLAC++ libraries to my MSVC application. I implemented the same quality levels (0-8) as used in the FLAC frontend application, but the resulting files are remarkably different between my application and the FLAC frontend (although using the same settings). It did turn out to be something in my byte ordering in the end, as suggested. I tested the decompressed file against the original wav file and it checked out and played fine, but when I rearranged the byte ordering to use signed char instead of the unsigned chars I was feeding it, the compression ratio was greatly improved. As a quick hack I ended up with:

for (int i=0; i<2352; i+=4) { // 2352 because number of bytes in frame of CDDA
    sample[j++] = (((char)buffer[i+1] << 8) | (0x00ff & (int)(char)buffer[i]));
    sample[j++] = (((char)buffer[i+3] << 8) | (0x00ff & (int)(char)buffer[i+2]));
}

The (char) casts made all the difference. I'll change the software to use all signed chars, but this works for now. HTH.

Andrew
RE: [Flac-dev] RE: basic encoder help
libFLAC requires that the input PCM samples be signed integers; if you were passing unsigned ints in, I'm surprised it didn't clip. BTW, are either of you guys the one working on FLAC support in plextools? Because I hear it has the same problem, i.e. lower-than-expected compression.

Josh

--- Saruman [EMAIL PROTECTED] wrote:
> I was also playing with the signed/unsigned char thing, and it solved my problem too, although I don't understand why this makes a difference.
>
> -----Original Message-----
> From: Andrew Gatt [mailto:[EMAIL PROTECTED]]
> Sent: Friday 5 November 2004 11:24
> To: [EMAIL PROTECTED]
> Cc: [EMAIL PROTECTED]
> Subject: RE: [Flac-dev] RE: basic encoder help
>
> I'm currently facing the same problem. I added the libFLAC++ libraries to my MSVC application. I implemented the same quality levels (0-8) as used in the FLAC frontend application, but the resulting files are remarkably different between my application and the FLAC frontend (although using the same settings). It did turn out to be something in my byte ordering in the end, as suggested. I tested the decompressed file against the original wav file and it checked out and played fine, but when I rearranged the byte ordering to use signed char instead of the unsigned chars I was feeding it, the compression ratio was greatly improved. As a quick hack I ended up with:
>
> for (int i=0; i<2352; i+=4) { // 2352 because number of bytes in frame of CDDA
>     sample[j++] = (((char)buffer[i+1] << 8) | (0x00ff & (int)(char)buffer[i]));
>     sample[j++] = (((char)buffer[i+3] << 8) | (0x00ff & (int)(char)buffer[i+2]));
> }
>
> The (char) casts made all the difference. I'll change the software to use all signed chars, but this works for now. HTH.
>
> Andrew