I was also playing with the signed/unsigned char thing, and it solved my
problem too.
Although I don't understand why it makes a difference?

-----Original Message-----
From: Andrew Gatt [mailto:[EMAIL PROTECTED] 
Sent: Friday, 5 November 2004 11:24
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Subject: RE: [Flac-dev] RE: basic encoder help

>I'm currently facing the same problem.
>I added the libFLAC++ libraries to my MSVC application.
>I implemented the same quality levels (0-8) as used in the FLAC frontend
>application.
>But the resulting files are remarkably different between my application
>and the FLAC frontend (although using the same settings).

It did turn out to be something in my byte ordering in the end, as suggested.
I tested the decompressed file against the original wav file and it checked
out and played fine, but when I rearranged the byte handling to use signed
chars instead of the unsigned chars I was feeding it, the compression ratio
was greatly improved. As a quick hack I ended up with:

for (int i = 0; i < 2352; i += 4) {   /* 2352 = bytes per CDDA frame */
        sample[j++] = ((char)buffer[i+1] << 8) | (0x00ff & (int)(char)buffer[i]);
        sample[j++] = ((char)buffer[i+3] << 8) | (0x00ff & (int)(char)buffer[i+2]);
}

The (char) casts made all the difference. I'll change the software to use
signed chars throughout, but this works for now.

HTH.

Andrew


_______________________________________________
Flac-dev mailing list
[EMAIL PROTECTED]
http://lists.xiph.org/mailman/listinfo/flac-dev
