Thanks Matt.  Hopefully we can add a new page to the Hadoop wiki on how to
use custom compression so that people won't have to search through the
threads to find the answer in the future.
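As a starting point for such a wiki page, here is a minimal sketch of the configuration involved. It assumes the hadoop-gpl-compression jar and its native libraries are already installed on the cluster; the `com.hadoop.compression.lzo.LzoCodec` class name comes from that project. On 0.19 all of these properties live in hadoop-site.xml (later releases split the file into core-site.xml and mapred-site.xml):

```xml
<!-- Register the compression codecs Hadoop should recognize.
     LzoCodec is only usable if the hadoop-gpl-compression jar and
     native libraries are on the cluster's classpath/library path. -->
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.GzipCodec,com.hadoop.compression.lzo.LzoCodec</value>
</property>

<!-- Compress job output with the LZO codec. -->
<property>
  <name>mapred.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapred.output.compression.codec</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>
```

A custom codec would be wired in the same way: add its fully qualified class name to `io.compression.codecs` and, if desired, point `mapred.output.compression.codec` at it.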

On Thu, Jun 4, 2009 at 10:33 AM, Matt Massie <[email protected]> wrote:

> Kris-
>
> You might take a look at some of the previous lzo threads on this list for
> help.
>
> See:
> http://www.mail-archive.com/search?q=lzo&l=core-user%40hadoop.apache.org
>
> -Matt
>
>
> On Jun 4, 2009, at 10:29 AM, Kris Jirapinyo wrote:
>
>> Is there any documentation on that site on how we can use lzo?  I don't
>> see any entries on the wiki page of the project.  I see an entry on the
>> Hadoop wiki (http://wiki.apache.org/hadoop/UsingLzoCompression) but it
>> seems like that's more oriented towards HBase.  I am on Hadoop 0.19.1.
>>
>> Thanks,
>> Kris J.
>>
>> On Thu, Jun 4, 2009 at 3:02 AM, Johan Oskarsson <[email protected]>
>> wrote:
>>
>>> We're still using LZO; it works great for those big log files:
>>> http://code.google.com/p/hadoop-gpl-compression/
>>>
>>> /Johan
>>>
>>> Kris Jirapinyo wrote:
>>>
>>>> Hi all,
>>>> In the remove-LZO JIRA ticket
>>>> https://issues.apache.org/jira/browse/HADOOP-4874 Tatu mentioned he was
>>>> going to port fastlz from C to Java and provide a patch.  Have there been
>>>> any updates on that?  Or is anyone working on any additional custom
>>>> compression codecs?
>>>>
>>>> Thanks,
>>>> Kris J.
>>>>
