Hi Jessica, I assume the exception is on the remote node?  Was the
TaskTracker (TT) restarted?  Did you try the 'add jar
/usr/lib/hadoop-0.20/lib/hadoop-lzo-20110217.jar' command from the Hive
command line to make sure it's a classpath issue?  Do you see
/usr/lib/hadoop-0.20/lib/hadoop-lzo-20110217.jar on the child classpath when
the task is executed (use 'ps aux' on the node)? -- Alex K
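(For reference, the classpath check above can be sketched roughly as below.
The jar path is the one from this thread; the helper function name and the
sample command line are illustrative assumptions, since the real check is
just running 'ps aux' on the worker node while a task executes.)

```shell
# Hedged sketch: test whether the hadoop-lzo jar appears on a child task
# JVM's classpath. In practice, pass it "$(ps aux)" captured on the worker
# node while the task is running; jar_on_classpath is a hypothetical name.
jar_on_classpath() {
  # $1 = process listing text (e.g. the output of `ps aux`)
  echo "$1" | grep -q 'hadoop-lzo-20110217\.jar' \
    && echo "found" || echo "missing"
}

# Example against a fake child-JVM command line:
jar_on_classpath "java -classpath /usr/lib/hadoop-0.20/lib/hadoop-lzo-20110217.jar org.apache.hadoop.mapred.Child"
# prints: found
```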

On Wed, Oct 5, 2011 at 12:35 PM, Jessica Owensby
<[email protected]>wrote:

> Hi Joey,
> Thanks. I forgot to say that; yes, the lzocodec class is listed in
> core-site.xml under the io.compression.codecs property:
>
> <property>
>   <name>io.compression.codecs</name>
>   <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec,org.apache.hadoop.io.compress.BZip2Codec</value>
> </property>
>
> I also added the mapred.child.env property to mapred-site.xml:
>
>  <property>
>    <name>mapred.child.env</name>
>    <value>JAVA_LIBRARY_PATH=/usr/lib/hadoop-0.20/lib</value>
>  </property>
>
> per these instructions:
>
> http://www.cloudera.com/blog/2009/11/hadoop-at-twitter-part-1-splittable-lzo-compression/
>
> After making each of these changes I have restarted the cluster --
> just to be sure that the new changes were being picked up.
>
> Jessica
>