Hi Eli,

Moving this to cdh-u...@cloudera.org as it's a CDH-specific question.
You'll get better answers from the community there. You are CC'd, but
to subscribe to the CDH users community, head to
https://groups.google.com/a/cloudera.org/forum/#!forum/cdh-user. I've
bcc'd common-user@ here.

What you may be hitting here is a version mismatch between the client
and the server. See
https://ccp.cloudera.com/display/CDHDOC/Known+Issues+and+Work+Arounds+in+CDH3#KnownIssuesandWorkAroundsinCDH3-Pig
(point #2 there; it is filed under Pig, but may not be Pig/Hive-specific).
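
A quick way to confirm: compare "hadoop version" on the box you run Pig
from against the cluster nodes, and look at what io.compression.codecs
is set to on the cluster. As a rough sketch only (the class list below
is just the stock codec set; what matters is whatever your cluster's
core-site.xml or core-default.xml actually carries), the property looks
like this:

  <!-- core-site.xml (sketch): codecs CompressionCodecFactory will try to
       load. A client whose hadoop jar doesn't contain one of these classes
       fails with "Compression codec ... not found". -->
  <property>
    <name>io.compression.codecs</name>
    <value>org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.BZip2Codec,org.apache.hadoop.io.compress.DeflateCodec</value>
  </property>

If that list (or the defaults baked into the newer jars on the cluster)
names DeflateCodec, but the hadoop jar bundled with your Pig client is
an older one that doesn't ship that class, the client fails with exactly
the error you're seeing.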

On Tue, Apr 3, 2012 at 3:54 AM, Eli Finkelshteyn <iefin...@gmail.com> wrote:
> Hi Folks,
> A coworker of mine recently set up a new CDH3 cluster with 4 machines (3 data
> nodes, one namenode that doubles as a jobtracker). I started looking through
> it using "hadoop fs -ls," and that went fine with everything displaying
> alright. Next, I decided to test out some simple pig jobs. Each of these
> worked fine on my development pseudo cluster, but failed on the new CDH3
> cluster with the exact same error:
>
> java.lang.IllegalArgumentException: Compression codec
> org.apache.hadoop.io.compress.DeflateCodec not found.
>
> This also only happened when trying to process .gz files, and it happened
> even when I just tried to load and dump one. I figured this could be a
> problem with compression configs being manually overwritten in
> core-site.xml, but that file didn't have any mention of compression on any
> of the boxes in the CDH3 cluster. I looked at each box individually, and all
> the proper jars seem to be there, so now I'm at a bit of a loss. Any ideas
> what the problem could be?
>
> Eli
>



-- 
Harsh J
