In the meantime, you might apply the patch in MAHOUT-1354, build Mahout
with mvn package -Phadoop2 -DskipTests=true, use that Mahout build, and
see if that works.
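
Roughly, the steps would look like the following. This is only a sketch:
the checkout URL, the patch file name, the -p level, and the 0.9-SNAPSHOT
version are assumptions on my part, so adjust them to whatever you
actually have.

    # grab the source (or use your existing checkout)
    svn co http://svn.apache.org/repos/asf/mahout/trunk mahout-trunk
    cd mahout-trunk

    # apply the patch downloaded from the MAHOUT-1354 JIRA page
    patch -p0 < MAHOUT-1354.patch

    # build against the hadoop2 profile, skipping tests
    mvn package -Phadoop2 -DskipTests=true

    # optionally install into your local repository (~/.m2) so that your
    # own project can depend on the freshly built snapshot artifacts
    mvn install -Phadoop2 -DskipTests=true

If you do the install step, point the mahout-core and mahout-integration
versions in your pom at the locally installed snapshot (for example
0.9-SNAPSHOT) instead of 0.8.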

Gokhan


On Wed, Dec 11, 2013 at 10:09 PM, Gokhan Capan <gkhn...@gmail.com> wrote:

> I apologize, Suneel is right: Counter breaks the binary compatibility.
>
> That said, work is in progress on building Mahout against hadoop2.
>
> Gokhan
>
>
> On Wed, Dec 11, 2013 at 10:03 PM, Hi There <srudamas...@yahoo.com> wrote:
>
>> Here are the full contents of my pom file:
>>
>> <project xmlns="http://maven.apache.org/POM/4.0.0"; xmlns:xsi="
>> http://www.w3.org/2001/XMLSchema-instance";
>>   xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
>> http://maven.apache.org/xsd/maven-4.0.0.xsd";>
>>   <modelVersion>4.0.0</modelVersion>
>>
>>   <groupId>clustertest</groupId>
>>   <artifactId>clustertest</artifactId>
>>   <version>1.0</version>
>>   <packaging>jar</packaging>
>>
>>   <name>clustertest</name>
>>   <url>http://maven.apache.org</url>
>>     <build>
>>         <plugins>
>>             <plugin>
>>                 <groupId>org.apache.maven.plugins</groupId>
>>                 <artifactId>maven-compiler-plugin</artifactId>
>>                 <version>2.3.2</version>
>>                 <configuration>
>>                     <source>1.7</source>
>>                     <target>1.7</target>
>>                 </configuration>
>>             </plugin>
>>         </plugins>
>>     </build>
>>     <properties>
>>     <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>>   </properties>
>>
>>   <dependencies>
>>     <dependency>
>>       <groupId>junit</groupId>
>>       <artifactId>junit</artifactId>
>>       <version>3.8.1</version>
>>       <scope>test</scope>
>>     </dependency>
>>     <dependency>
>>       <groupId>org.apache.mahout</groupId>
>>       <artifactId>mahout-core</artifactId>
>>       <version>0.8</version>
>>       <type>jar</type>
>>       <exclusions>
>>         <exclusion>
>>           <artifactId>hadoop-core</artifactId>
>>           <groupId>org.apache.hadoop</groupId>
>>         </exclusion>
>>       </exclusions>
>>     </dependency>
>>     <dependency>
>>       <groupId>com.google.code.gson</groupId>
>>       <artifactId>gson</artifactId>
>>       <version>2.2.4</version>
>>       <type>jar</type>
>>     </dependency>
>>     <dependency>
>>       <groupId>org.apache.mahout</groupId>
>>       <artifactId>mahout-integration</artifactId>
>>       <version>0.8</version>
>>       <type>jar</type>
>>       <exclusions>
>>         <exclusion>
>>           <artifactId>hadoop-core</artifactId>
>>           <groupId>org.apache.hadoop</groupId>
>>         </exclusion>
>>       </exclusions>
>>     </dependency>
>>     <dependency>
>>         <groupId>org.apache.hadoop</groupId>
>>         <artifactId>hadoop-client</artifactId>
>>         <version>2.2.0</version>
>>     </dependency>
>>   </dependencies>
>> </project>
>>
>> How/where do I check to see whether my hadoop cluster is 2.2.0? Sorry, I
>> am new at this.
>>
>>
>>
>>
>> On Wednesday, December 11, 2013 11:56 AM, Gokhan Capan <gkhn...@gmail.com>
>> wrote:
>>
>> Could you check the following?
>>
>> Are you sure that your hadoop cluster is hadoop 2.2.0?
>> Are you sure the other dependencies of your project do not have a
>> transitive dependency on Hadoop?
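>>
>> (For what it's worth, here is a quick way to check both, assuming you
>> have shell access to a cluster node; these are just the stock Hadoop and
>> Maven commands, nothing project-specific:)
>>
>>   # prints the version the cluster's Hadoop binaries were built from
>>   hadoop version
>>
>>   # lists every org.apache.hadoop artifact your project pulls in,
>>   # including transitive ones brought in by other dependencies
>>   mvn dependency:tree -Dincludes=org.apache.hadoop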
>>
>> Gokhan
>>
>>
>>
>> On Wed, Dec 11, 2013 at 9:46 PM, Hi There <srudamas...@yahoo.com> wrote:
>>
>> > I tried to run SparseVectorsFromSequenceFiles, specifying a directory
>> > with sequence files, and I got the following error:
>> >
>> > java.lang.Exception: java.lang.IncompatibleClassChangeError: Found
>> > interface org.apache.hadoop.mapreduce.Counter, but class was expected
>> >
>> > Here is a relevant snippet of my pom file for maven:
>> >
>> > <dependency>
>> >       <groupId>org.apache.mahout</groupId>
>> >       <artifactId>mahout-core</artifactId>
>> >       <version>0.8</version>
>> >       <type>jar</type>
>> >       <exclusions>
>> >         <exclusion>
>> >           <artifactId>hadoop-core</artifactId>
>> >           <groupId>org.apache.hadoop</groupId>
>> >         </exclusion>
>> >       </exclusions>
>> > </dependency>
>> > <dependency>
>> >       <groupId>org.apache.mahout</groupId>
>> >       <artifactId>mahout-integration</artifactId>
>> >       <version>0.8</version>
>> >       <type>jar</type>
>> >       <exclusions>
>> >         <exclusion>
>> >           <artifactId>hadoop-core</artifactId>
>> >           <groupId>org.apache.hadoop</groupId>
>> >         </exclusion>
>> >       </exclusions>
>> > </dependency>
>> > <dependency>
>> >         <groupId>org.apache.hadoop</groupId>
>> >         <artifactId>hadoop-client</artifactId>
>> >         <version>2.2.0</version>
>> > </dependency>
>> >
>> >
>> > How do I change this to use the correct version(s) of mahout/hadoop?
>> >
>> > Thanks!
>> >
>> >
>> >
>> >
>> > On Tuesday, December 10, 2013 1:03 PM, Gokhan Capan <gkhn...@gmail.com>
>> > wrote:
>> >
>> > I meant that you shouldn't need to modify Mahout's dependencies; just
>> > run mvn package and it should work against hadoop 2.2.0 (and yes,
>> > 2.2.0 is not an alpha).
>> >
>> > Quoting from
>> >
>> http://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapReduce_Compatibility_Hadoop1_Hadoop2.html
>> > "First, we ensure binary compatibility to the applications that use old
>> > mapred APIs. This means that applications which were built against MRv1
>> > mapred APIs can run directly on YARN without recompilation, merely by
>> > pointing them to an Apache Hadoop 2.x cluster via configuration."
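>> >
>> > ("Pointing via configuration" usually just means having the standard
>> > Hadoop 2 client configuration on the classpath. A minimal sketch, with
>> > placeholder host names, would be something like:
>> >
>> >   <!-- core-site.xml -->
>> >   <property>
>> >     <name>fs.defaultFS</name>
>> >     <value>hdfs://namenode-host:8020</value>
>> >   </property>
>> >
>> >   <!-- mapred-site.xml: submit MapReduce jobs to YARN -->
>> >   <property>
>> >     <name>mapreduce.framework.name</name>
>> >     <value>yarn</value>
>> >   </property>
>> >
>> > plus a yarn-site.xml with yarn.resourcemanager.hostname set to your
>> > resource manager.)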
>> >
>> > If you encounter any problems, just let the list know.
>> >
>> > Best
>> >
>> >
>> >
>> > > On Dec 9, 2013, at 9:40 PM, Hi There <srudamas...@yahoo.com> wrote:
>> > >
>> > > Hi Gokhan,
>> > >
>> > > My project currently fetches every dependency through Maven--is there
>> > > any way I can grab the version you mentioned that way?
>> > >
>> > > In that vein, I am using the following version of hadoop:
>> > > <dependency>
>> > >        <groupId>org.apache.hadoop</groupId>
>> > >        <artifactId>hadoop-client</artifactId>
>> > >        <version>2.2.0</version>
>> > > </dependency>
>> > >
>> > >
>> > > That's not alpha, right?
>> > >
>> > > Thanks!
>> > >
>> > >
>> > >
>> > >
>> > >
>> > > On Monday, December 9, 2013 10:05 AM, Gokhan Capan <gkhn...@gmail.com> wrote:
>> > >
>> > > Mahout actually should work with the hadoop-2 stable release without
>> > > recompiling, though not with the hadoop-2 alphas.
>> > >
>> > > We are, by the way, currently in the process of adding support for
>> > > building Mahout with hadoop-2.
>> > >
>> > > Please see MAHOUT-1354 for the relevant issue.
>> > >
>> > > Sent from my iPhone
>> > >
>> > >
>> > >> On Dec 9, 2013, at 19:54, Hi There <srudamas...@yahoo.com> wrote:
>> > >>
>> > >> Is Dec 2013 still the intended release date of the next mahout
>> > >> release that will be compatible with Hadoop 2.2.0?
>> > >>
>> > >>
>> > >>
>> > >>
>> > >> On Thursday, November 21, 2013 12:36 PM, Suneel Marthi <suneel_mar...@yahoo.com> wrote:
>> > >>
>> > >> Targeted for Dec 2013.
>> > >>
>> > >>
>> > >>
>> > >>
>> > >>
>> > >>
>> > >> On Thursday, November 21, 2013 3:26 PM, Hi There <srudamas...@yahoo.com> wrote:
>> > >>
>> > >> Thanks for the reply! Is there a timeline for when the next release
>> > >> will be?
>> > >>
>> > >>
>> > >> Thanks,
>> > >> Victor
>> > >>
>> > >>
>> > >>
>> > >>
>> > >> On Tuesday, November 19, 2013 7:30 PM, Suneel Marthi <suneel_mar...@yahoo.com> wrote:
>> > >>
>> > >> Hi Victor,
>> > >>
>> > >> Future releases of Mahout will support Hadoop 2.x; the present
>> > >> codebase still supports only Hadoop 1.x.
>> > >>
>> > >>
>> > >>
>> > >>
>> > >>
>> > >>
>> > >> On Tuesday, November 19, 2013 1:42 PM, Hi There <srudamas...@yahoo.com> wrote:
>> > >>
>> > >>
>> > >>
>> > >> Hello,
>> > >>
>> > >> I recently upgraded to Hadoop's newest release, and it seems one of
>> > >> its interfaces has changed; when I try to create sparse vectors from
>> > >> sequence files, I get the following exception:
>> > >>
>> > >> java.lang.IncompatibleClassChangeError: Found interface
>> > >> org.apache.hadoop.mapreduce.Counter, but class was expected
>> > >>
>> > >> I can include more of the stack trace if necessary.
>> > >>
>> > >> Are there any plans in the immediate future to upgrade mahout to be
>> > >> compatible with the newest hadoop release?
>> > >>
>> > >> Thanks,
>> > >> Victor
>> >
>>
>
>
