For any Hadoop 2.4 distro, yes, set hadoop.version but also set
-Phadoop-2.4. http://spark.apache.org/docs/latest/building-with-maven.html
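As a concrete sketch of Sean's advice (combining the -Phadoop-2.4 profile with an explicit hadoop.version), a build command might look like the following; the exact flags beyond those two are assumptions based on the standard Spark Maven build instructions, and the HDP version string is the one from this thread's subject:

```shell
# Hedged sketch: build Spark against a Hortonworks Hadoop 2.4 release.
# -Phadoop-2.4 selects compatible dependency versions (e.g. protobuf);
# -Dhadoop.version overrides the default Hadoop artifact version.
mvn -Phadoop-2.4 -Dhadoop.version=2.4.0.2.1.3.0-563 -DskipTests clean package
```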
On Mon, Aug 4, 2014 at 9:15 AM, Patrick Wendell pwend...@gmail.com wrote:
For hortonworks, I believe it should work to just link against the
corresponding Hadoop version.
Can you try building without any of the special `hadoop.version` flags,
using only -Phadoop-2.4? In the past users have reported issues trying to
build random spot versions... I think HW is supposed to be compatible with
the normal 2.4.0 build.
On Mon, Aug 4, 2014 at 8:35 AM,
Ah I see, yeah you might need to set hadoop.version and yarn.version. I
thought the profile set this automatically.
On Mon, Aug 4, 2014 at 10:02 AM, Ron's Yahoo! zlgonza...@yahoo.com wrote:
I meant yarn and hadoop defaulted to 1.0.4 so the yarn build fails since
1.0.4 doesn't exist for yarn...
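Patrick's suggestion above (setting both hadoop.version and yarn.version explicitly) would look roughly like this; the -Pyarn profile and the version string applied to yarn.version are assumptions based on the thread context:

```shell
# Sketch: override both properties so the YARN build does not fall back to
# the default 1.0.4 Hadoop version, for which no YARN artifacts exist.
mvn -Pyarn -Phadoop-2.4 \
    -Dhadoop.version=2.4.0.2.1.3.0-563 \
    -Dyarn.version=2.4.0.2.1.3.0-563 \
    -DskipTests clean package
```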
Subject: Re: Issues with HDP 2.4.0.2.1.3.0-563
What would such a profile do though? In general building for a
specific vendor version means setting hadoop.version and/or
yarn.version. Any hard-coded value is unlikely to match what a
particular user needs. Setting protobuf versions and so on is already
done by the generic profiles.
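For context, such a vendor profile would amount to hard-coding those same properties in the pom, something like the fragment below (the profile id and version string are hypothetical, chosen to match this thread), which is exactly why any fixed value is unlikely to match a given user's cluster:

```xml
<!-- Hypothetical vendor profile; the thread argues against adding one,
     since the hard-coded version rarely matches a particular cluster. -->
<profile>
  <id>hdp-2.1</id>
  <properties>
    <hadoop.version>2.4.0.2.1.3.0-563</hadoop.version>
    <yarn.version>2.4.0.2.1.3.0-563</yarn.version>
  </properties>
</profile>
```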
Hmm. Fair enough. I hadn't given that answer much thought and on
reflection think you're right in that a profile would just be a bad hack.
On 8/4/14, 10:35, Sean Owen so...@cloudera.com wrote: