In make-distribution.sh, there is the following check of the Java version:

if [[ ! "$JAVA_VERSION" =~ "1.6" && -z "$SKIP_JAVA_TEST" ]]; then
  echo "***NOTE***: JAVA_HOME is not set to a JDK 6 installation. The resulting"
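
A hedged sketch of how this check can be bypassed when building with a newer JDK; the JAVA_HOME path below is an assumption for illustration, and the profiles simply mirror the working command quoted later in this thread:

  # Hedged example: build with JDK 7/8 and skip the JDK 6 check above.
  # The JAVA_HOME path is assumed; adjust it to your installation.
  export JAVA_HOME=/usr/lib/jvm/java-8-oracle
  ./make-distribution.sh --tgz --skip-java-test \
    -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -DskipTests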

FYI

On Sat, Dec 27, 2014 at 1:31 AM, Sean Owen <so...@cloudera.com> wrote:

> Why do you need to skip java tests? I build the distro just fine with Java
> 8.
> On Dec 27, 2014 4:21 AM, "Ted Yu" <yuzhih...@gmail.com> wrote:
>
>> In case JDK 1.7 or higher is used to build, --skip-java-test needs to be
>> specified.
>>
>> FYI
>>
>> On Thu, Dec 25, 2014 at 5:03 PM, guxiaobo1982 <guxiaobo1...@qq.com>
>> wrote:
>>
>>> The following command works
>>>
>>> ./make-distribution.sh --tgz  -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4
>>> -Dhadoop.version=2.6.0 -Phive -DskipTests
>>>
>>> ------------------ Original ------------------
>>> *From: * "guxiaobo1982";<guxiaobo1...@qq.com>;
>>> *Send time:* Thursday, Dec 25, 2014 3:58 PM
>>> *To:* ""<guxiaobo1...@qq.com>; "Ted Yu"<yuzhih...@gmail.com>;
>>> *Cc:* "user@spark.apache.org"<user@spark.apache.org>;
>>> *Subject: * Re: How to build Spark against the latest
>>>
>>>
>>> What options should I use when running the make-distribution.sh script?
>>> I tried ./make-distribution.sh --hadoop.version 2.6.0 --with-yarn
>>> -with-hive --with-tachyon --tgz
>>> but nothing came out.
>>>
>>> Regards
>>>
>>> ------------------ Original ------------------
>>> *From: * "guxiaobo1982";<guxiaobo1...@qq.com>;
>>> *Send time:* Wednesday, Dec 24, 2014 6:52 PM
>>> *To:* "Ted Yu"<yuzhih...@gmail.com>;
>>> *Cc:* "user@spark.apache.org"<user@spark.apache.org>;
>>> *Subject: * Re: How to build Spark against the latest
>>>
>>> Hi Ted,
>>>      The referenced command works, but where can I get the deployable
>>> binaries?
>>>
>>> Xiaobo Gu
>>>
>>>
>>>
>>>
>>> ------------------ Original ------------------
>>> *From: * "Ted Yu";<yuzhih...@gmail.com>;
>>> *Send time:* Wednesday, Dec 24, 2014 12:09 PM
>>> *To:* ""<guxiaobo1...@qq.com>;
>>> *Cc:* "user@spark.apache.org"<user@spark.apache.org>;
>>> *Subject: * Re: How to build Spark against the latest
>>>
>>> See http://search-hadoop.com/m/JW1q5Cew0j
>>>
>>> On Tue, Dec 23, 2014 at 8:00 PM, guxiaobo1982 <guxiaobo1...@qq.com>
>>> wrote:
>>>
>>>> Hi,
>>>> The official pom.xml file only has a profile for hadoop version 2.4 as
>>>> the latest, but I installed hadoop version 2.6.0 with Ambari. How can I
>>>> build Spark against it, just by using mvn -Dhadoop.version=2.6.0, or by
>>>> making a corresponding profile for it?
>>>>
>>>> Regards,
>>>>
>>>> Xiaobo
>>>>
>>>
>>>
>>
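
For completeness, a hedged sketch of the direct Maven equivalent of the make-distribution.sh command quoted above (it only builds the jars, without packaging a distribution tarball):

  # Hedged sketch: reuse the hadoop-2.4 profile and override the Hadoop version
  # to 2.6.0, mirroring the make-distribution.sh flags shown earlier.
  mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -DskipTests clean package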
