I could be wrong, but I thought this was on purpose. At the time it
was set up, there was no Kafka build available for Scala 2.11, or one
of its dependencies wouldn't work with 2.11; I don't recall which.

But I'm not sure what the OP means by "maven doesn't build Spark's
dependencies" because Ted indicates it does, and of course you can see
that these artifacts are published.
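
On the last question quoted below (using Scala 2.11 from your own POM):
since the 2.11 artifacts are published with a _2.11 suffix, it should
just be a matter of depending on them directly. A minimal sketch, with
illustrative version numbers:

    <properties>
      <scala.binary.version>2.11</scala.binary.version>
    </properties>

    <dependencies>
      <!-- Spark core, built against Scala 2.11 -->
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>1.2.0</version>
      </dependency>
      <!-- Scala library matching the binary version -->
      <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.11.4</version>
      </dependency>
    </dependencies>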

On Sun, Jan 18, 2015 at 2:46 AM, Ted Yu <yuzhih...@gmail.com> wrote:
> There are 3 jars under the lib_managed/jars directory both with and
> without the -Dscala-2.11 flag.
>
> The difference between the scala-2.10 and scala-2.11 profiles is that the
> scala-2.10 profile has the following:
>       <modules>
>         <module>external/kafka</module>
>       </modules>
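>
> For context, a minimal sketch of how I'd expect those profiles to be
> declared in the root pom.xml; the property-based activation is my
> assumption, based on -Dscala-2.11 being passed on the command line:
>
>       <profile>
>         <id>scala-2.10</id>
>         <activation>
>           <property><name>!scala-2.11</name></property>
>         </activation>
>         <modules>
>           <module>external/kafka</module>
>         </modules>
>       </profile>
>       <profile>
>         <id>scala-2.11</id>
>         <activation>
>           <property><name>scala-2.11</name></property>
>         </activation>
>         <!-- no external/kafka module listed here -->
>       </profile>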
>
> FYI
>
> On Sat, Jan 17, 2015 at 4:07 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>> I did the following:
>>
>>   dev/change-version-to-2.11.sh
>>   mvn -DHADOOP_PROFILE=hadoop-2.4 -Pyarn,hive -Phadoop-2.4 \
>>     -Dscala-2.11 -DskipTests clean package
>>
>> And the mvn command passed.
>>
>> Did you see any cross-compilation errors?
>>
>> Cheers
>>
>> BTW, the two links you mentioned are consistent in terms of building for
>> Scala 2.11.
>>
>> On Sat, Jan 17, 2015 at 3:43 PM, Walrus theCat <walrusthe...@gmail.com>
>> wrote:
>>>
>>> Hi,
>>>
>>> When I run this:
>>>
>>> dev/change-version-to-2.11.sh
>>> mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
>>>
>>> as per here, maven doesn't build Spark's dependencies.
>>>
>>> Only when I run:
>>>
>>> dev/change-version-to-2.11.sh
>>> sbt/sbt -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
>>>
>>> as gathered from here, do I get Spark's dependencies built without any
>>> cross-compilation errors.
>>>
>>> Question:
>>>
>>> - How can I make maven do this?
>>>
>>> - How can I specify the use of Scala 2.11 in my own .pom files?
>>>
>>> Thanks
>>
>>
>
