Yes - I ran dev/change-version-to-2.11.sh, but was missing -Dscala-2.11 on the
mvn command after a -2.10 build. Building successfully again now after adding
that.
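For anyone hitting the same thing, the full sequence that ended up working for
me is roughly the following (the -Pyarn/-Phadoop-2.4 profiles are just the ones
from Imran's example below - adjust them to your own environment):

# rewrite the poms so every module uses the _2.11 artifact ids
dev/change-version-to-2.11.sh
# build with the 2.11 property so dependent modules resolve the _2.11 artifacts
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package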
On Tue, Apr 7, 2015 at 7:04 PM Imran Rashid wrote:
> did you run
>
> dev/change-version-to-2.11.sh
>
> before compiling? When I ran this o
did you run
dev/change-version-to-2.11.sh
before compiling? When I ran this on current master, it mostly worked:
dev/change-version-to-2.11.sh
mvn -Pyarn -Phadoop-2.4 -Pscala-2.11 -DskipTests clean package
There was a failure in building catalyst, but core built just fine for me.
The error I g
Hmm.. Make sure you are building with the right flags. I think you need to
pass -Dscala-2.11 to maven. Take a look at the upstream docs - I'm on my phone
right now so I can't easily access them.
On Apr 7, 2015 1:01 AM, "mjhb" wrote:
> I even deleted my local maven repository (.m2) but still stuck when
> attempti
I even deleted my local maven repository (.m2) but still stuck when
attempting to build w/ Scala-2.11:
[ERROR] Failed to execute goal on project spark-core_2.11: Could not resolve
dependencies for project
org.apache.spark:spark-core_2.11:jar:1.3.2-SNAPSHOT: The following artifacts
could not be res
The only thing that can persist outside of Spark is a live Zinc process
that is still running. We took care to make sure this was a generally
stateless mechanism.
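If you want to rule that out, something along these lines will show whether a
stray Zinc server is still alive (pkill is just one way to stop it - adjust to
taste):

ps aux | grep -i [z]inc   # look for a leftover zinc/nailgun server from build/mvn
pkill -f zinc             # stop it before the next build attempt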
Both the 1.2.X and 1.3.X releases are built with Scala 2.11 for
packaging purposes. And these have been built as recently as in the
last fe
I resorted to deleting the spark directory between each build earlier today
(attempting maximum sterility) and then re-cloning from github and switching
to the 1.2 or 1.3 branch.
Does anything persist outside of the spark directory?
Are you able to build either 1.2 or 1.3 w/ Scala-2.11?
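For reference, the reset I've been doing between attempts is roughly this
(branch names assuming the usual apache/spark branch-1.2 / branch-1.3 naming):

rm -rf spark                                   # start from nothing
git clone https://github.com/apache/spark.git
cd spark
git checkout branch-1.3                        # or branch-1.2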
One thing that I think can cause issues is if you run build/mvn with
Scala 2.10, then try to run it with 2.11, since I think we may store
some downloaded jars relating to zinc that will get screwed up. Not
sure that's what is happening, just an idea.
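If that is what happened, clearing out the tools that build/mvn downloaded
should reset it - something along these lines (directory names are from
memory, check what is actually sitting under build/ first):

# build/mvn caches its own maven, scala and zinc downloads under build/
rm -rf build/zinc-* build/scala-* build/apache-maven-*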
On Mon, Apr 6, 2015 at 10:54 PM, Patrick Wendell wrote:
The issue is that if you invoke "build/mvn" it will start zinc again
if it sees that it has been killed.
The absolute most "sterile" thing to do is this:
1. Kill any zinc processes.
2. Clean up spark "git clean -fdx" (WARNING: this will delete any
staged changes you have, if you have code modifications
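Putting the pieces from this thread together, a fully "sterile" 2.11 run would
look something like this (the build step itself is pulled from the other mails
in this thread, not part of the numbered list above):

pkill -f zinc                                  # 1. kill any zinc processes
git clean -fdx                                 # 2. WARNING: deletes anything not tracked by git
dev/change-version-to-2.11.sh                  # rewrite the poms for Scala 2.11
mvn -Dscala-2.11 -DskipTests clean package     # build with plain maven so zinc stays out of it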
I'm killing zinc (if it's running) before running each build attempt.
Trying to build as "clean" as possible.
On Mon, Apr 6, 2015 at 7:31 PM Patrick Wendell wrote:
> What if you don't run zinc? I.e. just download maven and run that "mvn
> package...". It might take longer, but I wonder if it w
What if you don't run zinc? I.e. just download maven and run that "mvn
package...". It might take longer, but I wonder if it will work.
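i.e. something along these lines, with a standalone maven instead of the
build/mvn wrapper (grab one from maven.apache.org or your package manager;
the commands assume it ends up on your PATH):

mvn -version                                   # plain maven, no build/mvn, so zinc never starts
dev/change-version-to-2.11.sh
mvn -Dscala-2.11 -DskipTests clean package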
On Mon, Apr 6, 2015 at 10:26 PM, mjhb wrote:
> Similar problem on 1.2 branch:
>
> [ERROR] Failed to execute goal on project spark-core_2.11: Could not resolve
>
Similar problem on 1.2 branch:
[ERROR] Failed to execute goal on project spark-core_2.11: Could not resolve
dependencies for project
org.apache.spark:spark-core_2.11:jar:1.2.3-SNAPSHOT: The following artifacts
could not be resolved:
org.apache.spark:spark-network-common_2.10:jar:1.2.3-SNAPSHOT,
or
$ dev/change-version-to-2.11.sh
$ build/mvn -e -DskipTests clean package
[ERROR] Failed to execute goal on project spark-core_2.11: Could not resolve
dependencies for project
org.apache.spark:spark-core_2.11:jar:1.3.2-SNAPSHOT: The following artifacts
could not be resolved:
org.apache.spark:spark-ne
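One quick sanity check - since the unresolved artifacts are the _2.10 ones
(see the 1.2 error above), something like this shows whether the version
script actually rewrote every pom:

# hits outside the scala-2.10 profile sections suggest a pom the script missed
grep -rn "_2.10" --include=pom.xml .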