[jira] [Updated] (SPARK-24074) Maven package resolver downloads javadoc instead of jar

2018-04-24 Thread Hyukjin Kwon (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-24074?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon updated SPARK-24074:
-
Priority: Major  (was: Critical)

Please avoid setting Critical+, which is usually reserved for committers. Does 
this consistently happen with other packages too?
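
If it helps, one rough way to check is to clear the local Ivy cache and resolve an 
unrelated coordinate (the coordinate below is just an arbitrary example; the 
spark-shell path is the one from the report):
{code:java}
# Clear the local Ivy cache so Spark has to resolve everything from Maven Central again
rm -rf ~/.ivy2

# Resolve an unrelated package; afterwards the files under ~/.ivy2/jars should be
# regular library jars rather than *-javadoc.jar artifacts
~/dev/spark-2.3.0-bin-hadoop2.7/bin/spark-shell --packages org.apache.commons:commons-lang3:3.5

# Inspect what was actually fetched
ls -l ~/.ivy2/jars/
{code}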

> Maven package resolver downloads javadoc instead of jar
> ---
>
> Key: SPARK-24074
> URL: https://issues.apache.org/jira/browse/SPARK-24074
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Submit
>Affects Versions: 2.3.0
>Reporter: Nadav Samet
>Priority: Major
>
> {code:java}
> // code placeholder
> {code}
> For some reason Spark downloads a javadoc artifact of a package instead of 
> the jar.
> Steps to reproduce:
> 1. Delete (or move) your local ~/.ivy2 cache to force Spark to resolve and 
> fetch artifacts from Maven Central:
> {code:java}
> rm -rf ~/.ivy2
> {code}
> 2. Run:
> {code:java}
> ~/dev/spark-2.3.0-bin-hadoop2.7/bin/spark-shell --packages 
> org.scalanlp:breeze_2.11:0.13.2{code}
> 3. Spark would download the javadoc instead of the jar:
> {code:java}
> downloading 
> https://repo1.maven.org/maven2/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1-javadoc.jar
>  ...
> [SUCCESSFUL ] 
> net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar 
> (610ms){code}
> 4. Later Spark would complain that it couldn't find the jar:
> {code:java}
> Warning: Local jar 
> /Users/thesamet/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar 
> does not exist, skipping.
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
> setLogLevel(newLevel).{code}
> 5. The dependency of breeze on f2j_arpack_combined seems fine: 
> [http://central.maven.org/maven2/org/scalanlp/breeze_2.11/0.13.2/breeze_2.11-0.13.2.pom]
>  
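
To narrow this down, a rough way to confirm which artifact Ivy actually fetched 
(the paths below are illustrative; the exact cache layout can vary):
{code:java}
# Locate every file Ivy downloaded for this module
find ~/.ivy2 -type f -name 'arpack_combined_all*'

# A javadoc jar contains HTML pages rather than .class files, so listing the
# contents of the fetched archive shows which artifact was really downloaded
unzip -l "$(find ~/.ivy2 -type f -name 'arpack_combined_all*.jar' | head -n 1)"
{code}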



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-24074) Maven package resolver downloads javadoc instead of jar

2018-04-24 Thread Nadav Samet (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-24074?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nadav Samet updated SPARK-24074:

Environment: (was: {code:java}
// code placeholder
{code})

> Maven package resolver downloads javadoc instead of jar
> ---
>
> Key: SPARK-24074
> URL: https://issues.apache.org/jira/browse/SPARK-24074
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Submit
>Affects Versions: 2.3.0
>Reporter: Nadav Samet
>Priority: Critical
>
> {code:java}
> // code placeholder
> {code}
> For some reason Spark downloads a javadoc artifact of a package instead of 
> the jar.
> Steps to reproduce:
> 1. Delete (or move) your local ~/.ivy2 cache to force Spark to resolve and 
> fetch artifacts from Maven Central:
> {code:java}
> rm -rf ~/.ivy2
> {code}
> 2. Run:
> {code:java}
> ~/dev/spark-2.3.0-bin-hadoop2.7/bin/spark-shell --packages 
> org.scalanlp:breeze_2.11:0.13.2{code}
> 3. Spark would download the javadoc instead of the jar:
> {code:java}
> downloading 
> https://repo1.maven.org/maven2/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1-javadoc.jar
>  ...
> [SUCCESSFUL ] 
> net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar 
> (610ms){code}
> 4. Later Spark would complain that it couldn't find the jar:
> {code:java}
> Warning: Local jar 
> /Users/thesamet/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar 
> does not exist, skipping.
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
> setLogLevel(newLevel).{code}
> 5. The dependency of breeze on f2j_arpack_combined seems fine: 
> [http://central.maven.org/maven2/org/scalanlp/breeze_2.11/0.13.2/breeze_2.11-0.13.2.pom]
>  






[jira] [Updated] (SPARK-24074) Maven package resolver downloads javadoc instead of jar

2018-04-24 Thread Nadav Samet (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-24074?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nadav Samet updated SPARK-24074:

Description: 
{code:java}
// code placeholder
{code}
For some reason Spark downloads a javadoc artifact of a package instead of 
the jar.

Steps to reproduce:
1. Delete (or move) your local ~/.ivy2 cache to force Spark to resolve and 
fetch artifacts from Maven Central:

{code:java}
rm -rf ~/.ivy2
{code}
2. Run:
{code:java}
~/dev/spark-2.3.0-bin-hadoop2.7/bin/spark-shell --packages 
org.scalanlp:breeze_2.11:0.13.2{code}
3. Spark would download the javadoc instead of the jar:
{code:java}
downloading 
https://repo1.maven.org/maven2/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1-javadoc.jar
 ...
[SUCCESSFUL ] 
net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar 
(610ms){code}
4. Later Spark would complain that it couldn't find the jar:
{code:java}
Warning: Local jar 
/Users/thesamet/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar does 
not exist, skipping.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
setLogLevel(newLevel).{code}
5. The dependency of breeze on f2j_arpack_combined seems fine: 
[http://central.maven.org/maven2/org/scalanlp/breeze_2.11/0.13.2/breeze_2.11-0.13.2.pom]
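
For comparison (an optional cross-check, assuming Maven is installed locally), 
fetching the artifact outside Spark should resolve the main jar rather than the 
javadoc classifier:
{code:java}
# Resolve the artifact with plain Maven; dependency:get fetches the main jar by default
mvn dependency:get -Dartifact=net.sourceforge.f2j:arpack_combined_all:0.1

# The resolved files land in the local Maven repository
ls -l ~/.m2/repository/net/sourceforge/f2j/arpack_combined_all/0.1/
{code}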

 

  was:
{code:java}
// code placeholder
{code}
For some reason Spark downloads a javadoc artifact of a package instead of 
the jar.

Steps to reproduce:
 # Delete (or move) your local ~/.ivy2 cache to force Spark to resolve and 
fetch artifacts from central:

{code:java}
rm -rf ~/.ivy2
{code}

 # Run:

{code:java}
~/dev/spark-2.3.0-bin-hadoop2.7/bin/spark-shell --packages 
org.scalanlp:breeze_2.11:0.13.2{code}

 # Spark would download the javadoc instead of the jar:
{code:java}
downloading 
https://repo1.maven.org/maven2/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1-javadoc.jar
 ...
[SUCCESSFUL ] 
net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar 
(610ms){code}

 # Later Spark would complain that it couldn't find the jar:

{code:java}
Warning: Local jar 
/Users/thesamet/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar does 
not exist, skipping.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
setLogLevel(newLevel).{code}

 # The dependency of breeze on f2j_arpack_combined seems fine: 
[http://central.maven.org/maven2/org/scalanlp/breeze_2.11/0.13.2/breeze_2.11-0.13.2.pom]

 


> Maven package resolver downloads javadoc instead of jar
> ---
>
> Key: SPARK-24074
> URL: https://issues.apache.org/jira/browse/SPARK-24074
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Submit
>Affects Versions: 2.3.0
> Environment: {code:java}
> // code placeholder
> {code}
>Reporter: Nadav Samet
>Priority: Critical
>
> {code:java}
> // code placeholder
> {code}
> For some reason Spark downloads a javadoc artifact of a package instead of 
> the jar.
> Steps to reproduce:
> 1. Delete (or move) your local ~/.ivy2 cache to force Spark to resolve and 
> fetch artifacts from Maven Central:
> {code:java}
> rm -rf ~/.ivy2
> {code}
> 2. Run:
> {code:java}
> ~/dev/spark-2.3.0-bin-hadoop2.7/bin/spark-shell --packages 
> org.scalanlp:breeze_2.11:0.13.2{code}
> 3. Spark would download the javadoc instead of the jar:
> {code:java}
> downloading 
> https://repo1.maven.org/maven2/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1-javadoc.jar
>  ...
> [SUCCESSFUL ] 
> net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar 
> (610ms){code}
> 4. Later Spark would complain that it couldn't find the jar:
> {code:java}
> Warning: Local jar 
> /Users/thesamet/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar 
> does not exist, skipping.
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
> setLogLevel(newLevel).{code}
> 5. The dependency of breeze on f2j_arpack_combined seems fine: 
> [http://central.maven.org/maven2/org/scalanlp/breeze_2.11/0.13.2/breeze_2.11-0.13.2.pom]
>  
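
A possible interim workaround (not a fix; the jar URL below simply follows the 
standard Maven Central layout for this coordinate) is to download the main 
artifact manually and pass it with --jars so the job does not rely on the 
mis-resolved dependency:
{code:java}
# Fetch the main (non-javadoc) artifact directly from Maven Central
curl -fLO https://repo1.maven.org/maven2/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1.jar

# Supply the jar explicitly alongside the --packages resolution
~/dev/spark-2.3.0-bin-hadoop2.7/bin/spark-shell \
  --packages org.scalanlp:breeze_2.11:0.13.2 \
  --jars arpack_combined_all-0.1.jar
{code}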


