Fwd: Change ivy cache for spark on Windows

2015-04-27 Thread Burak Yavuz
+user

-- Forwarded message --
From: Burak Yavuz 
Date: Mon, Apr 27, 2015 at 1:59 PM
Subject: Re: Change ivy cache for spark on Windows
To: mj 


Hi,

In your conf file (SPARK_HOME\conf\spark-defaults.conf) you can set:

`spark.jars.ivy \your\path`
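
For example, to point the Ivy cache at a directory whose path contains no
spaces (C:/spark_ivy below is just a placeholder; forward slashes avoid
escaping issues in the properties file):

    spark.jars.ivy C:/spark_ivy

The same property can also be passed per invocation with --conf instead of
editing the conf file:

    .\spark-shell.cmd --conf spark.jars.ivy=C:/spark_ivy --packages com.databricks:spark-csv_2.10:1.0.3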


Best,
Burak

On Mon, Apr 27, 2015 at 1:49 PM, mj  wrote:

> Hi,
>
> I'm having trouble using the --packages option with spark-shell.cmd. I have
> to use Windows at work and have been issued a username with a space in it,
> which means that when I use the --packages option it fails with this message:
>
> "Exception in thread "main" java.net.URISyntaxException: Illegal character
> in path at index 13: C:/Users/My Name/.ivy2/jars/spark-csv_2.10.jar"
>
> The command I'm trying to run is:
> .\spark-shell.cmd --packages com.databricks:spark-csv_2.10:1.0.3
>
> I've tried creating an ivysettings.xml file with the content below in my
> .ivy2 directory, but Spark doesn't seem to pick it up. Does anyone have any
> ideas on how to get around this issue?
>
> [ivysettings.xml contents were stripped by the list archive]
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Change-ivy-cache-for-spark-on-Windows-tp22675.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.


Change ivy cache for spark on Windows

2015-04-27 Thread mj
Hi,

I'm having trouble using the --packages option with spark-shell.cmd. I have
to use Windows at work and have been issued a username with a space in it,
which means that when I use the --packages option it fails with this message:

"Exception in thread "main" java.net.URISyntaxException: Illegal character
in path at index 13: C:/Users/My Name/.ivy2/jars/spark-csv_2.10.jar"

The command I'm trying to run is:
.\spark-shell.cmd --packages com.databricks:spark-csv_2.10:1.0.3

I've tried creating an ivysettings.xml file with the content below in my
.ivy2 directory, but Spark doesn't seem to pick it up. Does anyone have any
ideas on how to get around this issue?

[ivysettings.xml contents were stripped by the list archive]
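
Since the original attachment was lost, here is a rough sketch of what a
minimal ivysettings.xml that relocates Ivy's cache would typically look
like (the defaultCacheDir value is only a placeholder). Note that Spark's
--packages resolver does not appear to consult this file, which is why the
reply above points to the spark.jars.ivy property instead:

    <ivysettings>
      <!-- defaultCacheDir relocates Ivy's repository and resolution
           caches; the path here is only an example -->
      <caches defaultCacheDir="C:/spark_ivy/cache"/>
    </ivysettings>
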
--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Change-ivy-cache-for-spark-on-Windows-tp22675.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org