Re: Cannot Import Package (spark-csv)

2015-08-03 Thread Burak Yavuz
Hi, there was this issue for Scala 2.11:
https://issues.apache.org/jira/browse/SPARK-7944
It should be fixed on the master branch. You may be hitting that.
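
You can confirm whether that is what's happening by checking whether the data
source class is visible from the shell at all. A quick sketch, assuming the
spark-csv 1.x provider class is com.databricks.spark.csv.DefaultSource:

scala> Class.forName("com.databricks.spark.csv.DefaultSource")

If that throws a ClassNotFoundException, the jar pulled in by --packages never
made it onto the REPL classpath.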

Best,
Burak

On Sun, Aug 2, 2015 at 9:06 PM, Ted Yu yuzhih...@gmail.com wrote:

 I tried the following command on the master branch:
 bin/spark-shell --packages com.databricks:spark-csv_2.10:1.0.3 --jars
 ../spark-csv_2.10-1.0.3.jar --master local

 I didn't reproduce the error with your command.

 FYI

 On Sun, Aug 2, 2015 at 8:57 PM, Bill Chambers 
 wchamb...@ischool.berkeley.edu wrote:

 Sure, the commands are:

 scala> val df = sqlContext.read.format("com.databricks.spark.csv")
   .option("header", "true").load("cars.csv")

 and I get the following error:

 java.lang.RuntimeException: Failed to load class for data source:
 com.databricks.spark.csv
   at scala.sys.package$.error(package.scala:27)
   at
 org.apache.spark.sql.sources.ResolvedDataSource$.lookupDataSource(ddl.scala:220)
   at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:233)
   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:104)
   ... 49 elided

 On Sun, Aug 2, 2015 at 8:56 PM, Ted Yu yuzhih...@gmail.com wrote:

 The command you ran and the error you got were not visible.

 Mind sending them again?

 Cheers

 On Sun, Aug 2, 2015 at 8:33 PM, billchambers 
 wchamb...@ischool.berkeley.edu wrote:

 I am trying to import the spark-csv package while using the Scala Spark
 shell (Spark 1.4.1, Scala 2.11).

 I am starting the shell with:

 bin/spark-shell --packages com.databricks:spark-csv_2.11:1.1.0 --jars
 ../sjars/spark-csv_2.11-1.1.0.jar --master local


 I then try and run



 and get the following error:



 What am I doing wrong?



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-Import-Package-spark-csv-tp24109.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org





 --
 Bill Chambers
 http://billchambers.me/
 Email wchamb...@ischool.berkeley.edu | LinkedIn
 http://linkedin.com/in/wachambers | Twitter
 https://twitter.com/b_a_chambers | Github
 https://github.com/anabranch





Re: Cannot Import Package (spark-csv)

2015-08-03 Thread Burak Yavuz
In addition, you do not need to use --jars with --packages. --packages will
get the jar for you.
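
For example, Ted's invocation below should work the same without the --jars
flag; --packages resolves the artifact (and its transitive dependencies) and
adds it to the driver and executor classpaths:

bin/spark-shell --packages com.databricks:spark-csv_2.10:1.0.3 --master local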

Best,
Burak

On Mon, Aug 3, 2015 at 9:01 AM, Burak Yavuz brk...@gmail.com wrote:

 Hi, there was this issue for Scala 2.11:
 https://issues.apache.org/jira/browse/SPARK-7944
 It should be fixed on the master branch. You may be hitting that.

 Best,
 Burak

 On Sun, Aug 2, 2015 at 9:06 PM, Ted Yu yuzhih...@gmail.com wrote:

 I tried the following command on the master branch:
 bin/spark-shell --packages com.databricks:spark-csv_2.10:1.0.3 --jars
 ../spark-csv_2.10-1.0.3.jar --master local

 I didn't reproduce the error with your command.

 FYI

 On Sun, Aug 2, 2015 at 8:57 PM, Bill Chambers 
 wchamb...@ischool.berkeley.edu wrote:

 Sure, the commands are:

 scala> val df = sqlContext.read.format("com.databricks.spark.csv")
   .option("header", "true").load("cars.csv")

 and I get the following error:

 java.lang.RuntimeException: Failed to load class for data source:
 com.databricks.spark.csv
   at scala.sys.package$.error(package.scala:27)
   at
 org.apache.spark.sql.sources.ResolvedDataSource$.lookupDataSource(ddl.scala:220)
   at
 org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:233)
   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:104)
   ... 49 elided

 On Sun, Aug 2, 2015 at 8:56 PM, Ted Yu yuzhih...@gmail.com wrote:

 The command you ran and the error you got were not visible.

 Mind sending them again?

 Cheers

 On Sun, Aug 2, 2015 at 8:33 PM, billchambers 
 wchamb...@ischool.berkeley.edu wrote:

 I am trying to import the spark-csv package while using the Scala Spark
 shell (Spark 1.4.1, Scala 2.11).

 I am starting the shell with:

 bin/spark-shell --packages com.databricks:spark-csv_2.11:1.1.0 --jars
 ../sjars/spark-csv_2.11-1.1.0.jar --master local


 I then try and run



 and get the following error:



 What am I doing wrong?



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-Import-Package-spark-csv-tp24109.html
 Sent from the Apache Spark User List mailing list archive at
 Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org





 --
 Bill Chambers
 http://billchambers.me/
 Email wchamb...@ischool.berkeley.edu | LinkedIn
 http://linkedin.com/in/wachambers | Twitter
 https://twitter.com/b_a_chambers | Github
 https://github.com/anabranch






Cannot Import Package (spark-csv)

2015-08-02 Thread billchambers
I am trying to import the spark-csv package while using the Scala Spark
shell (Spark 1.4.1, Scala 2.11).

I am starting the shell with:

bin/spark-shell --packages com.databricks:spark-csv_2.11:1.1.0 --jars
../sjars/spark-csv_2.11-1.1.0.jar --master local


I then try and run



and get the following error:



What am I doing wrong?



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-Import-Package-spark-csv-tp24109.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Cannot Import Package (spark-csv)

2015-08-02 Thread Ted Yu
The command you ran and the error you got were not visible.

Mind sending them again?

Cheers

On Sun, Aug 2, 2015 at 8:33 PM, billchambers wchamb...@ischool.berkeley.edu
 wrote:

 I am trying to import the spark-csv package while using the Scala Spark
 shell (Spark 1.4.1, Scala 2.11).

 I am starting the shell with:

 bin/spark-shell --packages com.databricks:spark-csv_2.11:1.1.0 --jars
 ../sjars/spark-csv_2.11-1.1.0.jar --master local


 I then try and run



 and get the following error:



 What am I doing wrong?



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-Import-Package-spark-csv-tp24109.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org




Re: Cannot Import Package (spark-csv)

2015-08-02 Thread billchambers
Sure, the commands are:

scala> val df = sqlContext.read.format("com.databricks.spark.csv")
  .option("header", "true").load("cars.csv")

and I get the following error:

java.lang.RuntimeException: Failed to load class for data source:
com.databricks.spark.csv
  at scala.sys.package$.error(package.scala:27)
  at
org.apache.spark.sql.sources.ResolvedDataSource$.lookupDataSource(ddl.scala:220)
  at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:233)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:104)
  ... 49 elided



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-Import-Package-spark-csv-tp24109p24110.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Cannot Import Package (spark-csv)

2015-08-02 Thread Ted Yu
I tried the following command on the master branch:
bin/spark-shell --packages com.databricks:spark-csv_2.10:1.0.3 --jars
../spark-csv_2.10-1.0.3.jar --master local

I didn't reproduce the error with your command.
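
For reference, a working load on that build looks roughly like this (a sketch,
assuming a cars.csv with a header row in the current directory):

scala> val df = sqlContext.read.format("com.databricks.spark.csv").option("header", "true").load("cars.csv")
scala> df.printSchema()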

FYI

On Sun, Aug 2, 2015 at 8:57 PM, Bill Chambers 
wchamb...@ischool.berkeley.edu wrote:

 Sure, the commands are:

 scala> val df = sqlContext.read.format("com.databricks.spark.csv")
   .option("header", "true").load("cars.csv")

 and I get the following error:

 java.lang.RuntimeException: Failed to load class for data source:
 com.databricks.spark.csv
   at scala.sys.package$.error(package.scala:27)
   at
 org.apache.spark.sql.sources.ResolvedDataSource$.lookupDataSource(ddl.scala:220)
   at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:233)
   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:104)
   ... 49 elided

 On Sun, Aug 2, 2015 at 8:56 PM, Ted Yu yuzhih...@gmail.com wrote:

 The command you ran and the error you got were not visible.

 Mind sending them again?

 Cheers

 On Sun, Aug 2, 2015 at 8:33 PM, billchambers 
 wchamb...@ischool.berkeley.edu wrote:

 I am trying to import the spark-csv package while using the Scala Spark
 shell (Spark 1.4.1, Scala 2.11).

 I am starting the shell with:

 bin/spark-shell --packages com.databricks:spark-csv_2.11:1.1.0 --jars
 ../sjars/spark-csv_2.11-1.1.0.jar --master local


 I then try and run



 and get the following error:



 What am I doing wrong?



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-Import-Package-spark-csv-tp24109.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org





 --
 Bill Chambers
 http://billchambers.me/
 Email wchamb...@ischool.berkeley.edu | LinkedIn
 http://linkedin.com/in/wachambers | Twitter
 https://twitter.com/b_a_chambers | Github https://github.com/anabranch