[ 
https://issues.apache.org/jira/browse/SPARK-9555?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14653109#comment-14653109
 ] 

Varadharajan commented on SPARK-9555:
-------------------------------------

[~srowen] Thanks for your response.

I tried deleting ~/.ivy2 and restarting the spark shell. I don't see that sbt 
error anymore, but I'm still not able to load the 
"com.databricks.spark.csv" data source. 

shell { ~/projects/spark/distro/spark-1.4.1 }-> bin/spark-shell  --packages 
com.databricks:spark-csv_2.11:1.1.0 --master "local"
Ivy Default Cache set to: /Users/varadham/.ivy2/cache
The jars for the packages stored in: /Users/varadham/.ivy2/jars
:: loading settings :: url = 
jar:file:/Users/varadham/projects/spark/distro/spark-1.4.1/assembly/target/scala-2.11/spark-assembly-1.4.1-hadoop2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.databricks#spark-csv_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found com.databricks#spark-csv_2.11;1.1.0 in central
        found org.apache.commons#commons-csv;1.1 in central
        found com.univocity#univocity-parsers;1.5.1 in central
downloading 
https://repo1.maven.org/maven2/com/databricks/spark-csv_2.11/1.1.0/spark-csv_2.11-1.1.0.jar
 ...
        [SUCCESSFUL ] com.databricks#spark-csv_2.11;1.1.0!spark-csv_2.11.jar 
(467ms)
downloading 
https://repo1.maven.org/maven2/org/apache/commons/commons-csv/1.1/commons-csv-1.1.jar
 ...
        [SUCCESSFUL ] org.apache.commons#commons-csv;1.1!commons-csv.jar (167ms)
downloading 
https://repo1.maven.org/maven2/com/univocity/univocity-parsers/1.5.1/univocity-parsers-1.5.1.jar
 ...
        [SUCCESSFUL ] 
com.univocity#univocity-parsers;1.5.1!univocity-parsers.jar (478ms)
:: resolution report :: resolve 4512ms :: artifacts dl 1114ms
        :: modules in use:
        com.databricks#spark-csv_2.11;1.1.0 from central in [default]
        com.univocity#univocity-parsers;1.5.1 from central in [default]
        org.apache.commons#commons-csv;1.1 from central in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   3   |   3   |   3   |   0   ||   3   |   3   |
        ---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        3 artifacts copied, 0 already retrieved (264kB/5ms)
log4j:WARN No appenders could be found for logger 
(org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more 
info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/08/04 11:16:52 INFO SecurityManager: Changing view acls to: varadham
15/08/04 11:16:52 INFO SecurityManager: Changing modify acls to: varadham
15/08/04 11:16:52 INFO SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(varadham); users 
with modify permissions: Set(varadham)
15/08/04 11:16:53 INFO HttpServer: Starting HTTP Server
15/08/04 11:16:53 INFO Utils: Successfully started service 'HTTP server' on 
port 60782.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.1
      /_/

Using Scala version 2.11.6 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05)
Type in expressions to have them evaluated.
Type :help for more information.
15/08/04 11:16:56 INFO Main: Spark class server started at 
http://172.18.56.195:60782
15/08/04 11:16:56 INFO SparkContext: Running Spark version 1.4.1
15/08/04 11:16:56 INFO SecurityManager: Changing view acls to: varadham
15/08/04 11:16:56 INFO SecurityManager: Changing modify acls to: varadham
15/08/04 11:16:56 INFO SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(varadham); users 
with modify permissions: Set(varadham)
15/08/04 11:16:56 INFO Slf4jLogger: Slf4jLogger started
15/08/04 11:16:56 INFO Remoting: Starting remoting
15/08/04 11:16:56 INFO Remoting: Remoting started; listening on addresses 
:[akka.tcp://[email protected]:60783]
15/08/04 11:16:56 INFO Utils: Successfully started service 'sparkDriver' on 
port 60783.
15/08/04 11:16:56 INFO SparkEnv: Registering MapOutputTracker
15/08/04 11:16:56 INFO SparkEnv: Registering BlockManagerMaster
15/08/04 11:16:56 INFO DiskBlockManager: Created local directory at 
/private/var/folders/y9/s2j35hkn1jz4nxygvx24bhvc0000gn/T/spark-a1eae3fc-27cc-4eb1-b469-debad9349e93/blockmgr-31ab8d35-15a4-4122-b056-66ee09c4fed1
15/08/04 11:16:56 INFO MemoryStore: MemoryStore started with capacity 265.1 MB
15/08/04 11:16:56 INFO HttpFileServer: HTTP File server directory is 
/private/var/folders/y9/s2j35hkn1jz4nxygvx24bhvc0000gn/T/spark-a1eae3fc-27cc-4eb1-b469-debad9349e93/httpd-4e27bf92-fbff-482a-bf22-34621039e95b
15/08/04 11:16:56 INFO HttpServer: Starting HTTP Server
15/08/04 11:16:56 INFO Utils: Successfully started service 'HTTP file server' 
on port 60784.
15/08/04 11:16:56 INFO SparkEnv: Registering OutputCommitCoordinator
15/08/04 11:16:56 INFO Utils: Successfully started service 'SparkUI' on port 
4040.
15/08/04 11:16:56 INFO SparkUI: Started SparkUI at http://172.18.56.195:4040
15/08/04 11:16:56 INFO SparkContext: Added JAR 
file:/Users/varadham/.ivy2/jars/com.databricks_spark-csv_2.11-1.1.0.jar at 
http://172.18.56.195:60784/jars/com.databricks_spark-csv_2.11-1.1.0.jar with 
timestamp 1438667216841
15/08/04 11:16:56 INFO SparkContext: Added JAR 
file:/Users/varadham/.ivy2/jars/org.apache.commons_commons-csv-1.1.jar at 
http://172.18.56.195:60784/jars/org.apache.commons_commons-csv-1.1.jar with 
timestamp 1438667216842
15/08/04 11:16:56 INFO SparkContext: Added JAR 
file:/Users/varadham/.ivy2/jars/com.univocity_univocity-parsers-1.5.1.jar at 
http://172.18.56.195:60784/jars/com.univocity_univocity-parsers-1.5.1.jar with 
timestamp 1438667216842
15/08/04 11:16:56 INFO Executor: Starting executor ID driver on host localhost
15/08/04 11:16:56 INFO Executor: Using REPL class URI: 
http://172.18.56.195:60782
15/08/04 11:16:56 INFO Utils: Successfully started service 
'org.apache.spark.network.netty.NettyBlockTransferService' on port 60785.
15/08/04 11:16:56 INFO NettyBlockTransferService: Server created on 60785
15/08/04 11:16:56 INFO BlockManagerMaster: Trying to register BlockManager
15/08/04 11:16:56 INFO BlockManagerMasterEndpoint: Registering block manager 
localhost:60785 with 265.1 MB RAM, BlockManagerId(driver, localhost, 60785)
15/08/04 11:16:56 INFO BlockManagerMaster: Registered BlockManager
15/08/04 11:16:57 INFO Main: Created spark context..
Spark context available as sc.
15/08/04 11:16:57 INFO Main: Created sql context..
SQL context available as sqlContext.



scala> val df = sqlContext.load("com.databricks.spark.csv", Map("path" -> 
"cars.csv", "header" -> "true"))
warning: there was one deprecation warning; re-run with -deprecation for details
java.lang.RuntimeException: Failed to load class for data source: 
com.databricks.spark.csv
  at scala.sys.package$.error(package.scala:27)
  at 
org.apache.spark.sql.sources.ResolvedDataSource$.lookupDataSource(ddl.scala:220)
  at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:233)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
  at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1253)
  ... 49 elided
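For what it's worth, the deprecation warning above is because sqlContext.load was deprecated in 1.4 in favor of the DataFrameReader API. The equivalent non-deprecated call (a sketch; DataFrameReader.load appears in the same stack trace, so it goes through the same ResolvedDataSource.lookupDataSource path and presumably fails the same way while the spark-csv classes are not visible to the REPL) would be:

```scala
// Non-deprecated equivalent of sqlContext.load(...) above.
// Goes through the same ResolvedDataSource.lookupDataSource path,
// so it hits the same "Failed to load class for data source" error
// as long as the package jars are not on the REPL classpath.
val df = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("header", "true")
  .load("cars.csv")
```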



> Cannot use spark-csv in spark-shell
> -----------------------------------
>
>                 Key: SPARK-9555
>                 URL: https://issues.apache.org/jira/browse/SPARK-9555
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.4.1
>            Reporter: Varadharajan
>
> I wanted to use spark-csv inside spark-shell, and this is the failure.
> I'm currently running Spark 1.4.1 built with Scala 2.11.
> shell-> bin/spark-shell  --total-executor-cores 4 --packages 
> com.databricks:spark-csv_2.11:1.1.0 --master "local"
> Ivy Default Cache set to: /Users/varadham/.ivy2/cache
> The jars for the packages stored in: /Users/varadham/.ivy2/jars
> :: loading settings :: url = 
> jar:file:/Users/varadham/projects/spark/distro/spark-1.4.1/assembly/target/scala-2.11/spark-assembly-1.4.1-hadoop2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
> com.databricks#spark-csv_2.11 added as a dependency
> :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
>       confs: [default]
>       found com.databricks#spark-csv_2.11;1.1.0 in central
>       found org.apache.commons#commons-csv;1.1 in list
>       found com.univocity#univocity-parsers;1.5.1 in list
> :: resolution report :: resolve 132ms :: artifacts dl 3ms
>       :: modules in use:
>       com.databricks#spark-csv_2.11;1.1.0 from central in [default]
>       com.univocity#univocity-parsers;1.5.1 from list in [default]
>       org.apache.commons#commons-csv;1.1 from list in [default]
>       ---------------------------------------------------------------------
>       |                  |            modules            ||   artifacts   |
>       |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>       ---------------------------------------------------------------------
>       |      default     |   3   |   0   |   0   |   0   ||   3   |   0   |
>       ---------------------------------------------------------------------
> :: problems summary ::
> :::: ERRORS
>       unknown resolver sbt-chain
>       unknown resolver sbt-chain
> :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
> :: retrieving :: org.apache.spark#spark-submit-parent
>       confs: [default]
>       0 artifacts copied, 3 already retrieved (0kB/7ms)
> log4j:WARN No appenders could be found for logger 
> (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more 
> info.
> Using Spark's default log4j profile: 
> org/apache/spark/log4j-defaults.properties
> 15/08/03 15:43:41 INFO SecurityManager: Changing view acls to: varadham
> 15/08/03 15:43:41 INFO SecurityManager: Changing modify acls to: varadham
> 15/08/03 15:43:41 INFO SecurityManager: SecurityManager: authentication 
> disabled; ui acls disabled; users with view permissions: Set(varadham); users 
> with modify permissions: Set(varadham)
> 15/08/03 15:43:41 INFO HttpServer: Starting HTTP Server
> 15/08/03 15:43:41 INFO Utils: Successfully started service 'HTTP server' on 
> port 51112.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 1.4.1
>       /_/
> Using Scala version 2.11.6 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05)
> Type in expressions to have them evaluated.
> Type :help for more information.
> 15/08/03 15:43:44 INFO Main: Spark class server started at 
> http://172.18.56.195:51112
> 15/08/03 15:43:44 INFO SparkContext: Running Spark version 1.4.1
> 15/08/03 15:43:44 INFO SecurityManager: Changing view acls to: varadham
> 15/08/03 15:43:44 INFO SecurityManager: Changing modify acls to: varadham
> 15/08/03 15:43:44 INFO SecurityManager: SecurityManager: authentication 
> disabled; ui acls disabled; users with view permissions: Set(varadham); users 
> with modify permissions: Set(varadham)
> 15/08/03 15:43:44 INFO Slf4jLogger: Slf4jLogger started
> 15/08/03 15:43:45 INFO Remoting: Starting remoting
> 15/08/03 15:43:45 INFO Remoting: Remoting started; listening on addresses 
> :[akka.tcp://[email protected]:51118]
> 15/08/03 15:43:45 INFO Utils: Successfully started service 'sparkDriver' on 
> port 51118.
> 15/08/03 15:43:45 INFO SparkEnv: Registering MapOutputTracker
> 15/08/03 15:43:45 INFO SparkEnv: Registering BlockManagerMaster
> 15/08/03 15:43:45 INFO DiskBlockManager: Created local directory at 
> /private/var/folders/y9/s2j35hkn1jz4nxygvx24bhvc0000gn/T/spark-ea4df8ca-5d33-424b-ac1e-b71bc21cd098/blockmgr-e26e602e-6615-4d2f-94f5-dff0b4f6ea78
> 15/08/03 15:43:45 INFO MemoryStore: MemoryStore started with capacity 265.1 MB
> 15/08/03 15:43:45 INFO HttpFileServer: HTTP File server directory is 
> /private/var/folders/y9/s2j35hkn1jz4nxygvx24bhvc0000gn/T/spark-ea4df8ca-5d33-424b-ac1e-b71bc21cd098/httpd-df0e2057-7445-4129-9192-d6d86c26b765
> 15/08/03 15:43:45 INFO HttpServer: Starting HTTP Server
> 15/08/03 15:43:45 INFO Utils: Successfully started service 'HTTP file server' 
> on port 51119.
> 15/08/03 15:43:45 INFO SparkEnv: Registering OutputCommitCoordinator
> 15/08/03 15:43:45 INFO Utils: Successfully started service 'SparkUI' on port 
> 4040.
> 15/08/03 15:43:45 INFO SparkUI: Started SparkUI at http://172.18.56.195:4040
> 15/08/03 15:43:45 INFO SparkContext: Added JAR 
> file:/Users/varadham/.ivy2/jars/com.databricks_spark-csv_2.11-1.1.0.jar at 
> http://172.18.56.195:51119/jars/com.databricks_spark-csv_2.11-1.1.0.jar with 
> timestamp 1438596825328
> 15/08/03 15:43:45 INFO SparkContext: Added JAR 
> file:/Users/varadham/.ivy2/jars/org.apache.commons_commons-csv-1.1.jar at 
> http://172.18.56.195:51119/jars/org.apache.commons_commons-csv-1.1.jar with 
> timestamp 1438596825329
> 15/08/03 15:43:45 INFO SparkContext: Added JAR 
> file:/Users/varadham/.ivy2/jars/com.univocity_univocity-parsers-1.5.1.jar at 
> http://172.18.56.195:51119/jars/com.univocity_univocity-parsers-1.5.1.jar 
> with timestamp 1438596825330
> 15/08/03 15:43:45 INFO Executor: Starting executor ID driver on host localhost
> 15/08/03 15:43:45 INFO Executor: Using REPL class URI: 
> http://172.18.56.195:51112
> 15/08/03 15:43:45 INFO Utils: Successfully started service 
> 'org.apache.spark.network.netty.NettyBlockTransferService' on port 51121.
> 15/08/03 15:43:45 INFO NettyBlockTransferService: Server created on 51121
> 15/08/03 15:43:45 INFO BlockManagerMaster: Trying to register BlockManager
> 15/08/03 15:43:45 INFO BlockManagerMasterEndpoint: Registering block manager 
> localhost:51121 with 265.1 MB RAM, BlockManagerId(driver, localhost, 51121)
> 15/08/03 15:43:45 INFO BlockManagerMaster: Registered BlockManager
> 15/08/03 15:43:45 INFO Main: Created spark context..
> Spark context available as sc.
> 15/08/03 15:43:46 INFO Main: Created sql context..
> SQL context available as sqlContext.
> scala> import com.databricks.spark._
> <console>:20: error: object databricks is not a member of package com
>        import com.databricks.spark._



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
