[ https://issues.apache.org/jira/browse/SPARK-9555?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-9555.
------------------------------
    Resolution: Not A Problem

I think this is a question about usage of that library then, which is not part of Spark. There is probably another step required to register the data source; a usage sketch follows the quoted report below.

> Cannot use spark-csv in spark-shell
> -----------------------------------
>
>                 Key: SPARK-9555
>                 URL: https://issues.apache.org/jira/browse/SPARK-9555
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.4.1
>            Reporter: Varadharajan
>
> I wanted to use spark-csv inside spark-shell, and this is the failure I get.
> I'm currently running Spark 1.4.1 built with Scala 2.11.
> shell-> bin/spark-shell --total-executor-cores 4 --packages com.databricks:spark-csv_2.11:1.1.0 --master "local"
> Ivy Default Cache set to: /Users/varadham/.ivy2/cache
> The jars for the packages stored in: /Users/varadham/.ivy2/jars
> :: loading settings :: url = jar:file:/Users/varadham/projects/spark/distro/spark-1.4.1/assembly/target/scala-2.11/spark-assembly-1.4.1-hadoop2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
> com.databricks#spark-csv_2.11 added as a dependency
> :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
> 	confs: [default]
> 	found com.databricks#spark-csv_2.11;1.1.0 in central
> 	found org.apache.commons#commons-csv;1.1 in list
> 	found com.univocity#univocity-parsers;1.5.1 in list
> :: resolution report :: resolve 132ms :: artifacts dl 3ms
> 	:: modules in use:
> 	com.databricks#spark-csv_2.11;1.1.0 from central in [default]
> 	com.univocity#univocity-parsers;1.5.1 from list in [default]
> 	org.apache.commons#commons-csv;1.1 from list in [default]
> 	---------------------------------------------------------------------
> 	|                  |            modules            ||   artifacts   |
> 	|       conf      | number| search|dwnlded|evicted|| number|dwnlded|
> 	---------------------------------------------------------------------
> 	|      default    |   3   |   0   |   0   |   0   ||   3   |   0   |
> 	---------------------------------------------------------------------
> :: problems summary ::
> :::: ERRORS
> 	unknown resolver sbt-chain
> 	unknown resolver sbt-chain
> :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
> :: retrieving :: org.apache.spark#spark-submit-parent
> 	confs: [default]
> 	0 artifacts copied, 3 already retrieved (0kB/7ms)
> log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> 15/08/03 15:43:41 INFO SecurityManager: Changing view acls to: varadham
> 15/08/03 15:43:41 INFO SecurityManager: Changing modify acls to: varadham
> 15/08/03 15:43:41 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(varadham); users with modify permissions: Set(varadham)
> 15/08/03 15:43:41 INFO HttpServer: Starting HTTP Server
> 15/08/03 15:43:41 INFO Utils: Successfully started service 'HTTP server' on port 51112.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/ '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 1.4.1
>       /_/
> Using Scala version 2.11.6 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05)
> Type in expressions to have them evaluated.
> Type :help for more information.
> 15/08/03 15:43:44 INFO Main: Spark class server started at http://172.18.56.195:51112
> 15/08/03 15:43:44 INFO SparkContext: Running Spark version 1.4.1
> 15/08/03 15:43:44 INFO SecurityManager: Changing view acls to: varadham
> 15/08/03 15:43:44 INFO SecurityManager: Changing modify acls to: varadham
> 15/08/03 15:43:44 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(varadham); users with modify permissions: Set(varadham)
> 15/08/03 15:43:44 INFO Slf4jLogger: Slf4jLogger started
> 15/08/03 15:43:45 INFO Remoting: Starting remoting
> 15/08/03 15:43:45 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@172.18.56.195:51118]
> 15/08/03 15:43:45 INFO Utils: Successfully started service 'sparkDriver' on port 51118.
> 15/08/03 15:43:45 INFO SparkEnv: Registering MapOutputTracker
> 15/08/03 15:43:45 INFO SparkEnv: Registering BlockManagerMaster
> 15/08/03 15:43:45 INFO DiskBlockManager: Created local directory at /private/var/folders/y9/s2j35hkn1jz4nxygvx24bhvc0000gn/T/spark-ea4df8ca-5d33-424b-ac1e-b71bc21cd098/blockmgr-e26e602e-6615-4d2f-94f5-dff0b4f6ea78
> 15/08/03 15:43:45 INFO MemoryStore: MemoryStore started with capacity 265.1 MB
> 15/08/03 15:43:45 INFO HttpFileServer: HTTP File server directory is /private/var/folders/y9/s2j35hkn1jz4nxygvx24bhvc0000gn/T/spark-ea4df8ca-5d33-424b-ac1e-b71bc21cd098/httpd-df0e2057-7445-4129-9192-d6d86c26b765
> 15/08/03 15:43:45 INFO HttpServer: Starting HTTP Server
> 15/08/03 15:43:45 INFO Utils: Successfully started service 'HTTP file server' on port 51119.
> 15/08/03 15:43:45 INFO SparkEnv: Registering OutputCommitCoordinator
> 15/08/03 15:43:45 INFO Utils: Successfully started service 'SparkUI' on port 4040.
> 15/08/03 15:43:45 INFO SparkUI: Started SparkUI at http://172.18.56.195:4040
> 15/08/03 15:43:45 INFO SparkContext: Added JAR file:/Users/varadham/.ivy2/jars/com.databricks_spark-csv_2.11-1.1.0.jar at http://172.18.56.195:51119/jars/com.databricks_spark-csv_2.11-1.1.0.jar with timestamp 1438596825328
> 15/08/03 15:43:45 INFO SparkContext: Added JAR file:/Users/varadham/.ivy2/jars/org.apache.commons_commons-csv-1.1.jar at http://172.18.56.195:51119/jars/org.apache.commons_commons-csv-1.1.jar with timestamp 1438596825329
> 15/08/03 15:43:45 INFO SparkContext: Added JAR file:/Users/varadham/.ivy2/jars/com.univocity_univocity-parsers-1.5.1.jar at http://172.18.56.195:51119/jars/com.univocity_univocity-parsers-1.5.1.jar with timestamp 1438596825330
> 15/08/03 15:43:45 INFO Executor: Starting executor ID driver on host localhost
> 15/08/03 15:43:45 INFO Executor: Using REPL class URI: http://172.18.56.195:51112
> 15/08/03 15:43:45 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 51121.
> 15/08/03 15:43:45 INFO NettyBlockTransferService: Server created on 51121
> 15/08/03 15:43:45 INFO BlockManagerMaster: Trying to register BlockManager
> 15/08/03 15:43:45 INFO BlockManagerMasterEndpoint: Registering block manager localhost:51121 with 265.1 MB RAM, BlockManagerId(driver, localhost, 51121)
> 15/08/03 15:43:45 INFO BlockManagerMaster: Registered BlockManager
> 15/08/03 15:43:45 INFO Main: Created spark context..
> Spark context available as sc.
> 15/08/03 15:43:46 INFO Main: Created sql context..
> SQL context available as sqlContext.
> scala> import com.databricks.spark._
> <console>:20: error: object databricks is not a member of package com
>        import com.databricks.spark._
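
For reference, spark-csv 1.x is normally driven through the DataFrameReader's format string rather than a wildcard package import, so a bare "import com.databricks.spark._" is not needed at all. A minimal sketch in spark-shell, assuming the package actually resolved onto the driver classpath; the input file cars.csv is a hypothetical example, not from this report:

    // Scala, in a spark-shell started with:
    //   bin/spark-shell --packages com.databricks:spark-csv_2.11:1.1.0
    val df = sqlContext.read
      .format("com.databricks.spark.csv")  // select the spark-csv data source by name
      .option("header", "true")            // treat the first line as column names
      .load("cars.csv")                    // hypothetical local CSV path

    df.printSchema()
    df.show()

If even the import itself fails with "object databricks is not a member of package com", as above, the jar most likely never reached the REPL classpath, which is a launcher/classpath question rather than a spark-csv API one.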