Github user ithjz commented on the issue:
https://github.com/apache/spark/pull/18416
I ran the example provided on the official website and got an error about a missing package. I hope someone can help me.
[hadoop@hadoop01 bin]$ sh spark-shell --master local[9]
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
[2017-07-06 11:59:23,252] WARN Unable to load native-hadoop library for your platform... using builtin-java classes where applicable (org.apache.hadoop.util.NativeCodeLoader:62)
[2017-07-06 11:59:23,356] WARN
SPARK_CLASSPATH was detected (set to '/data/spark/jars/mysql-connector-java-5.1.40-bin.jar:').
This is deprecated in Spark 1.0+.
Please instead use:
 - ./spark-submit with --driver-class-path to augment the driver classpath
 - spark.executor.extraClassPath to augment the executor classpath
(org.apache.spark.SparkConf:66)
[2017-07-06 11:59:23,357] WARN Setting 'spark.executor.extraClassPath' to '/data/spark/jars/mysql-connector-java-5.1.40-bin.jar:' as a work-around. (org.apache.spark.SparkConf:66)
[2017-07-06 11:59:23,357] WARN Setting 'spark.driver.extraClassPath' to '/data/spark/jars/mysql-connector-java-5.1.40-bin.jar:' as a work-around. (org.apache.spark.SparkConf:66)
Spark context Web UI available at http://192.168.8.29:4040
Spark context available as 'sc' (master = local[9], app id = local-1499313564077).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.
scala> val ds1 = spark.readStream.format("kafka").option("kafka.bootstrap.servers", "host1:port1,host2:port2").option("subscribe", "topic1").load()
java.lang.ClassNotFoundException: Failed to find data source: kafka. Please find packages at http://spark.apache.org/third-party-projects.html
  at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:569)
  at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:86)
  at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:86)
  at org.apache.spark.sql.execution.datasources.DataSource.sourceSchema(DataSource.scala:197)
  at org.apache.spark.sql.execution.datasources.DataSource.sourceInfo$lzycompute(DataSource.scala:87)
  at org.apache.spark.sql.execution.datasources.DataSource.sourceInfo(DataSource.scala:87)
  at org.apache.spark.sql.execution.streaming.StreamingRelation$.apply(StreamingRelation.scala:30)
  at org.apache.spark.sql.streaming.DataStreamReader.load(DataStreamReader.scala:124)
  ... 48 elided
Caused by: java.lang.ClassNotFoundException: kafka.DefaultSource
  at scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:554)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:554)
  at scala.util.Try$.apply(Try.scala:192)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:554)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:554)
  at scala.util.Try.orElse(Try.scala:84)
  at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:554)
  ... 55 more
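
For reference, the Kafka source for Structured Streaming ships as a separate artifact and is not on the default spark-shell classpath, which is why lookupDataSource cannot resolve "kafka" above. A minimal sketch of how the connector is typically pulled in with --packages (assuming Spark 2.1.0 built against Scala 2.11, and keeping the placeholder broker/topic names from the snippet above):

    spark-shell --master local[9] --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.1.0

    scala> val ds1 = spark.readStream.format("kafka").option("kafka.bootstrap.servers", "host1:port1,host2:port2").option("subscribe", "topic1").load()

The --packages flag resolves the connector from Maven and places it on both the driver and executor classpaths; passing the jar explicitly with --jars should work as well.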