[ https://issues.apache.org/jira/browse/SPARK-19907?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen closed SPARK-19907.
-----------------------------

> Spark Submit Does not pick up the HBase Jars
> --------------------------------------------
>
>                 Key: SPARK-19907
>                 URL: https://issues.apache.org/jira/browse/SPARK-19907
>             Project: Spark
>          Issue Type: Task
>          Components: DStreams, Spark Submit, YARN
>    Affects Versions: 2.0.0
>        Environment: Linux, cloudera-jdk-1.7
>           Reporter: Ramchandhar Rapolu
>           Priority: Blocker
>
> Using properties file: /opt/cloudera/parcels/SPARK2-2.0.0.cloudera1-1.cdh5.7.0.p0.113931/lib/spark2/conf/spark-defaults.conf
> Adding default property: spark.serializer=org.apache.spark.serializer.KryoSerializer
> Adding default property: spark.yarn.jars=local:/opt/cloudera/parcels/SPARK2-2.0.0.cloudera1-1.cdh5.7.0.p0.113931/lib/spark2/jars/*:/opt/cloudera/parcels/SPARK2-2.0.0.cloudera1-1.cdh5.7.0.p0.113931/lib/hbase/jars/*
> Adding default property: spark.eventLog.enabled=true
> Adding default property: spark.hadoop.mapreduce.application.classpath=
> Adding default property: spark.shuffle.service.enabled=true
> Adding default property: spark.driver.extraLibraryPath=/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/hadoop/lib/native
> Adding default property: spark.yarn.historyServer.address=http://instance-26765.bigstep.io:18089
> Adding default property: spark.ui.killEnabled=true
> Adding default property: spark.sql.hive.metastore.jars=${env:HADOOP_COMMON_HOME}/../hive/lib/*:${env:HADOOP_COMMON_HOME}/client/*
> Adding default property: spark.dynamicAllocation.schedulerBacklogTimeout=1
> Adding default property: spark.yarn.am.extraLibraryPath=/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/hadoop/lib/native
> Adding default property: spark.yarn.config.gatewayPath=/opt/cloudera/parcels
> Adding default property: spark.yarn.config.replacementPath={{HADOOP_COMMON_HOME}}/../../..
> Adding default property: spark.submit.deployMode=client
> Adding default property: spark.shuffle.service.port=7337
> Adding default property: spark.master=yarn
> Adding default property: spark.authenticate=false
> Adding default property: spark.executor.extraLibraryPath=/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/hadoop/lib/native
> Adding default property: spark.eventLog.dir=hdfs://instance-26765.bigstep.io:8020/user/spark/spark2ApplicationHistory
> Adding default property: spark.dynamicAllocation.enabled=true
> Adding default property: spark.sql.catalogImplementation=hive
> Adding default property: spark.hadoop.yarn.application.classpath=
> Adding default property: spark.dynamicAllocation.minExecutors=0
> Adding default property: spark.dynamicAllocation.executorIdleTimeout=60
> Adding default property: spark.sql.hive.metastore.version=1.1.0
> Parsed arguments:
>   master                  yarn
>   deployMode              client
>   executorMemory          2g
>   executorCores           1
>   totalExecutorCores      null
>   propertiesFile          /opt/cloudera/parcels/SPARK2-2.0.0.cloudera1-1.cdh5.7.0.p0.113931/lib/spark2/conf/spark-defaults.conf
>   driverMemory            4g
>   driverCores             null
>   driverExtraClassPath    null
>   driverExtraLibraryPath  /opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/hadoop/lib/native
>   driverExtraJavaOptions  null
>   supervise               false
>   queue                   null
>   numExecutors            null
>   files                   null
>   pyFiles                 null
>   archives                null
>   mainClass               com.golfbreaks.spark.streaming.Test
>   primaryResource         file:/opt/golfbreaks/spark-jars/streaming-1.0-jar-with-dependencies.jar
>   name                    com.golfbreaks.spark.streaming.Test
>   childArgs               []
>   jars
file:/opt/golfbreaks/spark-jars/./lib/activation-1.1.jar,file:/opt/golfbreaks/spark-jars/./lib/antlr4-runtime-4.5.3.jar,file:/opt/golfbreaks/spark-jars/./lib/aopalliance-repackaged-2.4.0-b34.jar,file:/opt/golfbreaks/spark-jars/./lib/apacheds-i18n-2.0.0-M15.jar,file:/opt/golfbreaks/spark-jars/./lib/apacheds-kerberos-codec-2.0.0-M15.jar,file:/opt/golfbreaks/spark-jars/./lib/api-asn1-api-1.0.0-M20.jar,file:/opt/golfbreaks/spark-jars/./lib/api-util-1.0.0-M20.jar,file:/opt/golfbreaks/spark-jars/./lib/asm-3.1.jar,file:/opt/golfbreaks/spark-jars/./lib/avro-1.7.6-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/avro-ipc-1.7.6-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/avro-ipc-1.7.6-cdh5.7.0-tests.jar,file:/opt/golfbreaks/spark-jars/./lib/avro-mapred-1.7.6-cdh5.7.0-hadoop2.jar,file:/opt/golfbreaks/spark-jars/./lib/aws-java-sdk-core-1.10.6.jar,file:/opt/golfbreaks/spark-jars/./lib/aws-java-sdk-kms-1.10.6.jar,file:/opt/golfbreaks/spark-jars/./lib/aws-java-sdk-s3-1.10.6.jar,file:/opt/golfbreaks/spark-jars/./lib/chill_2.11-0.8.0.jar,file:/opt/golfbreaks/spark-jars/./lib/chill-java-0.8.0.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-beanutils-1.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-beanutils-core-1.8.0.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-cli-1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-codec-1.9.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-collections-3.2.2.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-compiler-2.7.8.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-compress-1.4.1.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-configuration-1.6.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-crypto-1.0.0.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-daemon-1.0.13.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-digester-1.8.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-el-1.0.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-httpclient-3.1.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-io-2.4.jar,file
:/opt/golfbreaks/spark-jars/./lib/commons-lang-2.6.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-lang3-3.3.2.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-logging-1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-math-2.1.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-math3-3.4.1.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-net-2.2.jar,file:/opt/golfbreaks/spark-jars/./lib/compress-lzf-1.0.3.jar,file:/opt/golfbreaks/spark-jars/./lib/core-3.1.1.jar,file:/opt/golfbreaks/spark-jars/./lib/curator-client-2.7.1.jar,file:/opt/golfbreaks/spark-jars/./lib/curator-framework-2.7.1.jar,file:/opt/golfbreaks/spark-jars/./lib/curator-recipes-2.7.1.jar,file:/opt/golfbreaks/spark-jars/./lib/disruptor-3.3.0.jar,file:/opt/golfbreaks/spark-jars/./lib/fastutil-6.3.jar,file:/opt/golfbreaks/spark-jars/./lib/findbugs-annotations-1.3.9-1.jar,file:/opt/golfbreaks/spark-jars/./lib/gson-2.8.0.jar,file:/opt/golfbreaks/spark-jars/./lib/guava-12.0.1.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-annotations-2.6.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-auth-2.6.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-aws-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-client-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-common-2.6.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-core-2.6.0-mr1-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-hdfs-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-hdfs-2.6.0-cdh5.9.0-tests.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-app-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-common-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-core-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/
./lib/hadoop-yarn-api-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-yarn-client-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-yarn-common-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-yarn-server-common-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hamcrest-core-1.3.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-annotations-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-client-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-common-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-common-1.2.0-cdh5.9.0-tests.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-hadoop2-compat-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-hadoop-compat-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-prefix-tree-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-procedure-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-protocol-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-server-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-spark-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/high-scale-lib-1.1.1.jar,file:/opt/golfbreaks/spark-jars/./lib/hk2-api-2.4.0-b34.jar,file:/opt/golfbreaks/spark-jars/./lib/hk2-locator-2.4.0-b34.jar,file:/opt/golfbreaks/spark-jars/./lib/hk2-utils-2.4.0-b34.jar,file:/opt/golfbreaks/spark-jars/./lib/hsqldb-1.8.0.10.jar,file:/opt/golfbreaks/spark-jars/./lib/htrace-core-3.2.0-incubating.jar,file:/opt/golfbreaks/spark-jars/./lib/htrace-core4-4.0.1-incubating.jar,file:/opt/golfbreaks/spark-jars/./lib/httpclient-4.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/httpcore-4.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/ivy-2.4.0.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-annotations-2.6.5.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-core-2.6.5.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-core-asl-1.8.8.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-d
atabind-2.6.5.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-jaxrs-1.8.8.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-mapper-asl-1.8.8.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-module-paranamer-2.6.5.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-module-scala_2.11-2.6.5.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-xc-1.8.3.jar,file:/opt/golfbreaks/spark-jars/./lib/jamon-runtime-2.4.1.jar,file:/opt/golfbreaks/spark-jars/./lib/janino-2.7.8.jar,file:/opt/golfbreaks/spark-jars/./lib/jasper-compiler-5.5.23.jar,file:/opt/golfbreaks/spark-jars/./lib/jasper-runtime-5.5.23.jar,file:/opt/golfbreaks/spark-jars/./lib/javassist-3.18.1-GA.jar,file:/opt/golfbreaks/spark-jars/./lib/javax.annotation-api-1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/javax.inject-2.4.0-b34.jar,file:/opt/golfbreaks/spark-jars/./lib/java-xmlbuilder-0.4.jar,file:/opt/golfbreaks/spark-jars/./lib/javax.servlet-api-3.1.0.jar,file:/opt/golfbreaks/spark-jars/./lib/javax.ws.rs-api-2.0.1.jar,file:/opt/golfbreaks/spark-jars/./lib/jaxb-api-2.2.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jaxb-impl-2.2.3-1.jar,file:/opt/golfbreaks/spark-jars/./lib/jcl-over-slf4j-1.7.5.jar,file:/opt/golfbreaks/spark-jars/./lib/jcodings-1.0.8.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-client-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-common-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-container-servlet-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-container-servlet-core-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-core-1.9.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-guava-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-json-1.9.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-media-jaxb-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-server-1.9.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-server-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jets3t-0.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/jettison-1.1.jar,file:/opt/golfbreaks/
spark-jars/./lib/jetty-6.1.26.cloudera.4.jar,file:/opt/golfbreaks/spark-jars/./lib/jetty-sslengine-6.1.26.cloudera.4.jar,file:/opt/golfbreaks/spark-jars/./lib/jetty-util-6.1.26.cloudera.4.jar,file:/opt/golfbreaks/spark-jars/./lib/joni-2.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jsch-0.1.42.jar,file:/opt/golfbreaks/spark-jars/./lib/json4s-ast_2.11-3.5.0.jar,file:/opt/golfbreaks/spark-jars/./lib/json4s-core_2.11-3.5.0.jar,file:/opt/golfbreaks/spark-jars/./lib/json4s-jackson_2.11-3.2.11.jar,file:/opt/golfbreaks/spark-jars/./lib/json4s-native_2.11-3.5.0.jar,file:/opt/golfbreaks/spark-jars/./lib/json4s-scalap_2.11-3.5.0.jar,file:/opt/golfbreaks/spark-jars/./lib/jsp-2.1-6.1.14.jar,file:/opt/golfbreaks/spark-jars/./lib/jsp-api-2.1-6.1.14.jar,file:/opt/golfbreaks/spark-jars/./lib/jsp-api-2.1.jar,file:/opt/golfbreaks/spark-jars/./lib/jsr305-1.3.9.jar,file:/opt/golfbreaks/spark-jars/./lib/jul-to-slf4j-1.7.5.jar,file:/opt/golfbreaks/spark-jars/./lib/junit-4.12.jar,file:/opt/golfbreaks/spark-jars/./lib/kafka_2.11-0.9.0-kafka-2.0.0.jar,file:/opt/golfbreaks/spark-jars/./lib/kafka-clients-0.9.0-kafka-2.0.0.jar,file:/opt/golfbreaks/spark-jars/./lib/kryo-shaded-3.0.3.jar,file:/opt/golfbreaks/spark-jars/./lib/leveldbjni-all-1.8.jar,file:/opt/golfbreaks/spark-jars/./lib/log4j-1.2.17.jar,file:/opt/golfbreaks/spark-jars/./lib/lz4-1.3.0.jar,file:/opt/golfbreaks/spark-jars/./lib/mesos-0.21.1-shaded-protobuf.jar,file:/opt/golfbreaks/spark-jars/./lib/metrics-core-2.2.0.jar,file:/opt/golfbreaks/spark-jars/./lib/metrics-core-3.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/metrics-graphite-3.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/metrics-json-3.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/metrics-jvm-3.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/minlog-1.3.0.jar,file:/opt/golfbreaks/spark-jars/./lib/netty-3.8.0.Final.jar,file:/opt/golfbreaks/spark-jars/./lib/netty-all-4.0.29.Final.jar,file:/opt/golfbreaks/spark-jars/./lib/objenesis-2.1.jar,file:/opt/golfbreaks/spark-jars/./lib/or
o-2.0.8.jar,file:/opt/golfbreaks/spark-jars/./lib/osgi-resource-locator-1.0.1.jar,file:/opt/golfbreaks/spark-jars/./lib/paranamer-2.8.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-avro-1.5.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-column-1.5.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-common-1.5.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-encoding-1.5.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-format-2.1.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-hadoop-1.5.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-jackson-1.5.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/protobuf-java-2.5.0.jar,file:/opt/golfbreaks/spark-jars/./lib/py4j-0.10.3.jar,file:/opt/golfbreaks/spark-jars/./lib/pyrolite-4.13.jar,file:/opt/golfbreaks/spark-jars/./lib/RoaringBitmap-0.5.11.jar,file:/opt/golfbreaks/spark-jars/./lib/scala-library-2.11.8.jar,file:/opt/golfbreaks/spark-jars/./lib/scala-parser-combinators_2.11-1.0.4.jar,file:/opt/golfbreaks/spark-jars/./lib/scala-reflect-2.11.7.jar,file:/opt/golfbreaks/spark-jars/./lib/scalatest_2.11-2.2.6.jar,file:/opt/golfbreaks/spark-jars/./lib/scala-xml_2.11-1.0.6.jar,file:/opt/golfbreaks/spark-jars/./lib/servlet-api-2.5-6.1.14.jar,file:/opt/golfbreaks/spark-jars/./lib/servlet-api-2.5.jar,file:/opt/golfbreaks/spark-jars/./lib/slf4j-api-1.7.5.jar,file:/opt/golfbreaks/spark-jars/./lib/slf4j-log4j12-1.7.5.jar,file:/opt/golfbreaks/spark-jars/./lib/snappy-java-1.0.4.1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-catalyst_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-core_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-launcher_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-network-common_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-network-shuffle_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-sketch_2.11-2.0.0.cloudera1.jar,file:/opt/go
lfbreaks/spark-jars/./lib/spark-sql_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-streaming_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-streaming-kafka-0-8_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-tags_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-unsafe_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/stax-api-1.0-2.jar,file:/opt/golfbreaks/spark-jars/./lib/stream-2.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/univocity-parsers-2.1.1.jar,file:/opt/golfbreaks/spark-jars/./lib/unused-1.0.0.jar,file:/opt/golfbreaks/spark-jars/./lib/validation-api-1.1.0.Final.jar,file:/opt/golfbreaks/spark-jars/./lib/xbean-asm5-shaded-4.4.jar,file:/opt/golfbreaks/spark-jars/./lib/xercesImpl-2.9.1.jar,file:/opt/golfbreaks/spark-jars/./lib/xml-apis-1.3.04.jar,file:/opt/golfbreaks/spark-jars/./lib/xmlenc-0.52.jar,file:/opt/golfbreaks/spark-jars/./lib/xz-1.0.jar,file:/opt/golfbreaks/spark-jars/./lib/zkclient-0.7.jar,file:/opt/golfbreaks/spark-jars/./lib/zookeeper-3.4.5-cdh5.9.0.jar
>   packages                null
>   packagesExclusions      null
>   repositories            null
>   verbose                 true
>
> Spark properties used, including those specified through
>  --conf and those from the properties file /opt/cloudera/parcels/SPARK2-2.0.0.cloudera1-1.cdh5.7.0.p0.113931/lib/spark2/conf/spark-defaults.conf:
>   spark.executor.extraLibraryPath -> /opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/hadoop/lib/native
>   spark.driver.memory -> 4g
>   spark.authenticate -> false
>   spark.yarn.jars -> local:/opt/cloudera/parcels/SPARK2-2.0.0.cloudera1-1.cdh5.7.0.p0.113931/lib/spark2/jars/*:/opt/cloudera/parcels/SPARK2-2.0.0.cloudera1-1.cdh5.7.0.p0.113931/lib/hbase/jars/*
>   spark.driver.extraLibraryPath -> /opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/hadoop/lib/native
>   spark.yarn.historyServer.address -> http://instance-26765.bigstep.io:18089
>   spark.yarn.am.extraLibraryPath -> /opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/hadoop/lib/native
>   spark.eventLog.enabled -> true
>   spark.dynamicAllocation.schedulerBacklogTimeout -> 1
>   spark.yarn.config.gatewayPath -> /opt/cloudera/parcels
>   spark.ui.killEnabled -> true
>   spark.serializer -> org.apache.spark.serializer.KryoSerializer
>   spark.shuffle.service.enabled -> true
>   spark.hadoop.yarn.application.classpath ->
>   spark.dynamicAllocation.minExecutors -> 0
>   spark.dynamicAllocation.executorIdleTimeout -> 60
>   spark.yarn.config.replacementPath -> {{HADOOP_COMMON_HOME}}/../../..
>   spark.sql.hive.metastore.version -> 1.1.0
>   spark.submit.deployMode -> client
>   spark.shuffle.service.port -> 7337
>   spark.hadoop.mapreduce.application.classpath ->
>   spark.eventLog.dir -> hdfs://instance-26765.bigstep.io:8020/user/spark/spark2ApplicationHistory
>   spark.master -> yarn
>   spark.dynamicAllocation.enabled -> true
>   spark.sql.catalogImplementation -> hive
>   spark.sql.hive.metastore.jars -> ${env:HADOOP_COMMON_HOME}/../hive/lib/*:${env:HADOOP_COMMON_HOME}/client/*
>
> Main class:
> com.golfbreaks.spark.streaming.Test
> Arguments:
> System properties:
>   spark.executor.extraLibraryPath -> /opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/hadoop/lib/native
>   spark.yarn.dist.jars ->
file:/opt/golfbreaks/spark-jars/./lib/activation-1.1.jar,file:/opt/golfbreaks/spark-jars/./lib/antlr4-runtime-4.5.3.jar,file:/opt/golfbreaks/spark-jars/./lib/aopalliance-repackaged-2.4.0-b34.jar,file:/opt/golfbreaks/spark-jars/./lib/apacheds-i18n-2.0.0-M15.jar,file:/opt/golfbreaks/spark-jars/./lib/apacheds-kerberos-codec-2.0.0-M15.jar,file:/opt/golfbreaks/spark-jars/./lib/api-asn1-api-1.0.0-M20.jar,file:/opt/golfbreaks/spark-jars/./lib/api-util-1.0.0-M20.jar,file:/opt/golfbreaks/spark-jars/./lib/asm-3.1.jar,file:/opt/golfbreaks/spark-jars/./lib/avro-1.7.6-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/avro-ipc-1.7.6-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/avro-ipc-1.7.6-cdh5.7.0-tests.jar,file:/opt/golfbreaks/spark-jars/./lib/avro-mapred-1.7.6-cdh5.7.0-hadoop2.jar,file:/opt/golfbreaks/spark-jars/./lib/aws-java-sdk-core-1.10.6.jar,file:/opt/golfbreaks/spark-jars/./lib/aws-java-sdk-kms-1.10.6.jar,file:/opt/golfbreaks/spark-jars/./lib/aws-java-sdk-s3-1.10.6.jar,file:/opt/golfbreaks/spark-jars/./lib/chill_2.11-0.8.0.jar,file:/opt/golfbreaks/spark-jars/./lib/chill-java-0.8.0.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-beanutils-1.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-beanutils-core-1.8.0.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-cli-1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-codec-1.9.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-collections-3.2.2.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-compiler-2.7.8.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-compress-1.4.1.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-configuration-1.6.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-crypto-1.0.0.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-daemon-1.0.13.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-digester-1.8.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-el-1.0.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-httpclient-3.1.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-io-2.4.jar,file
:/opt/golfbreaks/spark-jars/./lib/commons-lang-2.6.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-lang3-3.3.2.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-logging-1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-math-2.1.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-math3-3.4.1.jar,file:/opt/golfbreaks/spark-jars/./lib/commons-net-2.2.jar,file:/opt/golfbreaks/spark-jars/./lib/compress-lzf-1.0.3.jar,file:/opt/golfbreaks/spark-jars/./lib/core-3.1.1.jar,file:/opt/golfbreaks/spark-jars/./lib/curator-client-2.7.1.jar,file:/opt/golfbreaks/spark-jars/./lib/curator-framework-2.7.1.jar,file:/opt/golfbreaks/spark-jars/./lib/curator-recipes-2.7.1.jar,file:/opt/golfbreaks/spark-jars/./lib/disruptor-3.3.0.jar,file:/opt/golfbreaks/spark-jars/./lib/fastutil-6.3.jar,file:/opt/golfbreaks/spark-jars/./lib/findbugs-annotations-1.3.9-1.jar,file:/opt/golfbreaks/spark-jars/./lib/gson-2.8.0.jar,file:/opt/golfbreaks/spark-jars/./lib/guava-12.0.1.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-annotations-2.6.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-auth-2.6.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-aws-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-client-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-common-2.6.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-core-2.6.0-mr1-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-hdfs-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-hdfs-2.6.0-cdh5.9.0-tests.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-app-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-common-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-core-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/
./lib/hadoop-yarn-api-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-yarn-client-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-yarn-common-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hadoop-yarn-server-common-2.6.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hamcrest-core-1.3.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-annotations-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-client-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-common-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-common-1.2.0-cdh5.9.0-tests.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-hadoop2-compat-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-hadoop-compat-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-prefix-tree-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-procedure-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-protocol-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-server-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/hbase-spark-1.2.0-cdh5.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/high-scale-lib-1.1.1.jar,file:/opt/golfbreaks/spark-jars/./lib/hk2-api-2.4.0-b34.jar,file:/opt/golfbreaks/spark-jars/./lib/hk2-locator-2.4.0-b34.jar,file:/opt/golfbreaks/spark-jars/./lib/hk2-utils-2.4.0-b34.jar,file:/opt/golfbreaks/spark-jars/./lib/hsqldb-1.8.0.10.jar,file:/opt/golfbreaks/spark-jars/./lib/htrace-core-3.2.0-incubating.jar,file:/opt/golfbreaks/spark-jars/./lib/htrace-core4-4.0.1-incubating.jar,file:/opt/golfbreaks/spark-jars/./lib/httpclient-4.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/httpcore-4.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/ivy-2.4.0.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-annotations-2.6.5.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-core-2.6.5.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-core-asl-1.8.8.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-d
atabind-2.6.5.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-jaxrs-1.8.8.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-mapper-asl-1.8.8.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-module-paranamer-2.6.5.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-module-scala_2.11-2.6.5.jar,file:/opt/golfbreaks/spark-jars/./lib/jackson-xc-1.8.3.jar,file:/opt/golfbreaks/spark-jars/./lib/jamon-runtime-2.4.1.jar,file:/opt/golfbreaks/spark-jars/./lib/janino-2.7.8.jar,file:/opt/golfbreaks/spark-jars/./lib/jasper-compiler-5.5.23.jar,file:/opt/golfbreaks/spark-jars/./lib/jasper-runtime-5.5.23.jar,file:/opt/golfbreaks/spark-jars/./lib/javassist-3.18.1-GA.jar,file:/opt/golfbreaks/spark-jars/./lib/javax.annotation-api-1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/javax.inject-2.4.0-b34.jar,file:/opt/golfbreaks/spark-jars/./lib/java-xmlbuilder-0.4.jar,file:/opt/golfbreaks/spark-jars/./lib/javax.servlet-api-3.1.0.jar,file:/opt/golfbreaks/spark-jars/./lib/javax.ws.rs-api-2.0.1.jar,file:/opt/golfbreaks/spark-jars/./lib/jaxb-api-2.2.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jaxb-impl-2.2.3-1.jar,file:/opt/golfbreaks/spark-jars/./lib/jcl-over-slf4j-1.7.5.jar,file:/opt/golfbreaks/spark-jars/./lib/jcodings-1.0.8.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-client-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-common-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-container-servlet-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-container-servlet-core-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-core-1.9.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-guava-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-json-1.9.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-media-jaxb-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-server-1.9.jar,file:/opt/golfbreaks/spark-jars/./lib/jersey-server-2.22.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jets3t-0.9.0.jar,file:/opt/golfbreaks/spark-jars/./lib/jettison-1.1.jar,file:/opt/golfbreaks/
spark-jars/./lib/jetty-6.1.26.cloudera.4.jar,file:/opt/golfbreaks/spark-jars/./lib/jetty-sslengine-6.1.26.cloudera.4.jar,file:/opt/golfbreaks/spark-jars/./lib/jetty-util-6.1.26.cloudera.4.jar,file:/opt/golfbreaks/spark-jars/./lib/joni-2.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/jsch-0.1.42.jar,file:/opt/golfbreaks/spark-jars/./lib/json4s-ast_2.11-3.5.0.jar,file:/opt/golfbreaks/spark-jars/./lib/json4s-core_2.11-3.5.0.jar,file:/opt/golfbreaks/spark-jars/./lib/json4s-jackson_2.11-3.2.11.jar,file:/opt/golfbreaks/spark-jars/./lib/json4s-native_2.11-3.5.0.jar,file:/opt/golfbreaks/spark-jars/./lib/json4s-scalap_2.11-3.5.0.jar,file:/opt/golfbreaks/spark-jars/./lib/jsp-2.1-6.1.14.jar,file:/opt/golfbreaks/spark-jars/./lib/jsp-api-2.1-6.1.14.jar,file:/opt/golfbreaks/spark-jars/./lib/jsp-api-2.1.jar,file:/opt/golfbreaks/spark-jars/./lib/jsr305-1.3.9.jar,file:/opt/golfbreaks/spark-jars/./lib/jul-to-slf4j-1.7.5.jar,file:/opt/golfbreaks/spark-jars/./lib/junit-4.12.jar,file:/opt/golfbreaks/spark-jars/./lib/kafka_2.11-0.9.0-kafka-2.0.0.jar,file:/opt/golfbreaks/spark-jars/./lib/kafka-clients-0.9.0-kafka-2.0.0.jar,file:/opt/golfbreaks/spark-jars/./lib/kryo-shaded-3.0.3.jar,file:/opt/golfbreaks/spark-jars/./lib/leveldbjni-all-1.8.jar,file:/opt/golfbreaks/spark-jars/./lib/log4j-1.2.17.jar,file:/opt/golfbreaks/spark-jars/./lib/lz4-1.3.0.jar,file:/opt/golfbreaks/spark-jars/./lib/mesos-0.21.1-shaded-protobuf.jar,file:/opt/golfbreaks/spark-jars/./lib/metrics-core-2.2.0.jar,file:/opt/golfbreaks/spark-jars/./lib/metrics-core-3.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/metrics-graphite-3.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/metrics-json-3.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/metrics-jvm-3.1.2.jar,file:/opt/golfbreaks/spark-jars/./lib/minlog-1.3.0.jar,file:/opt/golfbreaks/spark-jars/./lib/netty-3.8.0.Final.jar,file:/opt/golfbreaks/spark-jars/./lib/netty-all-4.0.29.Final.jar,file:/opt/golfbreaks/spark-jars/./lib/objenesis-2.1.jar,file:/opt/golfbreaks/spark-jars/./lib/or
o-2.0.8.jar,file:/opt/golfbreaks/spark-jars/./lib/osgi-resource-locator-1.0.1.jar,file:/opt/golfbreaks/spark-jars/./lib/paranamer-2.8.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-avro-1.5.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-column-1.5.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-common-1.5.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-encoding-1.5.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-format-2.1.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-hadoop-1.5.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/parquet-jackson-1.5.0-cdh5.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/protobuf-java-2.5.0.jar,file:/opt/golfbreaks/spark-jars/./lib/py4j-0.10.3.jar,file:/opt/golfbreaks/spark-jars/./lib/pyrolite-4.13.jar,file:/opt/golfbreaks/spark-jars/./lib/RoaringBitmap-0.5.11.jar,file:/opt/golfbreaks/spark-jars/./lib/scala-library-2.11.8.jar,file:/opt/golfbreaks/spark-jars/./lib/scala-parser-combinators_2.11-1.0.4.jar,file:/opt/golfbreaks/spark-jars/./lib/scala-reflect-2.11.7.jar,file:/opt/golfbreaks/spark-jars/./lib/scalatest_2.11-2.2.6.jar,file:/opt/golfbreaks/spark-jars/./lib/scala-xml_2.11-1.0.6.jar,file:/opt/golfbreaks/spark-jars/./lib/servlet-api-2.5-6.1.14.jar,file:/opt/golfbreaks/spark-jars/./lib/servlet-api-2.5.jar,file:/opt/golfbreaks/spark-jars/./lib/slf4j-api-1.7.5.jar,file:/opt/golfbreaks/spark-jars/./lib/slf4j-log4j12-1.7.5.jar,file:/opt/golfbreaks/spark-jars/./lib/snappy-java-1.0.4.1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-catalyst_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-core_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-launcher_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-network-common_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-network-shuffle_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-sketch_2.11-2.0.0.cloudera1.jar,file:/opt/go
lfbreaks/spark-jars/./lib/spark-sql_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-streaming_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-streaming-kafka-0-8_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-tags_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/spark-unsafe_2.11-2.0.0.cloudera1.jar,file:/opt/golfbreaks/spark-jars/./lib/stax-api-1.0-2.jar,file:/opt/golfbreaks/spark-jars/./lib/stream-2.7.0.jar,file:/opt/golfbreaks/spark-jars/./lib/univocity-parsers-2.1.1.jar,file:/opt/golfbreaks/spark-jars/./lib/unused-1.0.0.jar,file:/opt/golfbreaks/spark-jars/./lib/validation-api-1.1.0.Final.jar,file:/opt/golfbreaks/spark-jars/./lib/xbean-asm5-shaded-4.4.jar,file:/opt/golfbreaks/spark-jars/./lib/xercesImpl-2.9.1.jar,file:/opt/golfbreaks/spark-jars/./lib/xml-apis-1.3.04.jar,file:/opt/golfbreaks/spark-jars/./lib/xmlenc-0.52.jar,file:/opt/golfbreaks/spark-jars/./lib/xz-1.0.jar,file:/opt/golfbreaks/spark-jars/./lib/zkclient-0.7.jar,file:/opt/golfbreaks/spark-jars/./lib/zookeeper-3.4.5-cdh5.9.0.jar
>   spark.driver.memory -> 4g
>   spark.executor.memory -> 2g
>   spark.authenticate -> false
>   spark.yarn.jars -> local:/opt/cloudera/parcels/SPARK2-2.0.0.cloudera1-1.cdh5.7.0.p0.113931/lib/spark2/jars/*:/opt/cloudera/parcels/SPARK2-2.0.0.cloudera1-1.cdh5.7.0.p0.113931/lib/hbase/jars/*
>   spark.driver.extraLibraryPath -> /opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/hadoop/lib/native
>   spark.yarn.historyServer.address -> http://instance-26765.bigstep.io:18089
>   spark.yarn.am.extraLibraryPath -> /opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/hadoop/lib/native
>   spark.eventLog.enabled -> true
>   spark.dynamicAllocation.schedulerBacklogTimeout -> 1
>   SPARK_SUBMIT -> true
>   spark.yarn.config.gatewayPath -> /opt/cloudera/parcels
>   spark.ui.killEnabled -> true
>   spark.serializer -> org.apache.spark.serializer.KryoSerializer
>   spark.app.name -> com.golfbreaks.spark.streaming.Test
>   spark.shuffle.service.enabled -> true
>   spark.hadoop.yarn.application.classpath ->
>   spark.dynamicAllocation.minExecutors -> 0
>   spark.dynamicAllocation.executorIdleTimeout -> 60
>   spark.yarn.config.replacementPath -> {{HADOOP_COMMON_HOME}}/../../..
>   spark.sql.hive.metastore.version -> 1.1.0
>   spark.jars -> file:/opt/golfbreaks/spark-jars/streaming-1.0-jar-with-dependencies.jar
>   spark.submit.deployMode -> client
>   spark.shuffle.service.port -> 7337
>   spark.hadoop.mapreduce.application.classpath ->
>   spark.eventLog.dir -> hdfs://instance-26765.bigstep.io:8020/user/spark/spark2ApplicationHistory
>   spark.master -> yarn
>   spark.dynamicAllocation.enabled -> true
>   spark.sql.catalogImplementation -> hive
>   spark.executor.cores -> 1
>   spark.sql.hive.metastore.jars -> ${env:HADOOP_COMMON_HOME}/../hive/lib/*:${env:HADOOP_COMMON_HOME}/client/*
>
> Classpath elements:
> file:/opt/golfbreaks/spark-jars/streaming-1.0-jar-with-dependencies.jar
> file:/opt/golfbreaks/spark-jars/./lib/activation-1.1.jar
> file:/opt/golfbreaks/spark-jars/./lib/antlr4-runtime-4.5.3.jar
> file:/opt/golfbreaks/spark-jars/./lib/aopalliance-repackaged-2.4.0-b34.jar
> file:/opt/golfbreaks/spark-jars/./lib/apacheds-i18n-2.0.0-M15.jar
> file:/opt/golfbreaks/spark-jars/./lib/apacheds-kerberos-codec-2.0.0-M15.jar
> file:/opt/golfbreaks/spark-jars/./lib/api-asn1-api-1.0.0-M20.jar
> file:/opt/golfbreaks/spark-jars/./lib/api-util-1.0.0-M20.jar
> file:/opt/golfbreaks/spark-jars/./lib/asm-3.1.jar
> file:/opt/golfbreaks/spark-jars/./lib/avro-1.7.6-cdh5.7.0.jar
> file:/opt/golfbreaks/spark-jars/./lib/avro-ipc-1.7.6-cdh5.7.0.jar
> file:/opt/golfbreaks/spark-jars/./lib/avro-ipc-1.7.6-cdh5.7.0-tests.jar
> file:/opt/golfbreaks/spark-jars/./lib/avro-mapred-1.7.6-cdh5.7.0-hadoop2.jar
> file:/opt/golfbreaks/spark-jars/./lib/aws-java-sdk-core-1.10.6.jar
> file:/opt/golfbreaks/spark-jars/./lib/aws-java-sdk-kms-1.10.6.jar
> file:/opt/golfbreaks/spark-jars/./lib/aws-java-sdk-s3-1.10.6.jar
> file:/opt/golfbreaks/spark-jars/./lib/chill_2.11-0.8.0.jar
> file:/opt/golfbreaks/spark-jars/./lib/chill-java-0.8.0.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-beanutils-1.7.0.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-beanutils-core-1.8.0.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-cli-1.2.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-codec-1.9.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-collections-3.2.2.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-compiler-2.7.8.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-compress-1.4.1.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-configuration-1.6.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-crypto-1.0.0.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-daemon-1.0.13.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-digester-1.8.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-el-1.0.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-httpclient-3.1.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-io-2.4.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-lang-2.6.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-lang3-3.3.2.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-logging-1.2.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-math-2.1.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-math3-3.4.1.jar
> file:/opt/golfbreaks/spark-jars/./lib/commons-net-2.2.jar
> file:/opt/golfbreaks/spark-jars/./lib/compress-lzf-1.0.3.jar
> file:/opt/golfbreaks/spark-jars/./lib/core-3.1.1.jar
> file:/opt/golfbreaks/spark-jars/./lib/curator-client-2.7.1.jar
> file:/opt/golfbreaks/spark-jars/./lib/curator-framework-2.7.1.jar
> file:/opt/golfbreaks/spark-jars/./lib/curator-recipes-2.7.1.jar
> file:/opt/golfbreaks/spark-jars/./lib/disruptor-3.3.0.jar
> file:/opt/golfbreaks/spark-jars/./lib/fastutil-6.3.jar
> file:/opt/golfbreaks/spark-jars/./lib/findbugs-annotations-1.3.9-1.jar
> file:/opt/golfbreaks/spark-jars/./lib/gson-2.8.0.jar
file:/opt/golfbreaks/spark-jars/./lib/guava-12.0.1.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-annotations-2.6.0-cdh5.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-auth-2.6.0-cdh5.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-aws-2.6.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-client-2.6.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-common-2.6.0-cdh5.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-core-2.6.0-mr1-cdh5.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-hdfs-2.6.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-hdfs-2.6.0-cdh5.9.0-tests.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-app-2.6.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-common-2.6.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-core-2.6.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-yarn-api-2.6.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-yarn-client-2.6.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-yarn-common-2.6.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hadoop-yarn-server-common-2.6.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hamcrest-core-1.3.jar > file:/opt/golfbreaks/spark-jars/./lib/hbase-annotations-1.2.0-cdh5.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hbase-client-1.2.0-cdh5.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hbase-common-1.2.0-cdh5.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hbase-common-1.2.0-cdh5.9.0-tests.jar > file:/opt/golfbreaks/spark-jars/./lib/hbase-hadoop2-compat-1.2.0-cdh5.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hbase-hadoop-compat-1.2.0-cdh5.9.0.jar > 
file:/opt/golfbreaks/spark-jars/./lib/hbase-prefix-tree-1.2.0-cdh5.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hbase-procedure-1.2.0-cdh5.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hbase-protocol-1.2.0-cdh5.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hbase-server-1.2.0-cdh5.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/hbase-spark-1.2.0-cdh5.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/high-scale-lib-1.1.1.jar > file:/opt/golfbreaks/spark-jars/./lib/hk2-api-2.4.0-b34.jar > file:/opt/golfbreaks/spark-jars/./lib/hk2-locator-2.4.0-b34.jar > file:/opt/golfbreaks/spark-jars/./lib/hk2-utils-2.4.0-b34.jar > file:/opt/golfbreaks/spark-jars/./lib/hsqldb-1.8.0.10.jar > file:/opt/golfbreaks/spark-jars/./lib/htrace-core-3.2.0-incubating.jar > file:/opt/golfbreaks/spark-jars/./lib/htrace-core4-4.0.1-incubating.jar > file:/opt/golfbreaks/spark-jars/./lib/httpclient-4.1.2.jar > file:/opt/golfbreaks/spark-jars/./lib/httpcore-4.1.2.jar > file:/opt/golfbreaks/spark-jars/./lib/ivy-2.4.0.jar > file:/opt/golfbreaks/spark-jars/./lib/jackson-annotations-2.6.5.jar > file:/opt/golfbreaks/spark-jars/./lib/jackson-core-2.6.5.jar > file:/opt/golfbreaks/spark-jars/./lib/jackson-core-asl-1.8.8.jar > file:/opt/golfbreaks/spark-jars/./lib/jackson-databind-2.6.5.jar > file:/opt/golfbreaks/spark-jars/./lib/jackson-jaxrs-1.8.8.jar > file:/opt/golfbreaks/spark-jars/./lib/jackson-mapper-asl-1.8.8.jar > file:/opt/golfbreaks/spark-jars/./lib/jackson-module-paranamer-2.6.5.jar > file:/opt/golfbreaks/spark-jars/./lib/jackson-module-scala_2.11-2.6.5.jar > file:/opt/golfbreaks/spark-jars/./lib/jackson-xc-1.8.3.jar > file:/opt/golfbreaks/spark-jars/./lib/jamon-runtime-2.4.1.jar > file:/opt/golfbreaks/spark-jars/./lib/janino-2.7.8.jar > file:/opt/golfbreaks/spark-jars/./lib/jasper-compiler-5.5.23.jar > file:/opt/golfbreaks/spark-jars/./lib/jasper-runtime-5.5.23.jar > file:/opt/golfbreaks/spark-jars/./lib/javassist-3.18.1-GA.jar > 
file:/opt/golfbreaks/spark-jars/./lib/javax.annotation-api-1.2.jar > file:/opt/golfbreaks/spark-jars/./lib/javax.inject-2.4.0-b34.jar > file:/opt/golfbreaks/spark-jars/./lib/java-xmlbuilder-0.4.jar > file:/opt/golfbreaks/spark-jars/./lib/javax.servlet-api-3.1.0.jar > file:/opt/golfbreaks/spark-jars/./lib/javax.ws.rs-api-2.0.1.jar > file:/opt/golfbreaks/spark-jars/./lib/jaxb-api-2.2.2.jar > file:/opt/golfbreaks/spark-jars/./lib/jaxb-impl-2.2.3-1.jar > file:/opt/golfbreaks/spark-jars/./lib/jcl-over-slf4j-1.7.5.jar > file:/opt/golfbreaks/spark-jars/./lib/jcodings-1.0.8.jar > file:/opt/golfbreaks/spark-jars/./lib/jersey-client-2.22.2.jar > file:/opt/golfbreaks/spark-jars/./lib/jersey-common-2.22.2.jar > file:/opt/golfbreaks/spark-jars/./lib/jersey-container-servlet-2.22.2.jar > file:/opt/golfbreaks/spark-jars/./lib/jersey-container-servlet-core-2.22.2.jar > file:/opt/golfbreaks/spark-jars/./lib/jersey-core-1.9.jar > file:/opt/golfbreaks/spark-jars/./lib/jersey-guava-2.22.2.jar > file:/opt/golfbreaks/spark-jars/./lib/jersey-json-1.9.jar > file:/opt/golfbreaks/spark-jars/./lib/jersey-media-jaxb-2.22.2.jar > file:/opt/golfbreaks/spark-jars/./lib/jersey-server-1.9.jar > file:/opt/golfbreaks/spark-jars/./lib/jersey-server-2.22.2.jar > file:/opt/golfbreaks/spark-jars/./lib/jets3t-0.9.0.jar > file:/opt/golfbreaks/spark-jars/./lib/jettison-1.1.jar > file:/opt/golfbreaks/spark-jars/./lib/jetty-6.1.26.cloudera.4.jar > file:/opt/golfbreaks/spark-jars/./lib/jetty-sslengine-6.1.26.cloudera.4.jar > file:/opt/golfbreaks/spark-jars/./lib/jetty-util-6.1.26.cloudera.4.jar > file:/opt/golfbreaks/spark-jars/./lib/joni-2.1.2.jar > file:/opt/golfbreaks/spark-jars/./lib/jsch-0.1.42.jar > file:/opt/golfbreaks/spark-jars/./lib/json4s-ast_2.11-3.5.0.jar > file:/opt/golfbreaks/spark-jars/./lib/json4s-core_2.11-3.5.0.jar > file:/opt/golfbreaks/spark-jars/./lib/json4s-jackson_2.11-3.2.11.jar > file:/opt/golfbreaks/spark-jars/./lib/json4s-native_2.11-3.5.0.jar > 
file:/opt/golfbreaks/spark-jars/./lib/json4s-scalap_2.11-3.5.0.jar > file:/opt/golfbreaks/spark-jars/./lib/jsp-2.1-6.1.14.jar > file:/opt/golfbreaks/spark-jars/./lib/jsp-api-2.1-6.1.14.jar > file:/opt/golfbreaks/spark-jars/./lib/jsp-api-2.1.jar > file:/opt/golfbreaks/spark-jars/./lib/jsr305-1.3.9.jar > file:/opt/golfbreaks/spark-jars/./lib/jul-to-slf4j-1.7.5.jar > file:/opt/golfbreaks/spark-jars/./lib/junit-4.12.jar > file:/opt/golfbreaks/spark-jars/./lib/kafka_2.11-0.9.0-kafka-2.0.0.jar > file:/opt/golfbreaks/spark-jars/./lib/kafka-clients-0.9.0-kafka-2.0.0.jar > file:/opt/golfbreaks/spark-jars/./lib/kryo-shaded-3.0.3.jar > file:/opt/golfbreaks/spark-jars/./lib/leveldbjni-all-1.8.jar > file:/opt/golfbreaks/spark-jars/./lib/log4j-1.2.17.jar > file:/opt/golfbreaks/spark-jars/./lib/lz4-1.3.0.jar > file:/opt/golfbreaks/spark-jars/./lib/mesos-0.21.1-shaded-protobuf.jar > file:/opt/golfbreaks/spark-jars/./lib/metrics-core-2.2.0.jar > file:/opt/golfbreaks/spark-jars/./lib/metrics-core-3.1.2.jar > file:/opt/golfbreaks/spark-jars/./lib/metrics-graphite-3.1.2.jar > file:/opt/golfbreaks/spark-jars/./lib/metrics-json-3.1.2.jar > file:/opt/golfbreaks/spark-jars/./lib/metrics-jvm-3.1.2.jar > file:/opt/golfbreaks/spark-jars/./lib/minlog-1.3.0.jar > file:/opt/golfbreaks/spark-jars/./lib/netty-3.8.0.Final.jar > file:/opt/golfbreaks/spark-jars/./lib/netty-all-4.0.29.Final.jar > file:/opt/golfbreaks/spark-jars/./lib/objenesis-2.1.jar > file:/opt/golfbreaks/spark-jars/./lib/oro-2.0.8.jar > file:/opt/golfbreaks/spark-jars/./lib/osgi-resource-locator-1.0.1.jar > file:/opt/golfbreaks/spark-jars/./lib/paranamer-2.8.jar > file:/opt/golfbreaks/spark-jars/./lib/parquet-avro-1.5.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/parquet-column-1.5.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/parquet-common-1.5.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/parquet-encoding-1.5.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/parquet-format-2.1.0-cdh5.7.0.jar > 
file:/opt/golfbreaks/spark-jars/./lib/parquet-hadoop-1.5.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/parquet-jackson-1.5.0-cdh5.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/protobuf-java-2.5.0.jar > file:/opt/golfbreaks/spark-jars/./lib/py4j-0.10.3.jar > file:/opt/golfbreaks/spark-jars/./lib/pyrolite-4.13.jar > file:/opt/golfbreaks/spark-jars/./lib/RoaringBitmap-0.5.11.jar > file:/opt/golfbreaks/spark-jars/./lib/scala-library-2.11.8.jar > file:/opt/golfbreaks/spark-jars/./lib/scala-parser-combinators_2.11-1.0.4.jar > file:/opt/golfbreaks/spark-jars/./lib/scala-reflect-2.11.7.jar > file:/opt/golfbreaks/spark-jars/./lib/scalatest_2.11-2.2.6.jar > file:/opt/golfbreaks/spark-jars/./lib/scala-xml_2.11-1.0.6.jar > file:/opt/golfbreaks/spark-jars/./lib/servlet-api-2.5-6.1.14.jar > file:/opt/golfbreaks/spark-jars/./lib/servlet-api-2.5.jar > file:/opt/golfbreaks/spark-jars/./lib/slf4j-api-1.7.5.jar > file:/opt/golfbreaks/spark-jars/./lib/slf4j-log4j12-1.7.5.jar > file:/opt/golfbreaks/spark-jars/./lib/snappy-java-1.0.4.1.jar > file:/opt/golfbreaks/spark-jars/./lib/spark-catalyst_2.11-2.0.0.cloudera1.jar > file:/opt/golfbreaks/spark-jars/./lib/spark-core_2.11-2.0.0.cloudera1.jar > file:/opt/golfbreaks/spark-jars/./lib/spark-launcher_2.11-2.0.0.cloudera1.jar > file:/opt/golfbreaks/spark-jars/./lib/spark-network-common_2.11-2.0.0.cloudera1.jar > file:/opt/golfbreaks/spark-jars/./lib/spark-network-shuffle_2.11-2.0.0.cloudera1.jar > file:/opt/golfbreaks/spark-jars/./lib/spark-sketch_2.11-2.0.0.cloudera1.jar > file:/opt/golfbreaks/spark-jars/./lib/spark-sql_2.11-2.0.0.cloudera1.jar > file:/opt/golfbreaks/spark-jars/./lib/spark-streaming_2.11-2.0.0.cloudera1.jar > file:/opt/golfbreaks/spark-jars/./lib/spark-streaming-kafka-0-8_2.11-2.0.0.cloudera1.jar > file:/opt/golfbreaks/spark-jars/./lib/spark-tags_2.11-2.0.0.cloudera1.jar > file:/opt/golfbreaks/spark-jars/./lib/spark-unsafe_2.11-2.0.0.cloudera1.jar > file:/opt/golfbreaks/spark-jars/./lib/stax-api-1.0-2.jar > 
file:/opt/golfbreaks/spark-jars/./lib/stream-2.7.0.jar > file:/opt/golfbreaks/spark-jars/./lib/univocity-parsers-2.1.1.jar > file:/opt/golfbreaks/spark-jars/./lib/unused-1.0.0.jar > file:/opt/golfbreaks/spark-jars/./lib/validation-api-1.1.0.Final.jar > file:/opt/golfbreaks/spark-jars/./lib/xbean-asm5-shaded-4.4.jar > file:/opt/golfbreaks/spark-jars/./lib/xercesImpl-2.9.1.jar > file:/opt/golfbreaks/spark-jars/./lib/xml-apis-1.3.04.jar > file:/opt/golfbreaks/spark-jars/./lib/xmlenc-0.52.jar > file:/opt/golfbreaks/spark-jars/./lib/xz-1.0.jar > file:/opt/golfbreaks/spark-jars/./lib/zkclient-0.7.jar > file:/opt/golfbreaks/spark-jars/./lib/zookeeper-3.4.5-cdh5.9.0.jar > 17/03/10 19:19:01 INFO spark.SparkContext: Running Spark version > 2.0.0.cloudera1 > 17/03/10 19:19:01 INFO spark.SecurityManager: Changing view acls to: hdfs > 17/03/10 19:19:01 INFO spark.SecurityManager: Changing modify acls to: hdfs > 17/03/10 19:19:01 INFO spark.SecurityManager: Changing view acls groups to: > 17/03/10 19:19:01 INFO spark.SecurityManager: Changing modify acls groups to: > 17/03/10 19:19:01 INFO spark.SecurityManager: SecurityManager: authentication > disabled; ui acls disabled; users with view permissions: Set(hdfs); groups > with view permissions: Set(); users with modify permissions: Set(hdfs); > groups with modify permissions: Set() > 17/03/10 19:19:02 INFO util.Utils: Successfully started service 'sparkDriver' > on port 44306. 
> 17/03/10 19:19:02 INFO spark.SparkEnv: Registering MapOutputTracker > 17/03/10 19:19:02 INFO spark.SparkEnv: Registering BlockManagerMaster > 17/03/10 19:19:02 INFO storage.DiskBlockManager: Created local directory at > /tmp/blockmgr-26ec5da2-b735-4402-832b-10446069948d > 17/03/10 19:19:02 INFO memory.MemoryStore: MemoryStore started with capacity > 2004.6 MB > 17/03/10 19:19:02 INFO spark.SparkEnv: Registering OutputCommitCoordinator > 17/03/10 19:19:02 INFO util.log: Logging initialized @2045ms > 17/03/10 19:19:02 INFO server.Server: jetty-9.2.z-SNAPSHOT > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@7c310a67{/jobs,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@77a22fe3{/jobs/json,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@38f61d71{/jobs/job,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@260702ee{/jobs/job/json,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@16516ac3{/stages,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@3b3833a7{/stages/json,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@66c73750{/stages/stage,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@210099e7{/stages/stage/json,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2e781d9e{/stages/pool,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@4fe087e7{/stages/pool/json,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@1dfa77a9{/storage,null,AVAILABLE} > 17/03/10 19:19:02 INFO 
handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@26c6079f{/storage/json,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@131e6b9c{/storage/rdd,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@322ad892{/storage/rdd/json,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@1af072f9{/environment,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@200c4740{/environment/json,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@619cb30{/executors,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@76abf71{/executors/json,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@713e9784{/executors/threadDump,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@444d9531{/executors/threadDump/json,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@417de6ff{/static,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@30c890f0{/,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@3fa39595{/api,null,AVAILABLE} > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@cb189d7{/stages/stage/kill,null,AVAILABLE} > 17/03/10 19:19:02 INFO server.ServerConnector: Started > ServerConnector@2323dd3b{HTTP/1.1}{0.0.0.0:4040} > 17/03/10 19:19:02 INFO server.Server: Started @2142ms > 17/03/10 19:19:02 INFO util.Utils: Successfully started service 'SparkUI' on > port 4040. 
> 17/03/10 19:19:02 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at > http://84.40.61.250:4040 > 17/03/10 19:19:02 INFO spark.SparkContext: Added JAR > file:/opt/golfbreaks/spark-jars/streaming-1.0-jar-with-dependencies.jar at > spark://84.40.61.250:44306/jars/streaming-1.0-jar-with-dependencies.jar with > timestamp 1489173542595 > 17/03/10 19:19:02 INFO executor.Executor: Starting executor ID driver on host > localhost > 17/03/10 19:19:02 INFO util.Utils: Successfully started service > 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40903. > 17/03/10 19:19:02 INFO netty.NettyBlockTransferService: Server created on > 84.40.61.250:40903 > 17/03/10 19:19:02 INFO storage.BlockManager: external shuffle service port = > 7337 > 17/03/10 19:19:02 INFO storage.BlockManagerMaster: Registering BlockManager > BlockManagerId(driver, 84.40.61.250, 40903) > 17/03/10 19:19:02 INFO storage.BlockManagerMasterEndpoint: Registering block > manager 84.40.61.250:40903 with 2004.6 MB RAM, BlockManagerId(driver, > 84.40.61.250, 40903) > 17/03/10 19:19:02 INFO storage.BlockManagerMaster: Registered BlockManager > BlockManagerId(driver, 84.40.61.250, 40903) > 17/03/10 19:19:02 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2c94e76a{/metrics/json,null,AVAILABLE} > 17/03/10 19:19:03 INFO scheduler.EventLoggingListener: Logging events to > hdfs://instance-26765.bigstep.io:8020/user/spark/spark2ApplicationHistory/local-1489173542628 > 17/03/10 19:19:03 INFO scheduler.ReceiverTracker: Starting 1 receivers > 17/03/10 19:19:03 INFO scheduler.ReceiverTracker: ReceiverTracker started > 17/03/10 19:19:03 INFO kafka.KafkaInputDStream: Slide time = 1000 ms > 17/03/10 19:19:03 INFO kafka.KafkaInputDStream: Storage level = Serialized 1x > Replicated > 17/03/10 19:19:03 INFO kafka.KafkaInputDStream: Checkpoint interval = null > 17/03/10 19:19:03 INFO kafka.KafkaInputDStream: Remember interval = 1000 ms > 17/03/10 19:19:03 INFO kafka.KafkaInputDStream: 
Initialized and validated > org.apache.spark.streaming.kafka.KafkaInputDStream@732e70a0 > 17/03/10 19:19:03 INFO dstream.FilteredDStream: Slide time = 1000 ms > 17/03/10 19:19:03 INFO dstream.FilteredDStream: Storage level = Serialized 1x > Replicated > 17/03/10 19:19:03 INFO dstream.FilteredDStream: Checkpoint interval = null > 17/03/10 19:19:03 INFO dstream.FilteredDStream: Remember interval = 1000 ms > 17/03/10 19:19:03 INFO dstream.FilteredDStream: Initialized and validated > org.apache.spark.streaming.dstream.FilteredDStream@4f11189a > 17/03/10 19:19:03 INFO dstream.MappedDStream: Slide time = 1000 ms > 17/03/10 19:19:03 INFO dstream.MappedDStream: Storage level = Serialized 1x > Replicated > 17/03/10 19:19:03 INFO dstream.MappedDStream: Checkpoint interval = null > 17/03/10 19:19:03 INFO dstream.MappedDStream: Remember interval = 1000 ms > 17/03/10 19:19:03 INFO dstream.MappedDStream: Initialized and validated > org.apache.spark.streaming.dstream.MappedDStream@4ccc69fe > 17/03/10 19:19:03 INFO dstream.TransformedDStream: Slide time = 1000 ms > 17/03/10 19:19:03 INFO dstream.TransformedDStream: Storage level = Serialized > 1x Replicated > 17/03/10 19:19:03 INFO dstream.TransformedDStream: Checkpoint interval = null > 17/03/10 19:19:03 INFO dstream.TransformedDStream: Remember interval = 1000 ms > 17/03/10 19:19:03 INFO dstream.TransformedDStream: Initialized and validated > org.apache.spark.streaming.dstream.TransformedDStream@63a2aa36 > 17/03/10 19:19:03 INFO dstream.ForEachDStream: Slide time = 1000 ms > 17/03/10 19:19:03 INFO dstream.ForEachDStream: Storage level = Serialized 1x > Replicated > 17/03/10 19:19:03 INFO dstream.ForEachDStream: Checkpoint interval = null > 17/03/10 19:19:03 INFO dstream.ForEachDStream: Remember interval = 1000 ms > 17/03/10 19:19:03 INFO dstream.ForEachDStream: Initialized and validated > org.apache.spark.streaming.dstream.ForEachDStream@1c5f5c87 > 17/03/10 19:19:03 INFO util.RecurringTimer: Started timer for JobGenerator at > 
time 1489173544000 ms
> 17/03/10 19:19:03 INFO scheduler.JobGenerator: Started JobGenerator at 1489173544000 ms
> 17/03/10 19:19:03 INFO scheduler.JobScheduler: Started JobScheduler
> 17/03/10 19:19:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@37f7ce71{/streaming,null,AVAILABLE}
> 17/03/10 19:19:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7ee19841{/streaming/json,null,AVAILABLE}
> 17/03/10 19:19:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5e6fe928{/streaming/batch,null,AVAILABLE}
> 17/03/10 19:19:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@36099a63{/streaming/batch/json,null,AVAILABLE}
> 17/03/10 19:19:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7f97c0e9{/static/streaming,null,AVAILABLE}
> 17/03/10 19:19:03 INFO streaming.StreamingContext: StreamingContext started
> 17/03/10 19:19:03 INFO scheduler.ReceiverTracker: Receiver 0 started
> 17/03/10 19:19:03 INFO scheduler.DAGScheduler: Got job 0 (start at Test.scala:92) with 1 output partitions
> 17/03/10 19:19:03 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (start at Test.scala:92)
> 17/03/10 19:19:03 INFO scheduler.DAGScheduler: Parents of final stage: List()
> 17/03/10 19:19:03 INFO scheduler.DAGScheduler: Missing parents: List()
> 17/03/10 19:19:03 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (Receiver 0 ParallelCollectionRDD[0] at makeRDD at ReceiverTracker.scala:610), which has no missing parents
> 17/03/10 19:19:04 ERROR scheduler.JobScheduler: Error generating jobs for time 1489173544000 ms
> java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.hbase.mapreduce.TableOutputFormat not found
> at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2199)
> at org.apache.hadoop.mapreduce.task.JobContextImpl.getOutputFormatClass(JobContextImpl.java:232)
> at
org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1087) > at > org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1078) > at > org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1078) > at > org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) > at > org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) > at org.apache.spark.rdd.RDD.withScope(RDD.scala:358) > at > org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1078) > at com.golfbreaks.spark.streaming.Test$$anonfun$4.apply(Test.scala:88) > at com.golfbreaks.spark.streaming.Test$$anonfun$4.apply(Test.scala:53) > at > org.apache.spark.streaming.dstream.DStream$$anonfun$transform$1$$anonfun$apply$21.apply(DStream.scala:666) > at > org.apache.spark.streaming.dstream.DStream$$anonfun$transform$1$$anonfun$apply$21.apply(DStream.scala:666) > at > org.apache.spark.streaming.dstream.DStream$$anonfun$transform$2$$anonfun$5.apply(DStream.scala:680) > at > org.apache.spark.streaming.dstream.DStream$$anonfun$transform$2$$anonfun$5.apply(DStream.scala:678) > at > org.apache.spark.streaming.dstream.TransformedDStream.compute(TransformedDStream.scala:46) > at > org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:341) > at > org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:341) > at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58) > at > org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:340) > at > org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:340) > at > 
org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:415) > at > org.apache.spark.streaming.dstream.TransformedDStream.createRDDWithLocalProperties(TransformedDStream.scala:65) > at > org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:335) > at > org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:333) > at scala.Option.orElse(Option.scala:289) > at > org.apache.spark.streaming.dstream.DStream.getOrCompute(DStream.scala:330) > at > org.apache.spark.streaming.dstream.ForEachDStream.generateJob(ForEachDStream.scala:48) > at > org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:117) > at > org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:116) > at > scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241) > at > scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241) > at > scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) > at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48) > at > scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241) > at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104) > at > org.apache.spark.streaming.DStreamGraph.generateJobs(DStreamGraph.scala:116) > at > org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$3.apply(JobGenerator.scala:249) > at > org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$3.apply(JobGenerator.scala:247) > at scala.util.Try$.apply(Try.scala:192) > at > org.apache.spark.streaming.scheduler.JobGenerator.generateJobs(JobGenerator.scala:247) > at > org.apache.spark.streaming.scheduler.JobGenerator.org$apache$spark$streaming$scheduler$JobGenerator$$processEvent(JobGenerator.scala:183) > at > org.apache.spark.streaming.scheduler.JobGenerator$$anon$1.onReceive(JobGenerator.scala:89) > at > 
org.apache.spark.streaming.scheduler.JobGenerator$$anon$1.onReceive(JobGenerator.scala:88)
>   at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
> Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hbase.mapreduce.TableOutputFormat not found
>   at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
>   at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197)
>   ... 44 more
> Exception in thread "main" java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.hbase.mapreduce.TableOutputFormat not found
>   at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2199)
>   at org.apache.hadoop.mapreduce.task.JobContextImpl.getOutputFormatClass(JobContextImpl.java:232)
>   at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1087)
>   at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1078)
>   at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1078)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>   at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
>   at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1078)
>   at com.golfbreaks.spark.streaming.Test$$anonfun$4.apply(Test.scala:88)
>   at com.golfbreaks.spark.streaming.Test$$anonfun$4.apply(Test.scala:53)
>   at org.apache.spark.streaming.dstream.DStream$$anonfun$transform$1$$anonfun$apply$21.apply(DStream.scala:666)
>   at org.apache.spark.streaming.dstream.DStream$$anonfun$transform$1$$anonfun$apply$21.apply(DStream.scala:666)
>   at org.apache.spark.streaming.dstream.DStream$$anonfun$transform$2$$anonfun$5.apply(DStream.scala:680)
>   at org.apache.spark.streaming.dstream.DStream$$anonfun$transform$2$$anonfun$5.apply(DStream.scala:678)
>   at org.apache.spark.streaming.dstream.TransformedDStream.compute(TransformedDStream.scala:46)
>   at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:341)
>   at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:341)
>   at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
>   at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:340)
>   at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:340)
>   at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:415)
>   at org.apache.spark.streaming.dstream.TransformedDStream.createRDDWithLocalProperties(TransformedDStream.scala:65)
>   at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:335)
>   at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:333)
>   at scala.Option.orElse(Option.scala:289)
>   at org.apache.spark.streaming.dstream.DStream.getOrCompute(DStream.scala:330)
>   at org.apache.spark.streaming.dstream.ForEachDStream.generateJob(ForEachDStream.scala:48)
>   at org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:117)
>   at org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:116)
>   at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
>   at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
>   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
>   at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
>   at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
>   at org.apache.spark.streaming.DStreamGraph.generateJobs(DStreamGraph.scala:116)
>   at org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$3.apply(JobGenerator.scala:249)
>   at org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$3.apply(JobGenerator.scala:247)
>   at scala.util.Try$.apply(Try.scala:192)
>   at org.apache.spark.streaming.scheduler.JobGenerator.generateJobs(JobGenerator.scala:247)
>   at org.apache.spark.streaming.scheduler.JobGenerator.org$apache$spark$streaming$scheduler$JobGenerator$$processEvent(JobGenerator.scala:183)
>   at org.apache.spark.streaming.scheduler.JobGenerator$$anon$1.onReceive(JobGenerator.scala:89)
>   at org.apache.spark.streaming.scheduler.JobGenerator$$anon$1.onReceive(JobGenerator.scala:88)
>   at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
> Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hbase.mapreduce.TableOutputFormat not found
>   at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
>   at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197)
>   ... 44 more
> 17/03/10 19:19:04 INFO streaming.StreamingContext: Invoking stop(stopGracefully=false) from shutdown hook
> 17/03/10 19:19:04 INFO scheduler.ReceiverTracker: Sent stop signal to all 1 receivers
> 17/03/10 19:19:04 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 71.3 KB, free 2004.5 MB)
> 17/03/10 19:19:04 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 25.9 KB, free 2004.5 MB)
> 17/03/10 19:19:04 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 84.40.61.250:40903 (size: 25.9 KB, free: 2004.6 MB)
> 17/03/10 19:19:04 INFO spark.SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1012
> 17/03/10 19:19:04 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (Receiver 0 ParallelCollectionRDD[0] at makeRDD at ReceiverTracker.scala:610)
> 17/03/10 19:19:04 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
> 17/03/10 19:19:04 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7149 bytes)
> 17/03/10 19:19:04 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
> 17/03/10 19:19:04 INFO executor.Executor: Fetching spark://84.40.61.250:44306/jars/streaming-1.0-jar-with-dependencies.jar with timestamp 1489173542595
> 17/03/10 19:19:04 INFO client.TransportClientFactory: Successfully created connection to /84.40.61.250:44306 after 47 ms (0 ms spent in bootstraps)
> 17/03/10 19:19:04 INFO util.Utils: Fetching spark://84.40.61.250:44306/jars/streaming-1.0-jar-with-dependencies.jar to /tmp/spark-0240c93e-4bd3-4c59-b56c-40c5b6499c44/userFiles-86a0ab49-294c-40f2-88a5-c6ff6514116b/fetchFileTemp3828224251343929630.tmp
> 17/03/10 19:19:05 ERROR scheduler.JobScheduler: Error generating jobs for time 1489173545000 ms
> java.lang.RuntimeException: java.lang.ClassNotFoundException: Class
org.apache.hadoop.hbase.mapreduce.TableOutputFormat not found
> (stack trace identical to the one above, through "Caused by: java.lang.ClassNotFoundException ... 44 more")
> 17/03/10 19:19:05 INFO executor.Executor: Adding file:/tmp/spark-0240c93e-4bd3-4c59-b56c-40c5b6499c44/userFiles-86a0ab49-294c-40f2-88a5-c6ff6514116b/streaming-1.0-jar-with-dependencies.jar to class loader
> 17/03/10 19:19:05 INFO util.RecurringTimer: Started timer for BlockGenerator at time 1489173545400
> 17/03/10 19:19:05 INFO receiver.BlockGenerator: Started BlockGenerator
> 17/03/10 19:19:05 INFO receiver.BlockGenerator: Started block pushing thread
> 17/03/10 19:19:05 INFO receiver.ReceiverSupervisorImpl: Stopping receiver with message: Registered unsuccessfully because Driver refused to start receiver 0:
> 17/03/10 19:19:05 WARN receiver.ReceiverSupervisorImpl: Skip stopping receiver because it has not yet stared
> 17/03/10 19:19:05 INFO receiver.BlockGenerator: Stopping BlockGenerator
> 17/03/10 19:19:05 INFO util.RecurringTimer: Stopped timer for BlockGenerator after time 1489173545600
> 17/03/10 19:19:05 INFO receiver.BlockGenerator: Waiting for block pushing thread to terminate
> 17/03/10 19:19:05 INFO receiver.BlockGenerator: Pushing out the last 0 blocks
> 17/03/10 19:19:05 INFO receiver.BlockGenerator: Stopped block pushing thread
> 17/03/10 19:19:05 INFO receiver.BlockGenerator: Stopped BlockGenerator
> 17/03/10 19:19:05 INFO
receiver.ReceiverSupervisorImpl: Waiting for receiver to be stopped
> 17/03/10 19:19:05 INFO receiver.ReceiverSupervisorImpl: Stopped receiver without error
> 17/03/10 19:19:05 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 787 bytes result sent to driver
> 17/03/10 19:19:05 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1313 ms on localhost (executor driver) (1/1)
> 17/03/10 19:19:05 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
> 17/03/10 19:19:05 INFO scheduler.DAGScheduler: ResultStage 0 (start at Test.scala:92) finished in 1.335 s
> 17/03/10 19:19:05 INFO scheduler.ReceiverTracker: All of the receivers have deregistered successfully
> 17/03/10 19:19:05 INFO scheduler.ReceiverTracker: ReceiverTracker stopped
> 17/03/10 19:19:05 INFO scheduler.JobGenerator: Stopping JobGenerator immediately
> 17/03/10 19:19:05 INFO util.RecurringTimer: Stopped timer for JobGenerator after time 1489173545000
> 17/03/10 19:19:05 INFO scheduler.JobGenerator: Stopped JobGenerator
> 17/03/10 19:19:05 INFO scheduler.JobScheduler: Stopped JobScheduler
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@37f7ce71{/streaming,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@5e6fe928{/streaming/batch,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@7f97c0e9{/static/streaming,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO streaming.StreamingContext: StreamingContext stopped successfully
> 17/03/10 19:19:05 INFO spark.SparkContext: Invoking stop() from shutdown hook
> 17/03/10 19:19:05 INFO server.ServerConnector: Stopped ServerConnector@2323dd3b{HTTP/1.1}{0.0.0.0:4040}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@cb189d7{/stages/stage/kill,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@3fa39595{/api,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@30c890f0{/,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@417de6ff{/static,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@444d9531{/executors/threadDump/json,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@713e9784{/executors/threadDump,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@76abf71{/executors/json,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@619cb30{/executors,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@200c4740{/environment/json,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1af072f9{/environment,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@322ad892{/storage/rdd/json,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@131e6b9c{/storage/rdd,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@26c6079f{/storage/json,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1dfa77a9{/storage,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4fe087e7{/stages/pool/json,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2e781d9e{/stages/pool,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@210099e7{/stages/stage/json,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@66c73750{/stages/stage,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@3b3833a7{/stages/json,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@16516ac3{/stages,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@260702ee{/jobs/job/json,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@38f61d71{/jobs/job,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@77a22fe3{/jobs/json,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@7c310a67{/jobs,null,UNAVAILABLE}
> 17/03/10 19:19:05 INFO ui.SparkUI: Stopped Spark web UI at http://84.40.61.250:4040
> 17/03/10 19:19:05 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
> 17/03/10 19:19:05 INFO memory.MemoryStore: MemoryStore cleared
> 17/03/10 19:19:05 INFO storage.BlockManager: BlockManager stopped
> 17/03/10 19:19:05 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
> 17/03/10 19:19:05 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
> 17/03/10 19:19:05 INFO spark.SparkContext: Successfully stopped SparkContext
> 17/03/10 19:19:05 INFO util.ShutdownHookManager: Shutdown hook called
> 17/03/10 19:19:05 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-0240c93e-4bd3-4c59-b56c-40c5b6499c44

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
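Editor's note on the failure above: the driver never loads the HBase jars, so `org.apache.hadoop.hbase.mapreduce.TableOutputFormat` cannot be resolved when `saveAsNewAPIHadoopDataset` looks up the output format. One likely contributing factor is that `spark.yarn.jars` and `--jars` both expect a COMMA-separated list, while the `spark.yarn.jars` value in the defaults file above joins the two globs with a colon. The sketch below is a hedged workaround, not a confirmed fix: the HBase jar directory is copied from the config above, and the `spark2-submit` launcher name and the exact jar set are assumptions about this CDH install.

```shell
# Build a comma-separated list of HBase jars. The directory is the one
# referenced (colon-joined) by spark.yarn.jars in the log above.
HBASE_JAR_DIR=/opt/cloudera/parcels/SPARK2-2.0.0.cloudera1-1.cdh5.7.0.p0.113931/lib/hbase/jars
HBASE_JARS=$(ls "$HBASE_JAR_DIR"/*.jar | paste -sd, -)

# Ship the HBase jars explicitly; per the spark-submit docs, --jars adds
# them to both the driver and executor classpaths.
spark2-submit \
  --class com.golfbreaks.spark.streaming.Test \
  --jars "$HBASE_JARS" \
  /opt/golfbreaks/spark-jars/streaming-1.0-jar-with-dependencies.jar
```

Alternatively, the colon-separated `spark.yarn.jars` entry could be rewritten as a comma-separated list in spark-defaults.conf; either way the point is that each jar path must be its own list element rather than part of one colon-joined string.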