Hello everyone,
I am new to Mahout and am trying to run this command:

/Documents/apache-mahout-distribution-0.11.2/bin/mahout spark-itemsimilarity -i spdemo1.txt -o /spout --filter1 like -fc 1 -ic 2
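
For reference, my input file contains simple text-delimited triples laid out the way those options describe (row ID in column 0 = user, column 1 = the action that --filter1 matches, column 2 = item). The IDs below are made-up placeholders, just to show the shape of the data:

u1	like	item1
u1	like	item2
u2	like	item1
u3	like	item3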

Before running the command, I uploaded the input file to HDFS:

bin/hadoop dfs -put spdemo.txt /spdemo1.txt
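
To rule out a missing input file, the upload can be verified with the standard HDFS listing command (plain Hadoop shell, nothing Mahout-specific; note that newer Hadoop prefers "hdfs dfs" over the deprecated "hadoop dfs", though both should work):

bin/hadoop fs -ls /spdemo1.txt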

but I am getting the long list of errors below. Do I need to install Spark too?


SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/Users/rohitjain/Documents/apache-mahout-distribution-0.11.2/mahout-examples-0.11.2-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/Users/rohitjain/Documents/apache-mahout-distribution-0.11.2/mahout-mr-0.11.2-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/Users/rohitjain/Documents/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/Users/rohitjain/Documents/apache-mahout-distribution-0.11.2/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

16/04/14 10:53:09 WARN SparkConf: The configuration key 'spark.kryoserializer.buffer.mb' has been deprecated as of Spark 1.4 and may be removed in the future. Please use spark.kryoserializer.buffer instead. The default value for spark.kryoserializer.buffer.mb was previously specified as '0.064'. Fractional values are no longer accepted. To specify the equivalent now, one may use '64k'.

16/04/14 10:53:09 WARN SparkConf: The configuration key 'spark.kryoserializer.buffer.mb' has been deprecated as of Spark 1.4 and may be removed in the future. Please use the new key 'spark.kryoserializer.buffer' instead.

16/04/14 10:53:10 INFO SparkContext: Running Spark version 1.6.1

16/04/14 10:53:10 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

16/04/14 10:53:10 INFO SecurityManager: Changing view acls to: rohitjain

16/04/14 10:53:10 INFO SecurityManager: Changing modify acls to: rohitjain

16/04/14 10:53:10 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(rohitjain); users with modify permissions: Set(rohitjain)

16/04/14 10:53:11 INFO Utils: Successfully started service 'sparkDriver' on port 64062.

16/04/14 10:53:11 INFO Slf4jLogger: Slf4jLogger started

16/04/14 10:53:11 INFO Remoting: Starting remoting

16/04/14 10:53:11 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.1.122:64063]

16/04/14 10:53:11 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 64063.

16/04/14 10:53:11 INFO SparkEnv: Registering MapOutputTracker

16/04/14 10:53:11 INFO SparkEnv: Registering BlockManagerMaster

16/04/14 10:53:11 INFO DiskBlockManager: Created local directory at /private/var/folders/17/vhrqnx4x1n10yqbl1vnch9000000gn/T/blockmgr-6cdfd6f1-56c8-445c-95e0-70305b0f5f5a

16/04/14 10:53:11 INFO MemoryStore: MemoryStore started with capacity 2.4 GB

16/04/14 10:53:12 INFO SparkEnv: Registering OutputCommitCoordinator

16/04/14 10:53:12 INFO Server: jetty-8.y.z-SNAPSHOT

16/04/14 10:53:12 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040

16/04/14 10:53:12 INFO Utils: Successfully started service 'SparkUI' on port 4040.

16/04/14 10:53:12 INFO SparkUI: Started SparkUI at http://192.168.1.122:4040

16/04/14 10:53:12 INFO HttpFileServer: HTTP File server directory is /private/var/folders/17/vhrqnx4x1n10yqbl1vnch9000000gn/T/spark-ad6a714b-6dcb-410c-aef3-2b098695743b/httpd-11f32419-5f8a-4b61-9654-9ea2d32cab5d

16/04/14 10:53:12 INFO HttpServer: Starting HTTP Server

16/04/14 10:53:12 INFO Server: jetty-8.y.z-SNAPSHOT

16/04/14 10:53:12 INFO AbstractConnector: Started SocketConnector@0.0.0.0:64064

16/04/14 10:53:12 INFO Utils: Successfully started service 'HTTP file server' on port 64064.

16/04/14 10:53:12 INFO SparkContext: Added JAR /Users/rohitjain/Documents/apache-mahout-distribution-0.11.2/mahout-hdfs-0.11.2.jar at http://192.168.1.122:64064/jars/mahout-hdfs-0.11.2.jar with timestamp 1460611392248

16/04/14 10:53:12 INFO SparkContext: Added JAR /Users/rohitjain/Documents/apache-mahout-distribution-0.11.2/mahout-math-0.11.2.jar at http://192.168.1.122:64064/jars/mahout-math-0.11.2.jar with timestamp 1460611392253

16/04/14 10:53:12 INFO SparkContext: Added JAR /Users/rohitjain/Documents/apache-mahout-distribution-0.11.2/mahout-math-scala_2.10-0.11.2.jar at http://192.168.1.122:64064/jars/mahout-math-scala_2.10-0.11.2.jar with timestamp 1460611392255

16/04/14 10:53:12 INFO SparkContext: Added JAR /Users/rohitjain/Documents/apache-mahout-distribution-0.11.2/mahout-spark_2.10-0.11.2-dependency-reduced.jar at http://192.168.1.122:64064/jars/mahout-spark_2.10-0.11.2-dependency-reduced.jar with timestamp 1460611392303

16/04/14 10:53:12 INFO SparkContext: Added JAR /Users/rohitjain/Documents/apache-mahout-distribution-0.11.2/mahout-spark_2.10-0.11.2.jar at http://192.168.1.122:64064/jars/mahout-spark_2.10-0.11.2.jar with timestamp 1460611392304

16/04/14 10:53:12 INFO Executor: Starting executor ID driver on host localhost

16/04/14 10:53:12 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 64065.

16/04/14 10:53:12 INFO NettyBlockTransferService: Server created on 64065

16/04/14 10:53:12 INFO BlockManagerMaster: Trying to register BlockManager

16/04/14 10:53:12 INFO BlockManagerMasterEndpoint: Registering block manager localhost:64065 with 2.4 GB RAM, BlockManagerId(driver, localhost, 64065)

16/04/14 10:53:12 INFO BlockManagerMaster: Registered BlockManager

java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:317)
    at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:219)
    at org.xerial.snappy.Snappy.<clinit>(Snappy.java:44)
    at org.apache.spark.io.SnappyCompressionCodec$.liftedTree1$1(CompressionCodec.scala:169)
    at org.apache.spark.io.SnappyCompressionCodec$.org$apache$spark$io$SnappyCompressionCodec$$version$lzycompute(CompressionCodec.scala:168)
    at org.apache.spark.io.SnappyCompressionCodec$.org$apache$spark$io$SnappyCompressionCodec$$version(CompressionCodec.scala:168)
    at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:152)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:65)
    at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
    at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
    at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
    at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:63)
    at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1326)
    at org.apache.spark.SparkContext$$anonfun$hadoopFile$1.apply(SparkContext.scala:1014)
    at org.apache.spark.SparkContext$$anonfun$hadoopFile$1.apply(SparkContext.scala:1011)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
    at org.apache.spark.SparkContext.hadoopFile(SparkContext.scala:1011)
    at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:832)
    at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:830)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
    at org.apache.spark.SparkContext.textFile(SparkContext.scala:830)
    at org.apache.mahout.drivers.TDIndexedDatasetReader$class.elementReader(TextDelimitedReaderWriter.scala:61)
    at org.apache.mahout.drivers.TextDelimitedIndexedDatasetReader.elementReader(TextDelimitedReaderWriter.scala:304)
    at org.apache.mahout.drivers.TextDelimitedIndexedDatasetReader.elementReader(TextDelimitedReaderWriter.scala:304)
    at org.apache.mahout.math.indexeddataset.Reader$class.readElementsFrom(ReaderWriter.scala:75)
    at org.apache.mahout.drivers.TextDelimitedIndexedDatasetReader.readElementsFrom(TextDelimitedReaderWriter.scala:304)
    at org.apache.mahout.sparkbindings.SparkEngine$.indexedDatasetDFSReadElements(SparkEngine.scala:382)
    at org.apache.mahout.sparkbindings.SparkEngine$.indexedDatasetDFSReadElements(SparkEngine.scala:39)
    at org.apache.mahout.math.indexeddataset.package$.indexedDatasetDFSReadElements(package.scala:336)
    at org.apache.mahout.drivers.ItemSimilarityDriver$.readIndexedDatasets(ItemSimilarityDriver.scala:152)
    at org.apache.mahout.drivers.ItemSimilarityDriver$.process(ItemSimilarityDriver.scala:201)
    at org.apache.mahout.drivers.ItemSimilarityDriver$$anonfun$main$1.apply(ItemSimilarityDriver.scala:112)
    at org.apache.mahout.drivers.ItemSimilarityDriver$$anonfun$main$1.apply(ItemSimilarityDriver.scala:110)
    at scala.Option.map(Option.scala:145)
    at org.apache.mahout.drivers.ItemSimilarityDriver$.main(ItemSimilarityDriver.scala:110)
    at org.apache.mahout.drivers.ItemSimilarityDriver.main(ItemSimilarityDriver.scala)
Caused by: java.lang.UnsatisfiedLinkError: no snappyjava in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1864)
    at java.lang.Runtime.loadLibrary0(Runtime.java:870)
    at java.lang.System.loadLibrary(System.java:1122)
    at org.xerial.snappy.SnappyNativeLoader.loadLibrary(SnappyNativeLoader.java:52)
    ... 49 more

Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:65)
    at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
    at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
    at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
    at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:63)
    at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1326)
    at org.apache.spark.SparkContext$$anonfun$hadoopFile$1.apply(SparkContext.scala:1014)
    at org.apache.spark.SparkContext$$anonfun$hadoopFile$1.apply(SparkContext.scala:1011)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
    at org.apache.spark.SparkContext.hadoopFile(SparkContext.scala:1011)
    at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:832)
    at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:830)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
    at org.apache.spark.SparkContext.textFile(SparkContext.scala:830)
    at org.apache.mahout.drivers.TDIndexedDatasetReader$class.elementReader(TextDelimitedReaderWriter.scala:61)
    at org.apache.mahout.drivers.TextDelimitedIndexedDatasetReader.elementReader(TextDelimitedReaderWriter.scala:304)
    at org.apache.mahout.drivers.TextDelimitedIndexedDatasetReader.elementReader(TextDelimitedReaderWriter.scala:304)
    at org.apache.mahout.math.indexeddataset.Reader$class.readElementsFrom(ReaderWriter.scala:75)
    at org.apache.mahout.drivers.TextDelimitedIndexedDatasetReader.readElementsFrom(TextDelimitedReaderWriter.scala:304)
    at org.apache.mahout.sparkbindings.SparkEngine$.indexedDatasetDFSReadElements(SparkEngine.scala:382)
    at org.apache.mahout.sparkbindings.SparkEngine$.indexedDatasetDFSReadElements(SparkEngine.scala:39)
    at org.apache.mahout.math.indexeddataset.package$.indexedDatasetDFSReadElements(package.scala:336)
    at org.apache.mahout.drivers.ItemSimilarityDriver$.readIndexedDatasets(ItemSimilarityDriver.scala:152)
    at org.apache.mahout.drivers.ItemSimilarityDriver$.process(ItemSimilarityDriver.scala:201)
    at org.apache.mahout.drivers.ItemSimilarityDriver$$anonfun$main$1.apply(ItemSimilarityDriver.scala:112)
    at org.apache.mahout.drivers.ItemSimilarityDriver$$anonfun$main$1.apply(ItemSimilarityDriver.scala:110)
    at scala.Option.map(Option.scala:145)
    at org.apache.mahout.drivers.ItemSimilarityDriver$.main(ItemSimilarityDriver.scala:110)
    at org.apache.mahout.drivers.ItemSimilarityDriver.main(ItemSimilarityDriver.scala)
Caused by: java.lang.IllegalArgumentException: org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] null
    at org.apache.spark.io.SnappyCompressionCodec$.liftedTree1$1(CompressionCodec.scala:171)
    at org.apache.spark.io.SnappyCompressionCodec$.org$apache$spark$io$SnappyCompressionCodec$$version$lzycompute(CompressionCodec.scala:168)
    at org.apache.spark.io.SnappyCompressionCodec$.org$apache$spark$io$SnappyCompressionCodec$$version(CompressionCodec.scala:168)
    at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:152)
    ... 38 more
Caused by: org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] null
    at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:229)
    at org.xerial.snappy.Snappy.<clinit>(Snappy.java:44)
    at org.apache.spark.io.SnappyCompressionCodec$.liftedTree1$1(CompressionCodec.scala:169)
    ... 41 more

16/04/14 10:53:13 INFO SparkContext: Invoking stop() from shutdown hook

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}

16/04/14 10:53:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}

16/04/14 10:53:13 INFO SparkUI: Stopped Spark web UI at http://192.168.1.122:4040

16/04/14 10:53:13 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!

16/04/14 10:53:13 INFO MemoryStore: MemoryStore cleared

16/04/14 10:53:13 INFO BlockManager: BlockManager stopped

16/04/14 10:53:13 INFO BlockManagerMaster: BlockManagerMaster stopped

16/04/14 10:53:13 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!

16/04/14 10:53:13 INFO SparkContext: Successfully stopped SparkContext

16/04/14 10:53:13 INFO ShutdownHookManager: Shutdown hook called

16/04/14 10:53:13 INFO ShutdownHookManager: Deleting directory /private/var/folders/17/vhrqnx4x1n10yqbl1vnch9000000gn/T/spark-ad6a714b-6dcb-410c-aef3-2b098695743b/httpd-11f32419-5f8a-4b61-9654-9ea2d32cab5d

16/04/14 10:53:13 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.

16/04/14 10:53:13 INFO ShutdownHookManager: Deleting directory /private/var/folders/17/vhrqnx4x1n10yqbl1vnch9000000gn/T/spark-ad6a714b-6dcb-410c-aef3-2b098695743b
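
Reading the trace again, the actual failure looks like the Snappy native library failing to load (root cause: java.lang.UnsatisfiedLinkError: no snappyjava in java.library.path) rather than Spark being absent. While searching I saw suggestions to side-step Snappy by switching Spark's compression codec to lz4. This is only a guess on my part (untested, and I am not sure whether the Mahout driver picks up this file), but would adding the following line to spark-1.6.1-bin-hadoop2.6/conf/spark-defaults.conf be the right kind of workaround, or is there a proper fix for the snappyjava loading problem?

# untested guess: avoid the snappy native library entirely
spark.io.compression.codec lz4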

-- 
Thanks & Regards,

Rohit Jain
Web developer | Consultant
Mob +91 8097283931
