Hi,

I am trying to run the Beam WordCount example on a local Spark setup. When I submit
the sample WordCount jar via spark-submit, the job just hangs at 'INFO
SparkRunner$Evaluator: Evaluating ParMultiDo(ExtractWords)'. It runs fine
when executed directly. Below is the command I used to submit the job in Spark
local mode:

$ ~/spark/bin/spark-submit --class "org.apache.beam.examples.WordCount" \
  --master local[4] target/word-count-beam-0.1.jar --inputFile=./pom.xml \
  --output=csvout --runner=SparkRunner


I have attached the log file for reference. Can anyone please help me figure out
what's going on?


Regards,
Sathish. J


log4j:WARN No appenders could be found for logger (org.apache.beam.sdk.options.PipelineOptionsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/08/01 15:10:52 INFO SparkContext: Running Spark version 2.1.1
17/08/01 15:10:52 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/08/01 15:10:53 INFO SecurityManager: Changing view acls to: sathishjayaraman
17/08/01 15:10:53 INFO SecurityManager: Changing modify acls to: sathishjayaraman
17/08/01 15:10:53 INFO SecurityManager: Changing view acls groups to: 
17/08/01 15:10:53 INFO SecurityManager: Changing modify acls groups to: 
17/08/01 15:10:53 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(sathishjayaraman); groups with view permissions: Set(); users  with modify permissions: Set(sathishjayaraman); groups with modify permissions: Set()
17/08/01 15:10:53 INFO Utils: Successfully started service 'sparkDriver' on port 57704.
17/08/01 15:10:53 INFO SparkEnv: Registering MapOutputTracker
17/08/01 15:10:53 INFO SparkEnv: Registering BlockManagerMaster
17/08/01 15:10:53 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/08/01 15:10:53 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/08/01 15:10:53 INFO DiskBlockManager: Created local directory at /private/var/folders/d3/d1mrkc4s023d3qv1jr6w4cg00000gp/T/blockmgr-c434da8b-59e9-4225-a0f2-bde4becd6c90
17/08/01 15:10:53 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
17/08/01 15:10:53 INFO SparkEnv: Registering OutputCommitCoordinator
17/08/01 15:10:54 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/08/01 15:10:54 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.0.2:4040
17/08/01 15:10:54 INFO SparkContext: Added JAR file:/Users/sathishjayaraman/java_projects/beam-examples/word-count-beam/target/word-count-beam-0.1.jar at spark://192.168.0.2:57704/jars/word-count-beam-0.1.jar with timestamp 1501580454234
17/08/01 15:10:54 INFO Executor: Starting executor ID driver on host localhost
17/08/01 15:10:54 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 57705.
17/08/01 15:10:54 INFO NettyBlockTransferService: Server created on 192.168.0.2:57705
17/08/01 15:10:54 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/08/01 15:10:54 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.0.2, 57705, None)
17/08/01 15:10:54 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.0.2:57705 with 366.3 MB RAM, BlockManagerId(driver, 192.168.0.2, 57705, None)
17/08/01 15:10:54 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.0.2, 57705, None)
17/08/01 15:10:54 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.0.2, 57705, None)
17/08/01 15:10:54 INFO SparkRunner$Evaluator: Entering directly-translatable composite transform: 'WordCount.CountWords/Count.PerElement/Combine.perKey(Count)'
17/08/01 15:10:54 INFO SparkRunner$Evaluator: Entering directly-translatable composite transform: 'WriteCounts/WriteFiles/View.AsIterable'
17/08/01 15:10:54 INFO SparkRunner$Evaluator: Entering directly-translatable composite transform: 'WriteCounts/WriteFiles/Create.Values'
17/08/01 15:10:54 INFO MetricsAccumulator: Instantiated metrics accumulator: org.apache.beam.runners.core.metrics.MetricsContainerStepMap@6a87026
17/08/01 15:10:54 INFO AggregatorsAccumulator: Instantiated aggregators accumulator: 
17/08/01 15:10:54 INFO SparkRunner$Evaluator: Evaluating Read(CompressedSource)
17/08/01 15:10:54 INFO SparkRunner$Evaluator: Evaluating ParMultiDo(ExtractWords)
