Hi,

I created a Beam pipeline in a Scala program, using the SparkRunner as the runner with sparkMaster=local[4].

However, when executing the following code:

pipeline.run().waitUntilFinish();

The program does not terminate, even though the console displays:

17/05/10 17:45:40 INFO SparkContext: Successfully stopped SparkContext
17/05/10 17:45:40 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
17/05/10 17:45:40 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
17/05/10 17:45:40 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.


Am I using the pipeline in the right way?
Can anybody tell me how to make the program terminate normally?

Best regards,
bluejoe
