Hi,

Is it possible to create a standalone job in Scala using SparkR? If so,
could you point me to the setup process (e.g. the dependencies in SBT and
where to include the JAR files)?
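
For reference, this is roughly the build.sbt I am starting from for the
streaming job (the versions are just the ones I happen to be on, and I am
not sure which R-bridge artifact, if any, would have to be added):

name := "twitter-sentiment"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"              % "1.3.1",
  "org.apache.spark" %% "spark-streaming"         % "1.3.1",
  "org.apache.spark" %% "spark-streaming-twitter" % "1.3.1"
)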

This is my use-case:

1. I have a standalone Spark Streaming job running on my local machine
which streams Twitter data.
2. I have an R script which performs sentiment analysis.

I am looking for a good way to combine these two operations into a single
job that I can run with the "sbt run" command.
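
The streaming side of the job is, simplified, something like the sketch
below (the Twitter credentials, filter terms and batch interval are
placeholders); the comment marks where I would like the R sentiment step
to plug in:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.twitter.TwitterUtils

object TwitterSentimentJob {
  def main(args: Array[String]): Unit = {
    // Local standalone job; the Twitter OAuth keys are assumed to be set
    // as twitter4j system properties before this runs.
    val conf = new SparkConf().setMaster("local[2]").setAppName("TwitterSentiment")
    val ssc  = new StreamingContext(conf, Seconds(10))

    val tweets = TwitterUtils.createStream(ssc, None).map(_.getText)

    tweets.foreachRDD { rdd =>
      // This is where I would like to hand each batch of tweets to the
      // R sentiment-analysis script and get the scores back.
      rdd.foreach(println)
    }

    ssc.start()
    ssc.awaitTermination()
  }
}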

I came across this paper on embedding R into Scala
(http://dahl.byu.edu/software/jvmr/dahl-payne-uppalapati-2013.pdf), but I
was not sure whether that approach would work well within a Spark context.
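
What I had in mind, based on that paper, is roughly the sketch below. The
RInScala calls are only my reading of the jvmr paper, and sentiment.R /
score() stand in for my actual script; what I do not know is whether
creating an embedded R instance inside each partition like this behaves
sensibly under Spark:

import org.apache.spark.rdd.RDD
import org.ddahl.jvmr.RInScala

// Assumption: sentiment.R defines a function score(text) that returns a
// numeric sentiment value. The exact RInScala marshalling calls are taken
// from my reading of the paper and are untested under Spark.
def scoreTweets(tweets: RDD[String]): RDD[(String, Double)] =
  tweets.mapPartitions { iter =>
    val R = RInScala()                        // one embedded R per partition
    R.eval("""source("sentiment.R")""")
    iter.map { text =>
      R.update("tweet", text)                 // push the tweet text into R
      R.eval("s <- score(tweet)")
      (text, R.toVector[Double]("s").head)
    }
  }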

Thanks,
Pawan Venugopal
