> [Venkat] Are you saying - pull in the SharkServer2 code in my standalone
> Spark application (as part of the standalone application process), pass in
> the SparkContext of the standalone app to the SharkServer2 SparkContext at
> startup, and voilà, we get SQL/JDBC interfaces for the RDDs of the
> standalone app that are exposed as tables? Thanks for the clarification.
>

Yeah, that should work, although it is pretty hacky and not officially
supported.  It might be interesting to augment Shark to allow the user to
invoke custom applications using the same SQLContext.  If this is something
you'd have time to implement, I'd be happy to discuss the design further.
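To make the pattern concrete, here is a minimal sketch of "embed the JDBC server in your own application and share its context." It does not use SharkServer2 itself (whose embedding API this thread describes as hacky and unofficial); instead it stands in Spark SQL's Thrift JDBC server, `HiveThriftServer2.startWithContext`, which exposes the same idea. The `Person` record type, the table name `people`, and the Spark 1.3+ DataFrame API are illustrative assumptions, not something from this thread.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

// Hypothetical record type for the RDD we want to expose over JDBC.
case class Person(name: String, age: Int)

object EmbeddedSqlServerExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("standalone-app"))
    val hiveContext = new HiveContext(sc)
    import hiveContext.implicits._

    // Build an RDD inside the standalone application and register it as a table.
    val people = sc.parallelize(Seq(Person("alice", 30), Person("bob", 25)))
    people.toDF().registerTempTable("people")

    // Start the Thrift/JDBC server inside this same process, sharing the same
    // context, so external JDBC clients can query the "people" table.
    HiveThriftServer2.startWithContext(hiveContext)
  }
}
```

Because the server runs in the same JVM and shares the same context as the standalone application, anything registered as a temporary table in that context is visible to JDBC clients connecting to the Thrift server's port.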
