Yes. With the new Spark Structured Streaming you can keep submitting streaming jobs against the same SparkContext in different requests (or create a new SparkContext in a request if required). The SparkJob implementation gets a handle to the SparkContext, which will be either the existing one or a new one depending on the REST API call -- see its GitHub page for details on transient vs. persistent SparkContexts. A minimal sketch of running multiple streaming queries against one long-lived context follows below.
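For what it's worth, here is a rough sketch (plain Structured Streaming API, not tied to any particular REST server) of two queries started at different times against the same long-lived SparkSession/SparkContext; the "rate" source and "console" sink are just built-ins used for illustration:

    import org.apache.spark.sql.SparkSession

    object MultiQuerySketch {
      def main(args: Array[String]): Unit = {
        // One long-lived SparkSession (and SparkContext) shared by all queries.
        val spark = SparkSession.builder()
          .appName("structured-streaming-shared-context")
          .master("local[*]")
          .getOrCreate()

        // First streaming query, e.g. started by one request.
        val query1 = spark.readStream
          .format("rate")          // built-in test source
          .load()
          .writeStream
          .format("console")
          .start()

        // Later, a second query can be started against the *same* session;
        // nothing has to be stopped or restarted.
        val query2 = spark.readStream
          .format("rate")
          .load()
          .writeStream
          .format("console")
          .start()

        spark.streams.awaitAnyTermination()
      }
    }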
With the old Spark Streaming (DStream) model, you cannot add new DStreams once the StreamingContext has started, which has always been a limitation of that model. So you can submit jobs against the same context, but only until the last one starts the StreamingContext; the second sketch below illustrates this.
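A minimal illustration of that limitation (the socket source on localhost:9999 is just a placeholder for whatever source you actually use):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object DStreamLimitationSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("dstream-limitation").setMaster("local[2]")
        val ssc = new StreamingContext(conf, Seconds(1))

        // All DStreams must be declared *before* start().
        val lines = ssc.socketTextStream("localhost", 9999)
        lines.count().print()

        ssc.start()

        // Declaring another DStream here fails with an IllegalStateException
        // (adding new inputs/transformations/outputs after starting a context
        // is not supported), so it is left commented out:
        // val more = ssc.socketTextStream("localhost", 9998)

        ssc.awaitTermination()
      }
    }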
regards
sumedh