If you want the fileStream to start only after a certain event has
happened, why not start the StreamingContext after that event?
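
For example (a rough sketch; the path and batch interval are placeholders,
and waitForReferenceData() is a hypothetical stand-in for however your
application detects the event):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setAppName("DelayedStart")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Define the DStream graph up front; nothing runs until start().
    val lines = ssc.textFileStream("hdfs:///input/dir")
    lines.count().print()

    // Block until the external event (e.g. reference data loaded) fires.
    waitForReferenceData()   // hypothetical application helper

    // Only now does the fileStream begin picking up new files.
    ssc.start()
    ssc.awaitTermination()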

TD

On Sun, May 17, 2015 at 7:51 PM, Haopu Wang <hw...@qilinsoft.com> wrote:

>  I want to use a file stream as input. I looked at the Spark Streaming
> documentation again, and it says a file stream doesn't need a receiver at
> all.
>
> So I'm wondering whether I can control when a specific DStream instance
> starts.
>
>
>  ------------------------------
>
> *From:* Evo Eftimov [mailto:evo.efti...@isecc.com]
> *Sent:* Monday, May 18, 2015 12:39 AM
> *To:* 'Akhil Das'; Haopu Wang
> *Cc:* 'user'
> *Subject:* RE: [SparkStreaming] Is it possible to delay the start of some
> DStream in the application?
>
>
>
> You can make ANY *standard* receiver sleep by implementing a custom
> message deserializer class with a sleep method inside it.
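>
> For example, with the Kafka 0.8 receiver you can plug in a custom
> kafka.serializer.Decoder via KafkaUtils.createStream. A rough sketch
> (the marker file is a hypothetical signal; it must be visible on the
> executor where the receiver runs):
>
>     import java.io.File
>     import kafka.serializer.Decoder
>     import kafka.utils.VerifiableProperties
>
>     class GatedStringDecoder(props: VerifiableProperties = null)
>         extends Decoder[String] {
>       override def fromBytes(bytes: Array[Byte]): String = {
>         // Block each message until the event has happened, which
>         // effectively stalls the receiver.
>         while (!new File("/tmp/reference-data-ready").exists())
>           Thread.sleep(1000)
>         new String(bytes, "UTF-8")
>       }
>     }
>
> You would then pass GatedStringDecoder as the decoder type parameter to
> KafkaUtils.createStream.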
>
>
>
> *From:* Akhil Das [mailto:ak...@sigmoidanalytics.com]
> *Sent:* Sunday, May 17, 2015 4:29 PM
> *To:* Haopu Wang
> *Cc:* user
> *Subject:* Re: [SparkStreaming] Is it possible to delay the start of some
> DStream in the application?
>
>
>
> Why not just trigger your batch job with that event?
>
>
>
> If you really need streaming, then you can create a custom receiver and
> make the receiver sleep until the event has happened. That will, of
> course, keep your streaming pipeline running without any data to process
> until then.
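>
> A rough sketch of such a receiver (the marker-file check is a
> hypothetical stand-in for whatever signal your application uses; the
> signal must be visible on the executor where the receiver runs):
>
>     import java.io.File
>     import org.apache.spark.storage.StorageLevel
>     import org.apache.spark.streaming.receiver.Receiver
>
>     class DelayedReceiver extends Receiver[String](StorageLevel.MEMORY_ONLY) {
>       override def onStart(): Unit = {
>         new Thread("delayed-receiver") {
>           override def run(): Unit = {
>             // Sleep until the event has happened (or the receiver stops).
>             while (!isStopped() && !eventHasHappened()) Thread.sleep(1000)
>             // ... then start fetching data and calling store(...) as usual.
>           }
>         }.start()
>       }
>
>       override def onStop(): Unit = { }
>
>       // Hypothetical signal: here, the existence of a marker file.
>       private def eventHasHappened(): Boolean =
>         new File("/tmp/reference-data-ready").exists()
>     }
>
> You would then hook it up with ssc.receiverStream(new DelayedReceiver).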
>
>
> Thanks
>
> Best Regards
>
>
>
> On Fri, May 15, 2015 at 4:39 AM, Haopu Wang <hw...@qilinsoft.com> wrote:
>
> In my application, I want to start a DStream computation only after a
> special event has happened (for example, I want to start the receiver
> only after the reference data has been properly initialized).
>
> My question is: it looks like the DStream will be started right after
> the StreamingContext has been started. Is it possible to delay the start
> of a specific DStream?
>
> Thank you very much!
>
