You are creating a new HiveContext per microbatch? Is that a good idea?
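The usual alternative is to create the context once and reuse it across batches. A minimal sketch of that pattern, assuming Spark 1.x (the `HiveContextSingleton` object name here is just illustrative):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

// Lazily instantiated singleton, so every microbatch reuses the same context
// instead of constructing a new one in foreachRDD.
object HiveContextSingleton {
  @transient private var instance: HiveContext = _

  def getInstance(sparkContext: SparkContext): HiveContext = {
    if (instance == null) {
      instance = new HiveContext(sparkContext)
    }
    instance
  }
}

// Usage inside the stream:
// stream.foreachRDD { rdd =>
//   val hiveContext = HiveContextSingleton.getInstance(rdd.sparkContext)
//   // ... run queries with hiveContext ...
// }
```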

On Tue, Nov 22, 2016 at 8:51 AM, Dirceu Semighini Filho <
dirceu.semigh...@gmail.com> wrote:

> Has anybody seen this behavior (see the attached picture) in Spark
> Streaming?
> It started to happen here after I changed the HiveContext creation to
> stream.foreachRDD { rdd =>
>   val hiveContext = new HiveContext(rdd.sparkContext)
> }
>
> Is this expected?
>
> Kind Regards,
> Dirceu
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
