I doubt that will make it, as we are pretty slammed with other things and
the author still needs to address the review comments and the merge conflict.
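In the meantime, one workaround that doesn't require the STDDEV patch is to express population standard deviation using aggregates the plain SQLContext parser already accepts, i.e. SQRT(AVG(col1 * col1) - AVG(col1) * AVG(col1)). A sketch of the arithmetic behind that rewrite in plain Scala, with made-up column values:

```scala
// Population standard deviation via the identity
//   stddev_pop(x) = sqrt(avg(x * x) - avg(x)^2)
// which maps onto SQL as SQRT(AVG(x * x) - AVG(x) * AVG(x)).
val xs = Seq(2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0) // hypothetical column values

val mean = xs.sum / xs.size                          // AVG(x)
val meanOfSquares = xs.map(x => x * x).sum / xs.size // AVG(x * x)
val stddevPop = math.sqrt(meanOfSquares - mean * mean)

println(stddevPop) // 2.0 for this sample
```

Note this is the population form; if I remember correctly, Hive's STDDEV is an alias for STDDEV_POP (STDDEV_SAMP is the sample version), so the rewrite should match what a HiveContext would give you.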

I'll add that, in general, I recommend users use HiveContext even if
they aren't using Hive at all.  It's a strict superset of the functionality
provided by SQLContext, and the parser is much more powerful.
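Concretely, the switch in an application is usually just the context construction; a minimal sketch, assuming a Spark 1.x app with the spark-hive artifact on the classpath and an existing SparkContext `sc`:

```scala
import org.apache.spark.sql.hive.HiveContext

// HiveContext accepts everything SQLContext does, plus HiveQL
// constructs and Hive's built-in functions (including STDDEV).
val sqlContext = new HiveContext(sc)
sqlContext.sql("SELECT STDDEV(col1) FROM table")
```

No Hive installation is required for this; without one, it just keeps its metastore locally.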

On Mon, May 11, 2015 at 4:00 PM, Oleg Shirokikh <o...@solver.com> wrote:

>  Michael – Thanks for the response – that’s right, I hadn’t noticed that
> Spark Shell instantiates sqlContext as a HiveContext rather than an actual
> SQLContext… I’ve seen the PR to add STDDEV to DataFrames. Can I expect this
> to be added to Spark SQL in Spark 1.4, or is it still uncertain? It would be
> really helpful to know, in order to understand whether I have to change
> existing code to use HiveContext instead of SQLContext (which would be
> undesirable)… Thanks!
>
>
>
> *From:* Michael Armbrust [mailto:mich...@databricks.com]
> *Sent:* Saturday, May 09, 2015 11:32 AM
> *To:* Oleg Shirokikh
> *Cc:* user
> *Subject:* Re: Spark SQL: STDDEV working in Spark Shell but not in a
> standalone app
>
>
>
> Are you perhaps using a HiveContext in the shell but a SQLContext in your
> app?  I don't think we natively implement stddev until 1.4.0.
>
>
>
> On Fri, May 8, 2015 at 4:44 PM, barmaley <o...@solver.com> wrote:
>
> Given a registered table from data frame, I'm able to execute queries like
> sqlContext.sql("SELECT STDDEV(col1) FROM table") from Spark Shell just
> fine.
> However, when I run exactly the same code in a standalone app on a cluster,
> it throws an exception: "java.util.NoSuchElementException: key not found:
> STDDEV"...
>
> Is STDDEV among the default functions in Spark SQL? I'd appreciate it if
> you could comment on what's going on with the above.
>
> Thanks
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-STDDEV-working-in-Spark-Shell-but-not-in-a-standalone-app-tp22825.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>
>
>