Hello Riccardo,
I was able to make it run. The problem is that HiveContext no longer
exists in Spark 2.0.2, as far as I can see, but SparkSession has an
enableHiveSupport method that adds the Hive functionality. To enable
it, the spark-hive_2.11 dependency is needed.
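For reference, a minimal sketch of what that looks like (the app name, master, and build line are illustrative, assuming Spark 2.0.x with spark-hive_2.11 on the classpath):

```scala
// build.sbt (illustrative):
//   libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.0.2"
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("hive-enabled-session")
  .master("local[*]")      // local run, just for illustration
  .enableHiveSupport()     // takes the place of the old HiveContext
  .getOrCreate()

import spark.implicits._   // brings toDF and friends into scope
```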
In the Spark API you can use the function without Hive; you can try:
scala> Seq(1.0, 8.0).toDF("a").selectExpr("percentile_approx(a, 0.5)").show
+--------------------------------------------+
|percentile_approx(a, CAST(0.5 AS DOUBLE), 1)|
+--------------------------------------------+
|                                         1.0|
+--------------------------------------------+
Hi Andres,
I can't find the reference; last time I searched for it I found that
'percentile_approx' is only available via the Hive context. You should
register a temp table and use it from there.
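The temp-table route might look like this sketch (the view name "samples" and column "a" are illustrative, assuming a Hive-enabled SparkSession named spark):

```scala
import spark.implicits._

// Register the DataFrame as a temporary view
val df = Seq(1.0, 8.0).toDF("a")
df.createOrReplaceTempView("samples")

// Call percentile_approx through SQL against the temp view
spark.sql("SELECT percentile_approx(a, 0.5) AS median FROM samples").show()
```

Note that in Spark 2.0 createOrReplaceTempView replaces the older registerTempTable, which was deprecated.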
Best,
On Tue, Jun 13, 2017 at 8:52 PM, Andrés Ivaldi wrote:
> Hello, I'm trying