Please unsubscribe me.
Thanks,
Alfredo
On Thu, Mar 26, 2020, 8:35 PM Andrew Melo wrote:
> Hello all,
>
> Is there a way to register classes within a DataSourceV2 implementation
> with the Kryo serializer?
>
> I've attempted the following in both the constructor and the static block
> of my top-level class:
>
> SparkContext context =
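[The snippet above is cut off in the archive. For what it's worth, Kryo class registration is usually configured before the SparkContext is created rather than from inside a data source; the sketch below uses standard Spark configuration keys, with `com.example.MyClass` as a placeholder class name, not one from the thread.]

```properties
# spark-defaults.conf (or pass each key via --conf on spark-submit)
spark.serializer                 org.apache.spark.serializer.KryoSerializer
# comma-separated list of classes to register with Kryo
spark.kryo.classesToRegister     com.example.MyClass
# optional: fail fast if an unregistered class is serialized
spark.kryo.registrationRequired  true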
> ...release of Spark 3 with support for Hadoop 3.2 that you
> can try now:
>
> https://archive.apache.org/dist/spark/spark-3.0.0-preview/spark-3.0.0-preview-bin-hadoop3.2.tgz
>
> Enjoy!
>
>
>
> On Tue, Nov 19, 2019 at 3:44 PM Alfredo Marquez <alfredo.g.marq...@gmail.com> wrote:
Does anyone else have insight into this question?
Thanks,
Alfredo
On Mon, Nov 18, 2019, 3:00 PM Alfredo Marquez wrote:
> Hello Nicolas,
>
> Well, the issue is that with Hive 3, Spark gets its own metastore,
> separate from the Hive 3 metastore. So how do you reconcile this?
I would also like to know the answer to this question.
Thanks,
Alfredo
On Tue, Nov 19, 2019, 8:24 AM bsikander wrote:
> Hi,
> Are Spark 2.4.4 and Hadoop 3.2.0 compatible?
> I tried to search the mailing list but couldn't find anything relevant.
> --
> Sent from: http://apache-spark-user
> ...transactional tables.
> So I would say you should be able to read any Hive 3 regular table with
> any of Spark, PySpark, or SparkR.
>
>
> [1]
> https://parisni.frama.io/posts/playing-with-hive-spark-metastore-versions/
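[For anyone hitting the same issue: pointing Spark at an external Hive metastore is done through configuration. The sketch below is illustrative only — the version number and the `maven` jar-resolution mode are assumptions, not values from this thread.]

```properties
# spark-defaults.conf (values are placeholders; match your metastore)
spark.sql.catalogImplementation   hive
# version of the Hive metastore Spark should talk to
spark.sql.hive.metastore.version  3.1.2
# resolve the matching metastore client jars from Maven at runtime
spark.sql.hive.metastore.jars     maven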
>
> On Mon, Nov 18, 2019 at 11:23:50 AM -0600, Alfredo Marquez wrote:
Hello,
Our company is moving to Hive 3, and they are saying that there is no
SparkR implementation in Spark 2.3.x+ that will connect to Hive 3. Is
this true?
If it is true, will this be addressed in the Spark 3 release?
I don't use Python, so losing SparkR to get work done on Hadoop is a huge