Ross,
The problem you're hitting is that not many logical plans work with the v2
source API yet. Here, the SQL statement produces an InsertIntoTable logical
plan, which can't be converted to a physical plan because there is no rule
yet to translate it into the corresponding v2 logical plan.
Data source v2 catalog support (tables/views) is still in progress. There are
several threads on the dev list discussing it; please join the discussion
if you are interested. Thanks for trying it out!
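For what it's worth, a sketch of the two paths (the source name and table
names below are invented placeholders, not anything from the connector):
in 2.3.x the DataFrameWriter path is wired up for v2 sources, while SQL
INSERT builds an InsertIntoTable plan that nothing currently rewrites for v2.

```scala
// Sketch only, assuming Spark 2.3.x and a hypothetical v2 source
// registered as "my-v2-source".

// Path 1: the DataFrameWriter API is planned through the v2 write path,
// so it reaches a v2 WriteSupport implementation.
df.write
  .format("my-v2-source")
  .option("collection", "characters") // placeholder option
  .save()

// Path 2: SQL INSERT is parsed into an InsertIntoTable logical plan.
// As of 2.3.x there is no rule converting InsertIntoTable into a v2
// write plan, so planning fails for a v2-only source.
spark.sql("INSERT INTO characters_table SELECT * FROM source_table")
```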
Hi,
I hope this is the correct mailing list. I've been adding v2 support to the
MongoDB Spark connector using Spark 2.3.1. I've noticed that one of my tests
passes when using the original DefaultSource but errors with my v2
implementation:
The code I'm running is:
val df = spark.loadDS[Character]()
df.c
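In case it helps anyone following the thread, the 2.3.x read-side surface
being implemented here looks roughly like this. This is a minimal sketch
with invented class names (the real connector's classes differ):

```scala
import java.util

import org.apache.spark.sql.Row
import org.apache.spark.sql.sources.v2.{DataSourceOptions, DataSourceV2, ReadSupport}
import org.apache.spark.sql.sources.v2.reader.{DataReader, DataReaderFactory, DataSourceReader}
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Minimal Spark 2.3.x DataSourceV2 read path; all names are placeholders.
class SketchSource extends DataSourceV2 with ReadSupport {
  override def createReader(options: DataSourceOptions): DataSourceReader =
    new SketchReader
}

class SketchReader extends DataSourceReader {
  // Fixed schema for the sketch; a real connector would infer or configure it.
  override def readSchema(): StructType =
    StructType(Seq(StructField("name", StringType)))

  // One factory per partition; here a single partition with one row.
  override def createDataReaderFactories(): util.List[DataReaderFactory[Row]] =
    util.Arrays.asList(new SketchFactory)
}

class SketchFactory extends DataReaderFactory[Row] {
  override def createDataReader(): DataReader[Row] = new SketchRowReader
}

class SketchRowReader extends DataReader[Row] {
  private val rows = Iterator(Row("example"))
  override def next(): Boolean = rows.hasNext
  override def get(): Row = rows.next()
  override def close(): Unit = ()
}
```

Note that all of this covers only reads; the write side goes through
WriteSupport, and as discussed above the SQL INSERT path doesn't reach it
yet in 2.3.x.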