I think this is something we are going to change, to completely decouple
Hive support from the catalog.
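
For reference, the in-memory catalog mode described below can be selected when launching a shell (this is just the setting quoted in the original mail, shown as a full command; spark-shell is assumed to be on the PATH):

    spark-shell --conf spark.sql.catalogImplementation=in-memory
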


On Sun, Jan 22, 2017 at 4:51 AM Shuai Lin <linshuai2...@gmail.com> wrote:

> Hi all,
>
> Currently when the in-memory catalog is used, e.g. through `--conf
> spark.sql.catalogImplementation=in-memory`, we can create a persistent
> table, but inserting into this table would fail with error message "Hive
> support is required to insert into the following tables..".
>
>     sql("create table t1 (id int, name string, dept string)") // OK
>     sql("insert into t1 values (1, 'name1', 'dept1')")  // ERROR
>
>
> This doesn't make sense to me, because the table would always be empty
> if we can't insert into it, and thus would be of no use. But I wonder if
> there are other good reasons for the current logic. If not, I would
> propose to raise an error when creating the table in the first place.
>
> Thanks!
>
> Regards,
> Shuai Lin (@lins05)
>
>
>
