[ https://issues.apache.org/jira/browse/HUDI-2208?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17395549#comment-17395549 ]
ASF GitHub Bot commented on HUDI-2208:
--------------------------------------
nsivabalan commented on a change in pull request #3328:
URL: https://github.com/apache/hudi/pull/3328#discussion_r684803078
##########
File path:
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/spark/sql/hudi/TestInsertTable.scala
##########
@@ -303,5 +304,184 @@ class TestInsertTable extends TestHoodieSqlBase {
"assertion failed: Required select columns count: 4, Current select
columns(including static partition column)" +
" count: 3,columns: (1,a1,10)"
)
+ spark.sql("set hoodie.sql.bulk.insert.enable = true")
+ spark.sql("set hoodie.sql.insert.mode = strict")
+
+ val tableName2 = generateTableName
Review comment:
Can we also enhance the test with both types of partitions (single-level and
multi-level)?
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
> [SQL] Support Bulk Insert For Spark Sql
> ---------------------------------------
>
> Key: HUDI-2208
> URL: https://issues.apache.org/jira/browse/HUDI-2208
> Project: Apache Hudi
> Issue Type: Sub-task
> Reporter: pengzhiwei
> Assignee: pengzhiwei
> Priority: Blocker
> Labels: pull-request-available, release-blocker
> Fix For: 0.9.0
>
>
> Support the bulk insert for spark sql
--