Dongjoon Hyun created SPARK-19881:
-------------------------------------

             Summary: Support Dynamic Partition Inserts params with SET command
                 Key: SPARK-19881
                 URL: https://issues.apache.org/jira/browse/SPARK-19881
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.1.0, 2.0.0
            Reporter: Dongjoon Hyun
            Priority: Minor


Currently, the `SET` command does not pass its values on to Hive. In most cases 
Spark handles the settings itself, so this goes unnoticed. However, for dynamic 
partition inserts, users run into the following situation.

{code}
scala> spark.range(1001).selectExpr("id as key", "id as value").registerTempTable("t1001")

scala> sql("create table p (value int) partitioned by (key int)").show

scala> sql("insert into table p partition(key) select key, value from t1001")
org.apache.spark.SparkException: Dynamic partition strict mode requires at 
least one static partition column. To turn this off set 
hive.exec.dynamic.partition.mode=nonstrict

scala> sql("set hive.exec.dynamic.partition.mode=nonstrict")

scala> sql("insert into table p partition(key) select key, value from t1001")
org.apache.hadoop.hive.ql.metadata.HiveException: Number of dynamic partitions 
created is 1001, which is more than 1000. To solve this try to set 
hive.exec.max.dynamic.partitions to at least 1001.

scala> sql("set hive.exec.max.dynamic.partitions=1001")

scala> sql("insert into table p partition(key) select key, value from t1001")
org.apache.hadoop.hive.ql.metadata.HiveException: Number of dynamic partitions 
created is 1001, which is more than 1000. To solve this try to set 
hive.exec.max.dynamic.partitions to at least 1001.

<== The same error message repeats.
{code}

The root cause is that the `hive` parameters are passed to `HiveClient` only 
when it is created, so values set later via `SET` never reach Hive. The 
workaround is to pass them with `--hiveconf` at launch time.
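For example, the workaround looks like the following. (This sketch assumes the repro is run through {{bin/spark-sql}}, which accepts {{--hiveconf}} directly; the table names match the repro above.)

{code}
$ bin/spark-sql \
    --hiveconf hive.exec.dynamic.partition.mode=nonstrict \
    --hiveconf hive.exec.max.dynamic.partitions=1001

spark-sql> insert into table p partition(key) select key, value from t1001;
{code}

Because these values are present when the `HiveClient` is constructed, the insert succeeds instead of repeating the strict-mode / max-partitions errors.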

We had better handle this case properly instead of emitting misleading error 
messages.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
