[jira] [Assigned] (SPARK-27555) cannot create table by using the hive default fileformat in both hive-site.xml and spark-defaults.conf

2019-05-08 Thread Hyukjin Kwon (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-27555?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon reassigned SPARK-27555:


Assignee: Sandeep Katta  (was: Hyukjin Kwon)

> cannot create table by using the hive default fileformat in both 
> hive-site.xml and spark-defaults.conf
> --
>
> Key: SPARK-27555
> URL: https://issues.apache.org/jira/browse/SPARK-27555
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 2.3.2
>Reporter: Hui WANG
>Assignee: Sandeep Katta
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: Try.pdf
>
>
> *See the attachment Try.pdf for details.*
> I have already seen https://issues.apache.org/jira/browse/SPARK-17620
> and https://issues.apache.org/jira/browse/SPARK-18397,
> and I checked the Spark source code for the change where setting
> "spark.sql.hive.convertCTAS=true" makes Spark use
> "spark.sql.sources.default" (whose default is parquet) as the storage format
> in the "create table as select" scenario.
> But my case is a plain create table without a select. When I set
> hive.default.fileformat=parquet in hive-site.xml, or set
> spark.hadoop.hive.default.fileformat=parquet in spark-defaults.conf, and then
> create a table, the Hive table still uses the textfile format.
>
> It seems HiveSerDe reads the value of the hive.default.fileformat parameter
> from SQLConf.
> The parameter values in SQLConf are copied from the SparkContext's SparkConf
> at SparkSession initialization, while the configuration parameters in
> hive-site.xml are loaded into the SparkContext's hadoopConfiguration by
> SharedState, and all configs with the "spark.hadoop" prefix are likewise
> applied to the Hadoop configuration, so neither setting ends up in SQLConf
> and the configuration does not take effect.
>
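A minimal reproduction sketch of the plain CREATE TABLE case described above, assuming Spark 2.3.x with Hive support enabled; the application and table names are arbitrary, and the same setting could equally come from hive-site.xml (hive.default.fileformat=parquet) instead of the spark.hadoop.* prefix used here:

{code:scala}
import org.apache.spark.sql.SparkSession

// Plain CREATE TABLE (no AS SELECT), so spark.sql.hive.convertCTAS and
// spark.sql.sources.default are not involved.
val spark = SparkSession.builder()
  .appName("hive-default-fileformat-repro")                   // arbitrary name
  .config("spark.hadoop.hive.default.fileformat", "parquet")  // setting under test
  .enableHiveSupport()
  .getOrCreate()

spark.sql("CREATE TABLE repro_tbl (id INT, name STRING)")

// If the setting were honored, the InputFormat/OutputFormat rows would show the
// Parquet classes; on 2.3.2 they show TextInputFormat /
// HiveIgnoreKeyTextOutputFormat instead.
spark.sql("DESCRIBE FORMATTED repro_tbl").show(100, truncate = false)

spark.stop()
{code}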

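Because HiveSerDe resolves hive.default.fileformat through SQLConf rather than through the Hadoop configuration, a possible workaround (a sketch under that assumption, not the upstream fix) is to put the bare key directly into the session's SQL conf at runtime:

{code:scala}
// Workaround sketch: set the key where HiveSerDe looks it up (the SQL conf),
// instead of relying on the spark.hadoop.* prefix.
spark.conf.set("hive.default.fileformat", "parquet")
// equivalently, in SQL: spark.sql("SET hive.default.fileformat=parquet")

spark.sql("CREATE TABLE repro_tbl_pq (id INT, name STRING)")
spark.sql("DESCRIBE FORMATTED repro_tbl_pq").show(100, truncate = false)
{code}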


--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-27555) cannot create table by using the hive default fileformat in both hive-site.xml and spark-defaults.conf

2019-05-03 Thread Hyukjin Kwon (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-27555?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon reassigned SPARK-27555:


Assignee: Hyukjin Kwon

> cannot create table by using the hive default fileformat in both 
> hive-site.xml and spark-defaults.conf
> --
>
> Key: SPARK-27555
> URL: https://issues.apache.org/jira/browse/SPARK-27555
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 2.3.2
>Reporter: Hui WANG
>Assignee: Hyukjin Kwon
>Priority: Major
> Attachments: Try.pdf
>
>
> *See the attachment Try.pdf for details.*
> I have already seen https://issues.apache.org/jira/browse/SPARK-17620
> and https://issues.apache.org/jira/browse/SPARK-18397,
> and I checked the Spark source code for the change where setting
> "spark.sql.hive.convertCTAS=true" makes Spark use
> "spark.sql.sources.default" (whose default is parquet) as the storage format
> in the "create table as select" scenario.
> But my case is a plain create table without a select. When I set
> hive.default.fileformat=parquet in hive-site.xml, or set
> spark.hadoop.hive.default.fileformat=parquet in spark-defaults.conf, and then
> create a table, the Hive table still uses the textfile format.
>
> It seems HiveSerDe reads the value of the hive.default.fileformat parameter
> from SQLConf.
> The parameter values in SQLConf are copied from the SparkContext's SparkConf
> at SparkSession initialization, while the configuration parameters in
> hive-site.xml are loaded into the SparkContext's hadoopConfiguration by
> SharedState, and all configs with the "spark.hadoop" prefix are likewise
> applied to the Hadoop configuration, so neither setting ends up in SQLConf
> and the configuration does not take effect.
>






[jira] [Assigned] (SPARK-27555) cannot create table by using the hive default fileformat in both hive-site.xml and spark-defaults.conf

2019-04-29 Thread Apache Spark (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-27555?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-27555:


Assignee: (was: Apache Spark)

> cannot create table by using the hive default fileformat in both 
> hive-site.xml and spark-defaults.conf
> --
>
> Key: SPARK-27555
> URL: https://issues.apache.org/jira/browse/SPARK-27555
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 2.3.2
>Reporter: Hui WANG
>Priority: Major
> Attachments: Try.pdf
>
>
> *See the attachment Try.pdf for details.*
> I have already seen https://issues.apache.org/jira/browse/SPARK-17620
> and https://issues.apache.org/jira/browse/SPARK-18397,
> and I checked the Spark source code for the change where setting
> "spark.sql.hive.convertCTAS=true" makes Spark use
> "spark.sql.sources.default" (whose default is parquet) as the storage format
> in the "create table as select" scenario.
> But my case is a plain create table without a select. When I set
> hive.default.fileformat=parquet in hive-site.xml, or set
> spark.hadoop.hive.default.fileformat=parquet in spark-defaults.conf, and then
> create a table, the Hive table still uses the textfile format.
>
> It seems HiveSerDe reads the value of the hive.default.fileformat parameter
> from SQLConf.
> The parameter values in SQLConf are copied from the SparkContext's SparkConf
> at SparkSession initialization, while the configuration parameters in
> hive-site.xml are loaded into the SparkContext's hadoopConfiguration by
> SharedState, and all configs with the "spark.hadoop" prefix are likewise
> applied to the Hadoop configuration, so neither setting ends up in SQLConf
> and the configuration does not take effect.
>






[jira] [Assigned] (SPARK-27555) cannot create table by using the hive default fileformat in both hive-site.xml and spark-defaults.conf

2019-04-29 Thread Apache Spark (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-27555?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-27555:


Assignee: Apache Spark

> cannot create table by using the hive default fileformat in both 
> hive-site.xml and spark-defaults.conf
> --
>
> Key: SPARK-27555
> URL: https://issues.apache.org/jira/browse/SPARK-27555
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 2.3.2
>Reporter: Hui WANG
>Assignee: Apache Spark
>Priority: Major
> Attachments: Try.pdf
>
>
> *See the attachment Try.pdf for details.*
> I have already seen https://issues.apache.org/jira/browse/SPARK-17620
> and https://issues.apache.org/jira/browse/SPARK-18397,
> and I checked the Spark source code for the change where setting
> "spark.sql.hive.convertCTAS=true" makes Spark use
> "spark.sql.sources.default" (whose default is parquet) as the storage format
> in the "create table as select" scenario.
> But my case is a plain create table without a select. When I set
> hive.default.fileformat=parquet in hive-site.xml, or set
> spark.hadoop.hive.default.fileformat=parquet in spark-defaults.conf, and then
> create a table, the Hive table still uses the textfile format.
>
> It seems HiveSerDe reads the value of the hive.default.fileformat parameter
> from SQLConf.
> The parameter values in SQLConf are copied from the SparkContext's SparkConf
> at SparkSession initialization, while the configuration parameters in
> hive-site.xml are loaded into the SparkContext's hadoopConfiguration by
> SharedState, and all configs with the "spark.hadoop" prefix are likewise
> applied to the Hadoop configuration, so neither setting ends up in SQLConf
> and the configuration does not take effect.
>


