[jira] [Updated] (SPARK-26499) JdbcUtils.makeGetter does not handle ByteType

2018-12-29 Thread Thomas D'Silva (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26499?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thomas D'Silva updated SPARK-26499:
---
Description: 
I am trying to use the DataSource V2 API to read from a JDBC source. While 
using {{JdbcUtils.resultSetToSparkInternalRows}} to create an internal row from 
a ResultSet that has a column of type TINYINT, I ran into the following exception:
{code:java}
java.lang.IllegalArgumentException: Unsupported type tinyint
at 
org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$makeGetter(JdbcUtils.scala:502)
at 
org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$makeGetters$1.apply(JdbcUtils.scala:379)
at 
org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$makeGetters$1.apply(JdbcUtils.scala:379)
at 
scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at 
scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at 
scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
at 
org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$makeGetters(JdbcUtils.scala:379)
at 
org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anon$1.(JdbcUtils.scala:340)
{code}
This happens because ByteType is not handled in {{JdbcUtils.makeGetter}}.
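
A fix would presumably add the missing branch alongside the existing getters in {{JdbcUtils.makeGetter}}. The sketch below only illustrates the shape of such a branch (the {{JDBCValueGetter}} alias mirrors the existing code; this is not the actual patch):
{code:scala}
import java.sql.ResultSet
import org.apache.spark.sql.catalyst.InternalRow

// Mirrors the shape of JdbcUtils' private value-getter alias.
type JDBCValueGetter = (ResultSet, InternalRow, Int) => Unit

// The missing ByteType branch: read a JDBC TINYINT column as a Catalyst byte.
val byteGetter: JDBCValueGetter =
  (rs: ResultSet, row: InternalRow, pos: Int) =>
    row.setByte(pos, rs.getByte(pos + 1)) // JDBC columns are 1-based
{code}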


  was:
I am trying to use the DataSource V2 API to read from a JDBC source. While 
using {{JdbcUtils.resultSetToSparkInternalRows}} to create an internal row from 
a ResultSet that has a column of type TINYINT, I ran into the following exception:
{code:java}
java.lang.IllegalArgumentException: Unsupported type tinyint
at 
org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$makeGetter(JdbcUtils.scala:502)
at 
org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$makeGetters$1.apply(JdbcUtils.scala:379)
at 
org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$makeGetters$1.apply(JdbcUtils.scala:379)
at 
scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at 
scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at 
scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
at 
org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$makeGetters(JdbcUtils.scala:379)
at 
org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anon$1.(JdbcUtils.scala:340)
{code}
This happens because ByteType is not handled in {{JdbcUtils.makeGetter}}.

Also, since {{JdbcUtils.getCommonJDBCType}} maps ByteType to TinyInt, I think 
{{getCatalystType}} should map TINYINT to ByteType (it currently maps TINYINT 
to IntegerType).
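
As a rough illustration of that suggested mapping change (the function below is a stand-in; the real {{getCatalystType}} takes more parameters and covers many more JDBC types):
{code:scala}
import java.sql.Types
import org.apache.spark.sql.types.{ByteType, DataType, IntegerType}

// Illustrative sketch of the proposed mapping change only.
def catalystTypeFor(sqlType: Int): DataType = sqlType match {
  case Types.TINYINT => ByteType    // proposed: round-trips with getCommonJDBCType(ByteType)
  case _             => IntegerType // placeholder for the other mappings
}
{code}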

Summary: JdbcUtils.makeGetter does not handle ByteType  (was: 
JdbcUtils.getCatalystType maps TINYINT to IntegerType instead of ByteType)

> JdbcUtils.makeGetter does not handle ByteType
> -
>
> Key: SPARK-26499
> URL: https://issues.apache.org/jira/browse/SPARK-26499
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 2.4.0
>Reporter: Thomas D'Silva
>Priority: Major
>
> I am trying to use the DataSource V2 API to read from a JDBC source. While 
> using {{JdbcUtils.resultSetToSparkInternalRows}} to create an internal row 
> from a ResultSet that has a column of type TINYINT, I ran into the following 
> exception:
> {code:java}
> java.lang.IllegalArgumentException: Unsupported type tinyint
>   at 
> org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$makeGetter(JdbcUtils.scala:502)
>   at 
> org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$makeGetters$1.apply(JdbcUtils.scala:379)
>   

[jira] [Commented] (SPARK-26433) Tail method for spark DataFrame

2018-12-29 Thread Jan Gorecki (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26433?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16730887#comment-16730887
 ] 

Jan Gorecki commented on SPARK-26433:
-

[~hyukjin.kwon] Thank you for your comment, but I am not sure I understood 
correctly. Do you mean I should first collect the data to the client and then 
extract the last few rows of the dataframe? If so, it doesn't seem to be a 
feasible solution, as the data in Spark is likely not to fit on the client 
machine. `Tail` is exactly the operation one would want to perform BEFORE 
collecting data to the client. Could you confirm?
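
For what it's worth, a user-side workaround along these lines seems possible without collecting the whole DataFrame; the sketch below (Scala API, hypothetical helper name) ships only the last n rows to the driver, at the cost of an extra job to count the rows:
{code:scala}
import org.apache.spark.sql.{DataFrame, Row}

// Hypothetical helper, not a DataFrame method: keep only the last n rows
// in the DataFrame's current order, then collect just those to the driver.
def tail(df: DataFrame, n: Int): Array[Row] = {
  val total = df.count() // extra job to learn the row count
  df.rdd
    .zipWithIndex()                              // preserves partition order
    .filter { case (_, idx) => idx >= total - n }
    .map { case (row, _) => row }
    .collect()
}
{code}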

> Tail method for spark DataFrame
> ---
>
> Key: SPARK-26433
> URL: https://issues.apache.org/jira/browse/SPARK-26433
> Project: Spark
>  Issue Type: New Feature
>  Components: PySpark
>Affects Versions: 2.4.0
>Reporter: Jan Gorecki
>Priority: Major
>
> There is a head method for Spark dataframes which works fine, but there doesn't 
> seem to be a tail method.
> ```
> >>> ans
> DataFrame[v1: bigint]
> >>> ans.head(3)
> [Row(v1=299443), Row(v1=299493), Row(v1=300751)]
> >>> ans.tail(3)
> Traceback (most recent call last):
>   File "", line 1, in 
>   File "/home/jan/git/db-benchmark/spark/py-spark/lib/python3.6/site-packages/pyspark/sql/dataframe.py", line 1300, in __getattr__
>     "'%s' object has no attribute '%s'" % (self.__class__.__name__, name))
> AttributeError: 'DataFrame' object has no attribute 'tail'
> ```
> I would like to file a feature request for a tail method for Spark dataframes.






[jira] [Assigned] (SPARK-26477) Use ConfigEntry for hardcoded configs for unsafe category.

2018-12-29 Thread Apache Spark (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26477?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-26477:


Assignee: Apache Spark

> Use ConfigEntry for hardcoded configs for unsafe category.
> --
>
> Key: SPARK-26477
> URL: https://issues.apache.org/jira/browse/SPARK-26477
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Assignee: Apache Spark
>Priority: Major
>







[jira] [Assigned] (SPARK-26477) Use ConfigEntry for hardcoded configs for unsafe category.

2018-12-29 Thread Apache Spark (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26477?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-26477:


Assignee: (was: Apache Spark)

> Use ConfigEntry for hardcoded configs for unsafe category.
> --
>
> Key: SPARK-26477
> URL: https://issues.apache.org/jira/browse/SPARK-26477
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Updated] (SPARK-26363) Avoid duplicated KV store lookups for task table

2018-12-29 Thread Sean Owen (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26363?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen updated SPARK-26363:
--
Priority: Minor  (was: Major)

>  Avoid duplicated KV store lookups for task table
> -
>
> Key: SPARK-26363
> URL: https://issues.apache.org/jira/browse/SPARK-26363
> Project: Spark
>  Issue Type: Improvement
>  Components: Web UI
>Affects Versions: 3.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Minor
> Fix For: 3.0.0
>
>
> In the method `taskList` (since https://github.com/apache/spark/pull/21688), 
> the executor log value is queried in the KV store for every task (in the 
> method `constructTaskData`).
> We can use a hashmap to reduce duplicated KV store lookups in the method.
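
For illustration, the caching could look something like the sketch below; {{fetchLogsFromStore}} is a stand-in for the real per-task KV-store lookup, not an actual Spark API:
{code:scala}
import scala.collection.mutable

// Memoize the per-executor log lookup so the KV store is hit once per
// executor instead of once per task row in the task table.
val executorLogCache = mutable.HashMap.empty[String, Map[String, String]]

def executorLogs(
    execId: String,
    fetchLogsFromStore: String => Map[String, String]): Map[String, String] =
  executorLogCache.getOrElseUpdate(execId, fetchLogsFromStore(execId))
{code}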






[jira] [Commented] (SPARK-26463) Use ConfigEntry for hardcoded configs for scheduler categories.

2018-12-29 Thread Kazuaki Ishizaki (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26463?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16730882#comment-16730882
 ] 

Kazuaki Ishizaki commented on SPARK-26463:
--

I will work on this.

> Use ConfigEntry for hardcoded configs for scheduler categories.
> ---
>
> Key: SPARK-26463
> URL: https://issues.apache.org/jira/browse/SPARK-26463
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>
> Make the following hardcoded configs use {{ConfigEntry}}.
> {code}
> spark.dynamicAllocation
> spark.scheduler
> spark.rpc
> spark.task
> spark.speculation
> spark.cleaner
> {code}
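
For context, converting one of these keys is a mechanical change; the entry below is only an example of the pattern (the config name, default, doc string, and object name are illustrative, not the exact entries the subtasks add):
{code:scala}
package org.apache.spark.internal.config // illustrative location

import java.util.concurrent.TimeUnit

private[spark] object Scheduler {
  // Replaces ad-hoc reads like conf.getTimeAsMs("spark.scheduler.revive.interval", "1s")
  // with a typed, documented entry read via conf.get(SCHEDULER_REVIVE_INTERVAL).
  val SCHEDULER_REVIVE_INTERVAL =
    ConfigBuilder("spark.scheduler.revive.interval")
      .doc("Interval between scheduler revive attempts.")
      .timeConf(TimeUnit.MILLISECONDS)
      .createWithDefaultString("1s")
}
{code}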






[jira] [Commented] (SPARK-26442) Use ConfigEntry for hardcoded configs.

2018-12-29 Thread Kazuaki Ishizaki (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26442?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16730883#comment-16730883
 ] 

Kazuaki Ishizaki commented on SPARK-26442:
--

Thank you for updating them.

> Use ConfigEntry for hardcoded configs.
> --
>
> Key: SPARK-26442
> URL: https://issues.apache.org/jira/browse/SPARK-26442
> Project: Spark
>  Issue Type: Umbrella
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>
> This umbrella JIRA is to make hardcoded configs use {{ConfigEntry}}.






[jira] [Comment Edited] (SPARK-26442) Use ConfigEntry for hardcoded configs.

2018-12-29 Thread Kazuaki Ishizaki (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26442?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16730883#comment-16730883
 ] 

Kazuaki Ishizaki edited comment on SPARK-26442 at 12/30/18 4:18 AM:


Thank you for updating them.


was (Author: kiszk):
Thank you for updaing them.

> Use ConfigEntry for hardcoded configs.
> --
>
> Key: SPARK-26442
> URL: https://issues.apache.org/jira/browse/SPARK-26442
> Project: Spark
>  Issue Type: Umbrella
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>
> This umbrella JIRA is to make hardcoded configs use {{ConfigEntry}}.






[jira] [Assigned] (SPARK-26503) Get rid of spark.sql.legacy.timeParser.enabled

2018-12-29 Thread Apache Spark (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26503?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-26503:


Assignee: (was: Apache Spark)

> Get rid of spark.sql.legacy.timeParser.enabled
> --
>
> Key: SPARK-26503
> URL: https://issues.apache.org/jira/browse/SPARK-26503
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.0.0
>Reporter: Maxim Gekk
>Priority: Minor
>
> The flag is used in the CSV/JSON datasources as well as in time-related 
> functions to control parsing/formatting of dates/timestamps. By default, 
> DateTimeFormat is used for this purpose, but the flag allows switching back 
> to SimpleDateFormat with some fallback behavior. In the major release 3.0, 
> the flag should be removed, and the new formatters 
> DateFormatter/TimestampFormatter should be used by default.
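
To make the switch concrete, here is a rough sketch of what the flag chooses between, using plain JDK classes; this is not Spark's actual DateFormatter/TimestampFormatter code:
{code:scala}
import java.text.SimpleDateFormat
import java.time.format.DateTimeFormatter

// Conceptual sketch only: the legacy flag picks between the old and new
// JDK parsing stacks; removing it means keeping only the second branch.
def timestampParser(pattern: String, legacyParser: Boolean): String => Long =
  if (legacyParser) {
    val sdf = new SimpleDateFormat(pattern)          // old path, lenient fallback
    s => sdf.parse(s).getTime * 1000L                // microseconds since epoch
  } else {
    val dtf = DateTimeFormatter.ofPattern(pattern)   // new java.time path
    s => java.time.LocalDateTime.parse(s, dtf)
      .toInstant(java.time.ZoneOffset.UTC).toEpochMilli * 1000L
  }
{code}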






[jira] [Assigned] (SPARK-26503) Get rid of spark.sql.legacy.timeParser.enabled

2018-12-29 Thread Apache Spark (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26503?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-26503:


Assignee: Apache Spark

> Get rid of spark.sql.legacy.timeParser.enabled
> --
>
> Key: SPARK-26503
> URL: https://issues.apache.org/jira/browse/SPARK-26503
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.0.0
>Reporter: Maxim Gekk
>Assignee: Apache Spark
>Priority: Minor
>
> The flag is used in the CSV/JSON datasources as well as in time-related 
> functions to control parsing/formatting of dates/timestamps. By default, 
> DateTimeFormat is used for this purpose, but the flag allows switching back 
> to SimpleDateFormat with some fallback behavior. In the major release 3.0, 
> the flag should be removed, and the new formatters 
> DateFormatter/TimestampFormatter should be used by default.






[jira] [Commented] (SPARK-26502) Get rid of hiveResultString() in QueryExecution

2018-12-29 Thread Sean Owen (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26502?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16730880#comment-16730880
 ] 

Sean Owen commented on SPARK-26502:
---

It's used in SparkSQLDriver; can it really be moved to test code?

> Get rid of hiveResultString() in QueryExecution
> ---
>
> Key: SPARK-26502
> URL: https://issues.apache.org/jira/browse/SPARK-26502
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 2.4.0
>Reporter: Maxim Gekk
>Priority: Minor
>
> The method hiveResultString() of QueryExecution is used in tests and in 
> SparkSQLDriver. It should be moved from QueryExecution to a more specific class.






[jira] [Assigned] (SPARK-26363) Avoid duplicated KV store lookups for task table

2018-12-29 Thread Sean Owen (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26363?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen reassigned SPARK-26363:
-

Assignee: Gengliang Wang

>  Avoid duplicated KV store lookups for task table
> -
>
> Key: SPARK-26363
> URL: https://issues.apache.org/jira/browse/SPARK-26363
> Project: Spark
>  Issue Type: Improvement
>  Components: Web UI
>Affects Versions: 3.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
> Fix For: 3.0.0
>
>
> In the method `taskList` (since https://github.com/apache/spark/pull/21688), 
> the executor log value is queried in the KV store for every task (in the 
> method `constructTaskData`).
> We can use a hashmap to reduce duplicated KV store lookups in the method.






[jira] [Resolved] (SPARK-26363) Avoid duplicated KV store lookups for task table

2018-12-29 Thread Sean Owen (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26363?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen resolved SPARK-26363.
---
   Resolution: Fixed
Fix Version/s: 3.0.0

Issue resolved by pull request 23310
[https://github.com/apache/spark/pull/23310]

>  Avoid duplicated KV store lookups for task table
> -
>
> Key: SPARK-26363
> URL: https://issues.apache.org/jira/browse/SPARK-26363
> Project: Spark
>  Issue Type: Improvement
>  Components: Web UI
>Affects Versions: 3.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
> Fix For: 3.0.0
>
>
> In the method `taskList` (since https://github.com/apache/spark/pull/21688), 
> the executor log value is queried in the KV store for every task (in the 
> method `constructTaskData`).
> We can use a hashmap to reduce duplicated KV store lookups in the method.






[jira] [Closed] (SPARK-26487) Use ConfigEntry for hardcoded configs for admin category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26487?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26487.
-

> Use ConfigEntry for hardcoded configs for admin category.
> -
>
> Key: SPARK-26487
> URL: https://issues.apache.org/jira/browse/SPARK-26487
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26471) Use ConfigEntry for hardcoded configs for speculation category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26471?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26471.
-

> Use ConfigEntry for hardcoded configs for speculation category.
> ---
>
> Key: SPARK-26471
> URL: https://issues.apache.org/jira/browse/SPARK-26471
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26481) Use ConfigEntry for hardcoded configs for reducer category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26481?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26481.
-

> Use ConfigEntry for hardcoded configs for reducer category.
> ---
>
> Key: SPARK-26481
> URL: https://issues.apache.org/jira/browse/SPARK-26481
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26465) Use ConfigEntry for hardcoded configs for jars category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26465?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26465.
-

> Use ConfigEntry for hardcoded configs for jars category.
> 
>
> Key: SPARK-26465
> URL: https://issues.apache.org/jira/browse/SPARK-26465
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26490) Use ConfigEntry for hardcoded configs for r category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26490?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26490.
-

> Use ConfigEntry for hardcoded configs for r category.
> -
>
> Key: SPARK-26490
> URL: https://issues.apache.org/jira/browse/SPARK-26490
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26484) Use ConfigEntry for hardcoded configs for authenticate category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26484?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26484.
-

> Use ConfigEntry for hardcoded configs for authenticate category.
> 
>
> Key: SPARK-26484
> URL: https://issues.apache.org/jira/browse/SPARK-26484
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26488) Use ConfigEntry for hardcoded configs for modify.acl category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26488?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26488.
-

> Use ConfigEntry for hardcoded configs for modify.acl category.
> --
>
> Key: SPARK-26488
> URL: https://issues.apache.org/jira/browse/SPARK-26488
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26486) Use ConfigEntry for hardcoded configs for metrics category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26486?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26486.
-

> Use ConfigEntry for hardcoded configs for metrics category.
> ---
>
> Key: SPARK-26486
> URL: https://issues.apache.org/jira/browse/SPARK-26486
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26485) Use ConfigEntry for hardcoded configs for master.rest/ui categories.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26485?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26485.
-

> Use ConfigEntry for hardcoded configs for master.rest/ui categories.
> 
>
> Key: SPARK-26485
> URL: https://issues.apache.org/jira/browse/SPARK-26485
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26479) Use ConfigEntry for hardcoded configs for locality category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26479?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26479.
-

> Use ConfigEntry for hardcoded configs for locality category.
> 
>
> Key: SPARK-26479
> URL: https://issues.apache.org/jira/browse/SPARK-26479
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26474) Use ConfigEntry for hardcoded configs for worker category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26474?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26474.
-

> Use ConfigEntry for hardcoded configs for worker category.
> --
>
> Key: SPARK-26474
> URL: https://issues.apache.org/jira/browse/SPARK-26474
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26476) Use ConfigEntry for hardcoded configs for cleaner category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26476?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26476.
-

> Use ConfigEntry for hardcoded configs for cleaner category.
> ---
>
> Key: SPARK-26476
> URL: https://issues.apache.org/jira/browse/SPARK-26476
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26475) Use ConfigEntry for hardcoded configs for buffer category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26475?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26475.
-

> Use ConfigEntry for hardcoded configs for buffer category.
> --
>
> Key: SPARK-26475
> URL: https://issues.apache.org/jira/browse/SPARK-26475
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26480) Use ConfigEntry for hardcoded configs for broadcast category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26480?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26480.
-

> Use ConfigEntry for hardcoded configs for broadcast category.
> -
>
> Key: SPARK-26480
> URL: https://issues.apache.org/jira/browse/SPARK-26480
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26478) Use ConfigEntry for hardcoded configs for rdd category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26478?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26478.
-

> Use ConfigEntry for hardcoded configs for rdd category.
> ---
>
> Key: SPARK-26478
> URL: https://issues.apache.org/jira/browse/SPARK-26478
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26473) Use ConfigEntry for hardcoded configs for deploy category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26473?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26473.
-

> Use ConfigEntry for hardcoded configs for deploy category.
> --
>
> Key: SPARK-26473
> URL: https://issues.apache.org/jira/browse/SPARK-26473
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26483) Use ConfigEntry for hardcoded configs for ssl category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26483?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26483.
-

> Use ConfigEntry for hardcoded configs for ssl category.
> ---
>
> Key: SPARK-26483
> URL: https://issues.apache.org/jira/browse/SPARK-26483
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26467) Use ConfigEntry for hardcoded configs for rpc category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26467?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26467.
-

> Use ConfigEntry for hardcoded configs for rpc category.
> ---
>
> Key: SPARK-26467
> URL: https://issues.apache.org/jira/browse/SPARK-26467
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26468) Use ConfigEntry for hardcoded configs for task category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26468?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26468.
-

> Use ConfigEntry for hardcoded configs for task category.
> 
>
> Key: SPARK-26468
> URL: https://issues.apache.org/jira/browse/SPARK-26468
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26469) Use ConfigEntry for hardcoded configs for io category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26469?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26469.
-

> Use ConfigEntry for hardcoded configs for io category.
> --
>
> Key: SPARK-26469
> URL: https://issues.apache.org/jira/browse/SPARK-26469
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26472) Use ConfigEntry for hardcoded configs for serializer category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26472?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26472.
-

> Use ConfigEntry for hardcoded configs for serializer category.
> --
>
> Key: SPARK-26472
> URL: https://issues.apache.org/jira/browse/SPARK-26472
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26461) Use ConfigEntry for hardcoded configs for dynamicAllocation category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26461?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26461.
-

> Use ConfigEntry for hardcoded configs for dynamicAllocation category.
> -
>
> Key: SPARK-26461
> URL: https://issues.apache.org/jira/browse/SPARK-26461
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26464) Use ConfigEntry for hardcoded configs for storage category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26464?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26464.
-

> Use ConfigEntry for hardcoded configs for storage category.
> ---
>
> Key: SPARK-26464
> URL: https://issues.apache.org/jira/browse/SPARK-26464
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Closed] (SPARK-26460) Use ConfigEntry for hardcoded configs for kryo/kryoserializer categories.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26460?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-26460.
-

> Use ConfigEntry for hardcoded configs for kryo/kryoserializer categories.
> -
>
> Key: SPARK-26460
> URL: https://issues.apache.org/jira/browse/SPARK-26460
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Resolved] (SPARK-26443) Use ConfigEntry for hardcoded configs for history category.

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26443?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-26443.
---
   Resolution: Fixed
 Assignee: Takuya Ueshin
Fix Version/s: 3.0.0

This is resolved via https://github.com/apache/spark/pull/23384

> Use ConfigEntry for hardcoded configs for history category.
> ---
>
> Key: SPARK-26443
> URL: https://issues.apache.org/jira/browse/SPARK-26443
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Assignee: Takuya Ueshin
>Priority: Major
> Fix For: 3.0.0
>
>
> Make hardcoded "spark.history" configs use {{ConfigEntry}} and put them in 
> the {{History}} config object.
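
A sketch of what that could look like inside Spark's core module (the key, default, and doc string are examples, not necessarily the exact entry):
{code:scala}
package org.apache.spark.internal.config // illustrative location

private[spark] object History {
  val RETAINED_APPLICATIONS = ConfigBuilder("spark.history.retainedApplications")
    .doc("How many finished applications the history server keeps in memory.")
    .intConf
    .createWithDefault(50)
}

// Call sites inside Spark then change roughly like this:
//   before: conf.getInt("spark.history.retainedApplications", 50)
//   after:  conf.get(History.RETAINED_APPLICATIONS)
{code}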






[jira] [Commented] (SPARK-26442) Use ConfigEntry for hardcoded configs.

2018-12-29 Thread Takuya Ueshin (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26442?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16730819#comment-16730819
 ] 

Takuya Ueshin commented on SPARK-26442:
---

I've combined them based on the related categories and a rough estimation. Thanks!

> Use ConfigEntry for hardcoded configs.
> --
>
> Key: SPARK-26442
> URL: https://issues.apache.org/jira/browse/SPARK-26442
> Project: Spark
>  Issue Type: Umbrella
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>
> This umbrella JIRA is to make hardcoded configs use {{ConfigEntry}}.






[jira] [Updated] (SPARK-26491) Use ConfigEntry for hardcoded configs for test categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26491?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin updated SPARK-26491:
--
Description: 
Make the following hardcoded configs use ConfigEntry.

{code}
spark.test
spark.testing
{code}


> Use ConfigEntry for hardcoded configs for test categories.
> --
>
> Key: SPARK-26491
> URL: https://issues.apache.org/jira/browse/SPARK-26491
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>
> Make the following hardcoded configs use ConfigEntry.
> {code}
> spark.test
> spark.testing
> {code}






[jira] [Updated] (SPARK-26491) Use ConfigEntry for hardcoded configs for test categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26491?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin updated SPARK-26491:
--
Summary: Use ConfigEntry for hardcoded configs for test categories.  (was: 
Use ConfigEntry for hardcoded configs for test category.)

> Use ConfigEntry for hardcoded configs for test categories.
> --
>
> Key: SPARK-26491
> URL: https://issues.apache.org/jira/browse/SPARK-26491
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Resolved] (SPARK-26488) Use ConfigEntry for hardcoded configs for modify.acl category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26488?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26488.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for modify.acl category.
> --
>
> Key: SPARK-26488
> URL: https://issues.apache.org/jira/browse/SPARK-26488
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Updated] (SPARK-26462) Use ConfigEntry for hardcoded configs for execution categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26462?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin updated SPARK-26462:
--
Description: 
Make the following hardcoded configs use ConfigEntry.

{code}
spark.memory
spark.storage
spark.io
spark.buffer
spark.rdd
spark.locality
spark.broadcast
spark.reducer
{code}


> Use ConfigEntry for hardcoded configs for execution categories.
> ---
>
> Key: SPARK-26462
> URL: https://issues.apache.org/jira/browse/SPARK-26462
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>
> Make the following hardcoded configs use ConfigEntry.
> {code}
> spark.memory
> spark.storage
> spark.io
> spark.buffer
> spark.rdd
> spark.locality
> spark.broadcast
> spark.reducer
> {code}






[jira] [Resolved] (SPARK-26486) Use ConfigEntry for hardcoded configs for metrics category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26486?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26486.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for metrics category.
> ---
>
> Key: SPARK-26486
> URL: https://issues.apache.org/jira/browse/SPARK-26486
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Resolved] (SPARK-26478) Use ConfigEntry for hardcoded configs for rdd category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26478?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26478.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for rdd category.
> ---
>
> Key: SPARK-26478
> URL: https://issues.apache.org/jira/browse/SPARK-26478
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Updated] (SPARK-26462) Use ConfigEntry for hardcoded configs for execution categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26462?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin updated SPARK-26462:
--
Summary: Use ConfigEntry for hardcoded configs for execution categories.  
(was: Use ConfigEntry for hardcoded configs for memory category.)

> Use ConfigEntry for hardcoded configs for execution categories.
> ---
>
> Key: SPARK-26462
> URL: https://issues.apache.org/jira/browse/SPARK-26462
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Resolved] (SPARK-26461) Use ConfigEntry for hardcoded configs for dynamicAllocation category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26461?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26461.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for dynamicAllocation category.
> -
>
> Key: SPARK-26461
> URL: https://issues.apache.org/jira/browse/SPARK-26461
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Updated] (SPARK-26489) Use ConfigEntry for hardcoded configs for python/r categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26489?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin updated SPARK-26489:
--
Description: 
Make the following hardcoded configs use ConfigEntry.

{code}
spark.python
spark.r
{code}


> Use ConfigEntry for hardcoded configs for python/r categories.
> --
>
> Key: SPARK-26489
> URL: https://issues.apache.org/jira/browse/SPARK-26489
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>
> Make the following hardcoded configs use ConfigEntry.
> {code}
> spark.python
> spark.r
> {code}






[jira] [Resolved] (SPARK-26474) Use ConfigEntry for hardcoded configs for worker category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26474?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26474.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for worker category.
> --
>
> Key: SPARK-26474
> URL: https://issues.apache.org/jira/browse/SPARK-26474
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Updated] (SPARK-26463) Use ConfigEntry for hardcoded configs for scheduler categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26463?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin updated SPARK-26463:
--
Description: 
Make the following hardcoded configs use {{ConfigEntry}}.

{code}
spark.dynamicAllocation
spark.scheduler
spark.rpc
spark.task
spark.speculation
spark.cleaner
{code}


> Use ConfigEntry for hardcoded configs for scheduler categories.
> ---
>
> Key: SPARK-26463
> URL: https://issues.apache.org/jira/browse/SPARK-26463
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>
> Make the following hardcoded configs use {{ConfigEntry}}.
> {code}
> spark.dynamicAllocation
> spark.scheduler
> spark.rpc
> spark.task
> spark.speculation
> spark.cleaner
> {code}






[jira] [Updated] (SPARK-26466) Use ConfigEntry for hardcoded configs for submit categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26466?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin updated SPARK-26466:
--
Description: 
Make the following hardcoded configs use {{ConfigEntry}}.

{code}
spark.kryo
spark.kryoserializer
spark.jars
spark.submit
spark.serializer
spark.deploy
spark.worker
{code}


> Use ConfigEntry for hardcoded configs for submit categories.
> 
>
> Key: SPARK-26466
> URL: https://issues.apache.org/jira/browse/SPARK-26466
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>
> Make the following hardcoded configs use {{ConfigEntry}}.
> {code}
> spark.kryo
> spark.kryoserializer
> spark.jars
> spark.submit
> spark.serializer
> spark.deploy
> spark.worker
> {code}






[jira] [Resolved] (SPARK-26490) Use ConfigEntry for hardcoded configs for r category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26490?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26490.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for r category.
> -
>
> Key: SPARK-26490
> URL: https://issues.apache.org/jira/browse/SPARK-26490
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Updated] (SPARK-26489) Use ConfigEntry for hardcoded configs for python/r categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26489?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin updated SPARK-26489:
--
Summary: Use ConfigEntry for hardcoded configs for python/r categories.  
(was: Use ConfigEntry for hardcoded configs for python category.)

> Use ConfigEntry for hardcoded configs for python/r categories.
> --
>
> Key: SPARK-26489
> URL: https://issues.apache.org/jira/browse/SPARK-26489
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Resolved] (SPARK-26487) Use ConfigEntry for hardcoded configs for admin category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26487?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26487.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for admin category.
> -
>
> Key: SPARK-26487
> URL: https://issues.apache.org/jira/browse/SPARK-26487
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Updated] (SPARK-26482) Use ConfigEntry for hardcoded configs for ui categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26482?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin updated SPARK-26482:
--
Description: 
Make the following hardcoded configs use ConfigEntry.

{code}
spark.ui
spark.ssl
spark.authenticate
spark.master.rest
spark.master.ui
spark.metrics
spark.admin
spark.modify.acl
{code}


> Use ConfigEntry for hardcoded configs for ui categories.
> 
>
> Key: SPARK-26482
> URL: https://issues.apache.org/jira/browse/SPARK-26482
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>
> Make the following hardcoded configs use ConfigEntry.
> {code}
> spark.ui
> spark.ssl
> spark.authenticate
> spark.master.rest
> spark.master.ui
> spark.metrics
> spark.admin
> spark.modify.acl
> {code}






[jira] [Resolved] (SPARK-26484) Use ConfigEntry for hardcoded configs for authenticate category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26484?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26484.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for authenticate category.
> 
>
> Key: SPARK-26484
> URL: https://issues.apache.org/jira/browse/SPARK-26484
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Resolved] (SPARK-26485) Use ConfigEntry for hardcoded configs for master.rest/ui categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26485?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26485.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for master.rest/ui categories.
> 
>
> Key: SPARK-26485
> URL: https://issues.apache.org/jira/browse/SPARK-26485
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Resolved] (SPARK-26483) Use ConfigEntry for hardcoded configs for ssl category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26483?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26483.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for ssl category.
> ---
>
> Key: SPARK-26483
> URL: https://issues.apache.org/jira/browse/SPARK-26483
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Resolved] (SPARK-26481) Use ConfigEntry for hardcoded configs for reducer category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26481?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26481.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for reducer category.
> ---
>
> Key: SPARK-26481
> URL: https://issues.apache.org/jira/browse/SPARK-26481
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Updated] (SPARK-26482) Use ConfigEntry for hardcoded configs for ui categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26482?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin updated SPARK-26482:
--
Summary: Use ConfigEntry for hardcoded configs for ui categories.  (was: 
Use ConfigEntry for hardcoded configs for ui category.)

> Use ConfigEntry for hardcoded configs for ui categories.
> 
>
> Key: SPARK-26482
> URL: https://issues.apache.org/jira/browse/SPARK-26482
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>







[jira] [Resolved] (SPARK-26480) Use ConfigEntry for hardcoded configs for broadcast category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26480?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26480.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for broadcast category.
> -
>
> Key: SPARK-26480
> URL: https://issues.apache.org/jira/browse/SPARK-26480
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26479) Use ConfigEntry for hardcoded configs for locality category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26479?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26479.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for locality category.
> 
>
> Key: SPARK-26479
> URL: https://issues.apache.org/jira/browse/SPARK-26479
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26475) Use ConfigEntry for hardcoded configs for buffer category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26475?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26475.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for buffer category.
> --
>
> Key: SPARK-26475
> URL: https://issues.apache.org/jira/browse/SPARK-26475
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26469) Use ConfigEntry for hardcoded configs for io category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26469?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26469.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for io category.
> --
>
> Key: SPARK-26469
> URL: https://issues.apache.org/jira/browse/SPARK-26469
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26464) Use ConfigEntry for hardcoded configs for storage category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26464?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26464.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for storage category.
> ---
>
> Key: SPARK-26464
> URL: https://issues.apache.org/jira/browse/SPARK-26464
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26468) Use ConfigEntry for hardcoded configs for task category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26468?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26468.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for task category.
> 
>
> Key: SPARK-26468
> URL: https://issues.apache.org/jira/browse/SPARK-26468
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26471) Use ConfigEntry for hardcoded configs for speculation category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26471?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26471.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for speculation category.
> ---
>
> Key: SPARK-26471
> URL: https://issues.apache.org/jira/browse/SPARK-26471
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26467) Use ConfigEntry for hardcoded configs for rpc category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26467?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26467.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for rpc category.
> ---
>
> Key: SPARK-26467
> URL: https://issues.apache.org/jira/browse/SPARK-26467
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26476) Use ConfigEntry for hardcoded configs for cleaner category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26476?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26476.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for cleaner category.
> ---
>
> Key: SPARK-26476
> URL: https://issues.apache.org/jira/browse/SPARK-26476
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-26463) Use ConfigEntry for hardcoded configs for scheduler categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26463?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin updated SPARK-26463:
--
Summary: Use ConfigEntry for hardcoded configs for scheduler categories.  
(was: Use ConfigEntry for hardcoded configs for scheduler category.)

> Use ConfigEntry for hardcoded configs for scheduler categories.
> ---
>
> Key: SPARK-26463
> URL: https://issues.apache.org/jira/browse/SPARK-26463
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26473) Use ConfigEntry for hardcoded configs for deploy category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26473?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26473.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for deploy category.
> --
>
> Key: SPARK-26473
> URL: https://issues.apache.org/jira/browse/SPARK-26473
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26472) Use ConfigEntry for hardcoded configs for serializer category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26472?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26472.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for serializer category.
> --
>
> Key: SPARK-26472
> URL: https://issues.apache.org/jira/browse/SPARK-26472
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26465) Use ConfigEntry for hardcoded configs for jars category.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26465?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26465.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for jars category.
> 
>
> Key: SPARK-26465
> URL: https://issues.apache.org/jira/browse/SPARK-26465
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26460) Use ConfigEntry for hardcoded configs for kryo/kryoserializer categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26460?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin resolved SPARK-26460.
---
Resolution: Duplicate

> Use ConfigEntry for hardcoded configs for kryo/kryoserializer categories.
> -
>
> Key: SPARK-26460
> URL: https://issues.apache.org/jira/browse/SPARK-26460
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-26466) Use ConfigEntry for hardcoded configs for submit categories.

2018-12-29 Thread Takuya Ueshin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26466?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takuya Ueshin updated SPARK-26466:
--
Summary: Use ConfigEntry for hardcoded configs for submit categories.  
(was: Use ConfigEntry for hardcoded configs for submit category.)

> Use ConfigEntry for hardcoded configs for submit categories.
> 
>
> Key: SPARK-26466
> URL: https://issues.apache.org/jira/browse/SPARK-26466
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Takuya Ueshin
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26496) Avoid to use Random.nextString in StreamingInnerJoinSuite

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26496?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-26496.
---
   Resolution: Fixed
 Assignee: Hyukjin Kwon
Fix Version/s: 3.0.0
   2.4.1
   2.3.3

This is resolved via https://github.com/apache/spark/pull/23405

> Avoid to use Random.nextString in StreamingInnerJoinSuite
> -
>
> Key: SPARK-26496
> URL: https://issues.apache.org/jira/browse/SPARK-26496
> Project: Spark
>  Issue Type: Bug
>  Components: Structured Streaming, Tests
>Affects Versions: 2.4.0
> Environment: Mac OS X High Sierra
>Reporter: Bruce Robbins
>Assignee: Hyukjin Kwon
>Priority: Minor
> Fix For: 2.3.3, 2.4.1, 3.0.0
>
>
> This is a bit esoteric and minor, but makes it difficult to run SQL unit 
> tests successfully on High Sierra.
> StreamingInnerJoinSuite."locality preferences of StateStoreAwareZippedRDD" 
> generates a directory name using {{Random.nextString(10)}}, and frequently 
> that directory name is unacceptable to High Sierra.
> For example:
> {noformat}
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 媈ᒢ탊渓뀟?녛ꃲ싢櫦
> dir: java.io.File = /tmp/del_媈ᒢ탊渓뀟?녛ꃲ싢櫦-aff57fc6-ca38-4825-b4f3-473140edd4f6
> res39: Boolean = true // this one was OK
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 窽텘⒘駖ⵚ駢⡞Ρ닋੎
> dir: java.io.File = /tmp/del_窽텘⒘駖ⵚ駢⡞Ρ닋੎-a3f99855-c429-47a0-a108-47bca6905745
> res40: Boolean = false  // nope, didn't like this one
> scala> prefix.foreach(x => printf("%04x ", x.toInt))
> 7abd d158 2498 99d6 2d5a 99e2 285e 03a1 b2cb 0a4e 
> scala> prefix(9)
> res46: Char = ੎
> scala> val prefix = "\u7abd"
> prefix: String = 窽
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_窽-d1c3af34-d34d-43fe-afed-ccef9a800ff4
> res47: Boolean = true // it's OK with \u7abd
> scala> val prefix = "\u0a4e"
> prefix: String = ੎
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_੎-3654a34c-6f74-4591-85af-a0f28b675a6f
> res50: Boolean = false // doesn't like \u0a4e
> {noformat}
> I thought it might have something to do with my Java 8 version, but Python is 
> equally affected:
> {noformat}
> >>> f = open(u"/tmp/del_\u7abd_file", "wb")
> f = open(u"/tmp/del_\u7abd_file", "wb")
> >>> f.write("hello\n")
> f.write("hello\n")
> # it's OK with \u7abd
> >>> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> Traceback (most recent call last):
>   File "", line 1, in 
> IOError: [Errno 92] Illegal byte sequence: u'/tmp/del_\u0a4e_file'
> # doesn't like \u0a4e
> >>> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> # a little change and it's happy again
> >>> 
> {noformat}
> Mac OS X Sierra is perfectly happy with these characters. This seems to be a 
> limitation introduced by High Sierra.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-26496) Avoid to use Random.nextString in StreamingInnerJoinSuite

2018-12-29 Thread Dongjoon Hyun (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26496?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-26496:
--
Summary: Avoid to use Random.nextString in StreamingInnerJoinSuite  (was: 
Test "locality preferences of StateStoreAwareZippedRDD" frequently fails on 
High Sierra)

> Avoid to use Random.nextString in StreamingInnerJoinSuite
> -
>
> Key: SPARK-26496
> URL: https://issues.apache.org/jira/browse/SPARK-26496
> Project: Spark
>  Issue Type: Bug
>  Components: Structured Streaming, Tests
>Affects Versions: 2.4.0
> Environment: Mac OS X High Sierra
>Reporter: Bruce Robbins
>Priority: Minor
>
> This is a bit esoteric and minor, but makes it difficult to run SQL unit 
> tests successfully on High Sierra.
> StreamingInnerJoinSuite."locality preferences of StateStoreAwareZippedRDD" 
> generates a directory name using {{Random.nextString(10)}}, and frequently 
> that directory name is unacceptable to High Sierra.
> For example:
> {noformat}
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 媈ᒢ탊渓뀟?녛ꃲ싢櫦
> dir: java.io.File = /tmp/del_媈ᒢ탊渓뀟?녛ꃲ싢櫦-aff57fc6-ca38-4825-b4f3-473140edd4f6
> res39: Boolean = true // this one was OK
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 窽텘⒘駖ⵚ駢⡞Ρ닋੎
> dir: java.io.File = /tmp/del_窽텘⒘駖ⵚ駢⡞Ρ닋੎-a3f99855-c429-47a0-a108-47bca6905745
> res40: Boolean = false  // nope, didn't like this one
> scala> prefix.foreach(x => printf("%04x ", x.toInt))
> 7abd d158 2498 99d6 2d5a 99e2 285e 03a1 b2cb 0a4e 
> scala> prefix(9)
> res46: Char = ੎
> scala> val prefix = "\u7abd"
> prefix: String = 窽
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_窽-d1c3af34-d34d-43fe-afed-ccef9a800ff4
> res47: Boolean = true // it's OK with \u7abd
> scala> val prefix = "\u0a4e"
> prefix: String = ੎
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_੎-3654a34c-6f74-4591-85af-a0f28b675a6f
> res50: Boolean = false // doesn't like \u0a4e
> {noformat}
> I thought it might have something to do with my Java 8 version, but Python is 
> equally affected:
> {noformat}
> >>> f = open(u"/tmp/del_\u7abd_file", "wb")
> f = open(u"/tmp/del_\u7abd_file", "wb")
> >>> f.write("hello\n")
> f.write("hello\n")
> # it's OK with \u7abd
> >>> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> Traceback (most recent call last):
>   File "", line 1, in 
> IOError: [Errno 92] Illegal byte sequence: u'/tmp/del_\u0a4e_file'
> # doesn't like \u0a4e
> >>> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> # a little change and it's happy again
> >>> 
> {noformat}
> Mac OS X Sierra is perfectly happy with these characters. This seems to be a 
> limitation introduced by High Sierra.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-26502) Get rid of hiveResultString() in QueryExecution

2018-12-29 Thread Apache Spark (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26502?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-26502:


Assignee: (was: Apache Spark)

> Get rid of hiveResultString() in QueryExecution
> ---
>
> Key: SPARK-26502
> URL: https://issues.apache.org/jira/browse/SPARK-26502
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 2.4.0
>Reporter: Maxim Gekk
>Priority: Minor
>
> The method hiveResultString() of QueryExecution is used in tests and in 
> SparkSQLDriver. It should be moved from QueryExecution to a more specific class.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-26502) Get rid of hiveResultString() in QueryExecution

2018-12-29 Thread Apache Spark (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26502?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-26502:


Assignee: Apache Spark

> Get rid of hiveResultString() in QueryExecution
> ---
>
> Key: SPARK-26502
> URL: https://issues.apache.org/jira/browse/SPARK-26502
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 2.4.0
>Reporter: Maxim Gekk
>Assignee: Apache Spark
>Priority: Minor
>
> The method hiveResultString() of QueryExecution is used in tests and in 
> SparkSQLDriver. It should be moved from QueryExecution to a more specific class.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-26504) Rope-wise dumping of Spark plans

2018-12-29 Thread Apache Spark (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26504?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-26504:


Assignee: Apache Spark

> Rope-wise dumping of Spark plans 
> -
>
> Key: SPARK-26504
> URL: https://issues.apache.org/jira/browse/SPARK-26504
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 2.4.0
>Reporter: Maxim Gekk
>Assignee: Apache Spark
>Priority: Minor
>
> Currently, Spark plans are converted to a string via StringBuilderWriter, where 
> memory for strings is allocated incrementally as elements of the plan are 
> appended to the StringBuilder.
> The proposed improvement is a StringRope with 2 methods:
> 1. append(s: String): Unit - adds the string to an internal list and increases 
> the total size
> 2. toString: String - concatenates the list of strings into a single string



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-26504) Rope-wise dumping of Spark plans

2018-12-29 Thread Apache Spark (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26504?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-26504:


Assignee: (was: Apache Spark)

> Rope-wise dumping of Spark plans 
> -
>
> Key: SPARK-26504
> URL: https://issues.apache.org/jira/browse/SPARK-26504
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 2.4.0
>Reporter: Maxim Gekk
>Priority: Minor
>
> Currently, Spark plans are converted to a string via StringBuilderWriter, where 
> memory for strings is allocated incrementally as elements of the plan are 
> appended to the StringBuilder.
> The proposed improvement is a StringRope with 2 methods:
> 1. append(s: String): Unit - adds the string to an internal list and increases 
> the total size
> 2. toString: String - concatenates the list of strings into a single string



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-26504) Rope-wise dumping of Spark plans

2018-12-29 Thread Maxim Gekk (JIRA)
Maxim Gekk created SPARK-26504:
--

 Summary: Rope-wise dumping of Spark plans 
 Key: SPARK-26504
 URL: https://issues.apache.org/jira/browse/SPARK-26504
 Project: Spark
  Issue Type: Improvement
  Components: SQL
Affects Versions: 2.4.0
Reporter: Maxim Gekk


Currently, Spark plans are converted to a string via StringBuilderWriter, where 
memory for strings is allocated incrementally as elements of the plan are 
appended to the StringBuilder.

The proposed improvement is a StringRope with 2 methods:
1. append(s: String): Unit - adds the string to an internal list and increases 
the total size
2. toString: String - concatenates the list of strings into a single string
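A minimal sketch of what such a rope could look like (the class below is an assumption for 
illustration, not code from a patch):
{code:scala}
import scala.collection.mutable.ArrayBuffer

// Sketch: appends are O(1); the final string is materialized once into a
// builder pre-sized to the accumulated total length.
class StringRope {
  private val parts = ArrayBuffer.empty[String]
  private var totalSize = 0

  def append(s: String): Unit = {
    parts += s
    totalSize += s.length
  }

  override def toString: String = {
    val sb = new java.lang.StringBuilder(totalSize)
    parts.foreach(p => sb.append(p))
    sb.toString
  }
}
{code}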



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-26503) Get rid of spark.sql.legacy.timeParser.enabled

2018-12-29 Thread Maxim Gekk (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26503?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Maxim Gekk updated SPARK-26503:
---
External issue ID:   (was: SPARK-26374)

> Get rid of spark.sql.legacy.timeParser.enabled
> --
>
> Key: SPARK-26503
> URL: https://issues.apache.org/jira/browse/SPARK-26503
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.0.0
>Reporter: Maxim Gekk
>Priority: Minor
>
> The flag is used in the CSV/JSON datasources as well as in time-related functions 
> to control parsing/formatting of dates/timestamps. By default, DateTimeFormat is 
> used for this purpose, but the flag allows switching back to SimpleDateFormat and 
> some fallback behavior. In the major release 3.0, the flag should be removed, and 
> the new DateFormatter/TimestampFormatter formatters should be used by default.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-26503) Get rid of spark.sql.legacy.timeParser.enabled

2018-12-29 Thread Maxim Gekk (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26503?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Maxim Gekk updated SPARK-26503:
---
External issue ID: SPARK-26374

> Get rid of spark.sql.legacy.timeParser.enabled
> --
>
> Key: SPARK-26503
> URL: https://issues.apache.org/jira/browse/SPARK-26503
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.0.0
>Reporter: Maxim Gekk
>Priority: Minor
>
> The flag is used in the CSV/JSON datasources as well as in time-related functions 
> to control parsing/formatting of dates/timestamps. By default, DateTimeFormat is 
> used for this purpose, but the flag allows switching back to SimpleDateFormat and 
> some fallback behavior. In the major release 3.0, the flag should be removed, and 
> the new DateFormatter/TimestampFormatter formatters should be used by default.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-26503) Get rid of spark.sql.legacy.timeParser.enabled

2018-12-29 Thread Maxim Gekk (JIRA)
Maxim Gekk created SPARK-26503:
--

 Summary: Get rid of spark.sql.legacy.timeParser.enabled
 Key: SPARK-26503
 URL: https://issues.apache.org/jira/browse/SPARK-26503
 Project: Spark
  Issue Type: Task
  Components: SQL
Affects Versions: 3.0.0
Reporter: Maxim Gekk


The flag is used in the CSV/JSON datasources as well as in time-related functions 
to control parsing/formatting of dates/timestamps. By default, DateTimeFormat is 
used for this purpose, but the flag allows switching back to SimpleDateFormat and 
some fallback behavior. In the major release 3.0, the flag should be removed, and 
the new DateFormatter/TimestampFormatter formatters should be used by default.
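For reference, the flag is an ordinary SQL conf, so the legacy behavior can currently be 
requested as below (a usage sketch assuming a running SparkSession named {{spark}} and a 
placeholder input path):
{code:scala}
// Usage sketch: opt back into the SimpleDateFormat-based legacy parser for
// CSV/JSON and the date/time functions. Removing the flag in 3.0 would leave
// the DateFormatter/TimestampFormatter path as the only one.
spark.conf.set("spark.sql.legacy.timeParser.enabled", "true")

val df = spark.read
  .option("header", "true")
  .option("timestampFormat", "yyyy-MM-dd HH:mm:ss")
  .csv("/path/to/input.csv")  // placeholder path
{code}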



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-26502) Get rid of hiveResultString() in QueryExecution

2018-12-29 Thread Maxim Gekk (JIRA)
Maxim Gekk created SPARK-26502:
--

 Summary: Get rid of hiveResultString() in QueryExecution
 Key: SPARK-26502
 URL: https://issues.apache.org/jira/browse/SPARK-26502
 Project: Spark
  Issue Type: Improvement
  Components: SQL
Affects Versions: 2.4.0
Reporter: Maxim Gekk


The method hiveResultString() of QueryExecution is used in tests and in 
SparkSQLDriver. It should be moved from QueryExecution to a more specific class.
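One possible shape for the move, sketched with a hypothetical helper object (the name and 
placement are assumptions, not from a patch):
{code:scala}
import org.apache.spark.sql.execution.QueryExecution

// Hypothetical destination for the Hive-compatible result formatting; the
// body of QueryExecution.hiveResultString() would move here so that only
// SparkSQLDriver and the SQL tests depend on it.
object HiveResultSketch {
  def hiveResultString(qe: QueryExecution): Seq[String] = {
    // render qe.executedPlan's result rows in Hive's textual format
    ???
  }
}
{code}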



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-26496) Test "locality preferences of StateStoreAwareZippedRDD" frequently fails on High Sierra

2018-12-29 Thread Apache Spark (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26496?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-26496:


Assignee: (was: Apache Spark)

> Test "locality preferences of StateStoreAwareZippedRDD" frequently fails on 
> High Sierra
> ---
>
> Key: SPARK-26496
> URL: https://issues.apache.org/jira/browse/SPARK-26496
> Project: Spark
>  Issue Type: Bug
>  Components: Structured Streaming, Tests
>Affects Versions: 2.4.0
> Environment: Mac OS X High Sierra
>Reporter: Bruce Robbins
>Priority: Minor
>
> This is a bit esoteric and minor, but makes it difficult to run SQL unit 
> tests successfully on High Sierra.
> StreamingInnerJoinSuite."locality preferences of StateStoreAwareZippedRDD" 
> generates a directory name using {{Random.nextString(10)}}, and frequently 
> that directory name is unacceptable to High Sierra.
> For example:
> {noformat}
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 媈ᒢ탊渓뀟?녛ꃲ싢櫦
> dir: java.io.File = /tmp/del_媈ᒢ탊渓뀟?녛ꃲ싢櫦-aff57fc6-ca38-4825-b4f3-473140edd4f6
> res39: Boolean = true // this one was OK
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 窽텘⒘駖ⵚ駢⡞Ρ닋੎
> dir: java.io.File = /tmp/del_窽텘⒘駖ⵚ駢⡞Ρ닋੎-a3f99855-c429-47a0-a108-47bca6905745
> res40: Boolean = false  // nope, didn't like this one
> scala> prefix.foreach(x => printf("%04x ", x.toInt))
> 7abd d158 2498 99d6 2d5a 99e2 285e 03a1 b2cb 0a4e 
> scala> prefix(9)
> res46: Char = ੎
> scala> val prefix = "\u7abd"
> prefix: String = 窽
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_窽-d1c3af34-d34d-43fe-afed-ccef9a800ff4
> res47: Boolean = true // it's OK with \u7abd
> scala> val prefix = "\u0a4e"
> prefix: String = ੎
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_੎-3654a34c-6f74-4591-85af-a0f28b675a6f
> res50: Boolean = false // doesn't like \u0a4e
> {noformat}
> I thought it might have something to do with my Java 8 version, but Python is 
> equally affected:
> {noformat}
> >>> f = open(u"/tmp/del_\u7abd_file", "wb")
> f = open(u"/tmp/del_\u7abd_file", "wb")
> >>> f.write("hello\n")
> f.write("hello\n")
> # it's OK with \u7abd
> >>> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> Traceback (most recent call last):
>   File "", line 1, in 
> IOError: [Errno 92] Illegal byte sequence: u'/tmp/del_\u0a4e_file'
> # doesn't like \u0a4e
> >>> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> # a little change and it's happy again
> >>> 
> {noformat}
> Mac OS X Sierra is perfectly happy with these characters. This seems to be a 
> limitation introduced by High Sierra.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-26496) Test "locality preferences of StateStoreAwareZippedRDD" frequently fails on High Sierra

2018-12-29 Thread Apache Spark (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26496?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-26496:


Assignee: Apache Spark

> Test "locality preferences of StateStoreAwareZippedRDD" frequently fails on 
> High Sierra
> ---
>
> Key: SPARK-26496
> URL: https://issues.apache.org/jira/browse/SPARK-26496
> Project: Spark
>  Issue Type: Bug
>  Components: Structured Streaming, Tests
>Affects Versions: 2.4.0
> Environment: Mac OS X High Sierra
>Reporter: Bruce Robbins
>Assignee: Apache Spark
>Priority: Minor
>
> This is a bit esoteric and minor, but makes it difficult to run SQL unit 
> tests successfully on High Sierra.
> StreamingInnerJoinSuite."locality preferences of StateStoreAwareZippedRDD" 
> generates a directory name using {{Random.nextString(10)}}, and frequently 
> that directory name is unacceptable to High Sierra.
> For example:
> {noformat}
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 媈ᒢ탊渓뀟?녛ꃲ싢櫦
> dir: java.io.File = /tmp/del_媈ᒢ탊渓뀟?녛ꃲ싢櫦-aff57fc6-ca38-4825-b4f3-473140edd4f6
> res39: Boolean = true // this one was OK
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 窽텘⒘駖ⵚ駢⡞Ρ닋੎
> dir: java.io.File = /tmp/del_窽텘⒘駖ⵚ駢⡞Ρ닋੎-a3f99855-c429-47a0-a108-47bca6905745
> res40: Boolean = false  // nope, didn't like this one
> scala> prefix.foreach(x => printf("%04x ", x.toInt))
> 7abd d158 2498 99d6 2d5a 99e2 285e 03a1 b2cb 0a4e 
> scala> prefix(9)
> res46: Char = ੎
> scala> val prefix = "\u7abd"
> prefix: String = 窽
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_窽-d1c3af34-d34d-43fe-afed-ccef9a800ff4
> res47: Boolean = true // it's OK with \u7abd
> scala> val prefix = "\u0a4e"
> prefix: String = ੎
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_੎-3654a34c-6f74-4591-85af-a0f28b675a6f
> res50: Boolean = false // doesn't like \u0a4e
> {noformat}
> I thought it might have something to do with my Java 8 version, but Python is 
> equally affected:
> {noformat}
> >>> f = open(u"/tmp/del_\u7abd_file", "wb")
> f = open(u"/tmp/del_\u7abd_file", "wb")
> >>> f.write("hello\n")
> f.write("hello\n")
> # it's OK with \u7abd
> >>> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> Traceback (most recent call last):
>   File "", line 1, in 
> IOError: [Errno 92] Illegal byte sequence: u'/tmp/del_\u0a4e_file'
> # doesn't like \u0a4e
> >>> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> # a little change and it's happy again
> >>> 
> {noformat}
> Mac OS X Sierra is perfectly happy with these characters. This seems to be a 
> limitation introduced by High Sierra.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-26496) Test "locality preferences of StateStoreAwareZippedRDD" frequently fails on High Sierra

2018-12-29 Thread Hyukjin Kwon (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26496?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon updated SPARK-26496:
-
Component/s: (was: SQL)
 Structured Streaming

> Test "locality preferences of StateStoreAwareZippedRDD" frequently fails on 
> High Sierra
> ---
>
> Key: SPARK-26496
> URL: https://issues.apache.org/jira/browse/SPARK-26496
> Project: Spark
>  Issue Type: Bug
>  Components: Structured Streaming, Tests
>Affects Versions: 2.4.0
> Environment: Mac OS X High Sierra
>Reporter: Bruce Robbins
>Priority: Minor
>
> This is a bit esoteric and minor, but makes it difficult to run SQL unit 
> tests successfully on High Sierra.
> StreamingInnerJoinSuite."locality preferences of StateStoreAwareZippedRDD" 
> generates a directory name using {{Random.nextString(10)}}, and frequently 
> that directory name is unacceptable to High Sierra.
> For example:
> {noformat}
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 媈ᒢ탊渓뀟?녛ꃲ싢櫦
> dir: java.io.File = /tmp/del_媈ᒢ탊渓뀟?녛ꃲ싢櫦-aff57fc6-ca38-4825-b4f3-473140edd4f6
> res39: Boolean = true // this one was OK
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 窽텘⒘駖ⵚ駢⡞Ρ닋੎
> dir: java.io.File = /tmp/del_窽텘⒘駖ⵚ駢⡞Ρ닋੎-a3f99855-c429-47a0-a108-47bca6905745
> res40: Boolean = false  // nope, didn't like this one
> scala> prefix.foreach(x => printf("%04x ", x.toInt))
> 7abd d158 2498 99d6 2d5a 99e2 285e 03a1 b2cb 0a4e 
> scala> prefix(9)
> res46: Char = ੎
> scala> val prefix = "\u7abd"
> prefix: String = 窽
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_窽-d1c3af34-d34d-43fe-afed-ccef9a800ff4
> res47: Boolean = true // it's OK with \u7abd
> scala> val prefix = "\u0a4e"
> prefix: String = ੎
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_੎-3654a34c-6f74-4591-85af-a0f28b675a6f
> res50: Boolean = false // doesn't like \u0a4e
> {noformat}
> I thought it might have something to do with my Java 8 version, but Python is 
> equally affected:
> {noformat}
> >>> f = open(u"/tmp/del_\u7abd_file", "wb")
> f = open(u"/tmp/del_\u7abd_file", "wb")
> >>> f.write("hello\n")
> f.write("hello\n")
> # it's OK with \u7abd
> >>> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> Traceback (most recent call last):
>   File "", line 1, in 
> IOError: [Errno 92] Illegal byte sequence: u'/tmp/del_\u0a4e_file'
> # doesn't like \u0a4e
> >>> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> # a little change and it's happy again
> >>> 
> {noformat}
> Mac OS X Sierra is perfectly happy with these characters. This seems to be a 
> limitation introduced by High Sierra.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-26496) Test "locality preferences of StateStoreAwareZippedRDD" frequently fails on High Sierra

2018-12-29 Thread Hyukjin Kwon (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26496?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon updated SPARK-26496:
-
Component/s: Tests

> Test "locality preferences of StateStoreAwareZippedRDD" frequently fails on 
> High Sierra
> ---
>
> Key: SPARK-26496
> URL: https://issues.apache.org/jira/browse/SPARK-26496
> Project: Spark
>  Issue Type: Bug
>  Components: SQL, Tests
>Affects Versions: 2.4.0
> Environment: Mac OS X High Sierra
>Reporter: Bruce Robbins
>Priority: Minor
>
> This is a bit esoteric and minor, but makes it difficult to run SQL unit 
> tests successfully on High Sierra.
> StreamingInnerJoinSuite."locality preferences of StateStoreAwareZippedRDD" 
> generates a directory name using {{Random.nextString(10)}}, and frequently 
> that directory name is unacceptable to High Sierra.
> For example:
> {noformat}
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 媈ᒢ탊渓뀟?녛ꃲ싢櫦
> dir: java.io.File = /tmp/del_媈ᒢ탊渓뀟?녛ꃲ싢櫦-aff57fc6-ca38-4825-b4f3-473140edd4f6
> res39: Boolean = true // this one was OK
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 窽텘⒘駖ⵚ駢⡞Ρ닋੎
> dir: java.io.File = /tmp/del_窽텘⒘駖ⵚ駢⡞Ρ닋੎-a3f99855-c429-47a0-a108-47bca6905745
> res40: Boolean = false  // nope, didn't like this one
> scala> prefix.foreach(x => printf("%04x ", x.toInt))
> 7abd d158 2498 99d6 2d5a 99e2 285e 03a1 b2cb 0a4e 
> scala> prefix(9)
> res46: Char = ੎
> scala> val prefix = "\u7abd"
> prefix: String = 窽
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_窽-d1c3af34-d34d-43fe-afed-ccef9a800ff4
> res47: Boolean = true // it's OK with \u7abd
> scala> val prefix = "\u0a4e"
> prefix: String = ੎
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_੎-3654a34c-6f74-4591-85af-a0f28b675a6f
> res50: Boolean = false // doesn't like \u0a4e
> {noformat}
> I thought it might have something to do with my Java 8 version, but Python is 
> equally affected:
> {noformat}
> >>> f = open(u"/tmp/del_\u7abd_file", "wb")
> f = open(u"/tmp/del_\u7abd_file", "wb")
> >>> f.write("hello\n")
> f.write("hello\n")
> # it's OK with \u7abd
> >>> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> Traceback (most recent call last):
>   File "", line 1, in 
> IOError: [Errno 92] Illegal byte sequence: u'/tmp/del_\u0a4e_file'
> # doesn't like \u0a4e
> >>> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> # a little change and it's happy again
> >>> 
> {noformat}
> Mac OS X Sierra is perfectly happy with these characters. This seems to be a 
> limitation introduced by High Sierra.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-26496) Test "locality preferences of StateStoreAwareZippedRDD" frequently fails on High Sierra

2018-12-29 Thread Hyukjin Kwon (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26496?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16730619#comment-16730619
 ] 

Hyukjin Kwon commented on SPARK-26496:
--

I think we should fix it to use nextFloat.toString. A similar fix was made in 
SPARK-19613 before.
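A sketch of the suggested change (the surrounding test code is paraphrased, not quoted from 
StreamingInnerJoinSuite):
{code:scala}
import java.io.File
import java.util.UUID
import scala.util.Random

// Before (sketch): the random prefix can contain code points that High
// Sierra's filesystem rejects.
val badDir = new File("/tmp", "del_" + Random.nextString(10) + "-" + UUID.randomUUID)

// After (sketch): an ASCII-only prefix, as suggested above.
val goodDir = new File("/tmp", "del_" + Random.nextFloat().toString + "-" + UUID.randomUUID)
goodDir.mkdirs()
{code}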

> Test "locality preferences of StateStoreAwareZippedRDD" frequently fails on 
> High Sierra
> ---
>
> Key: SPARK-26496
> URL: https://issues.apache.org/jira/browse/SPARK-26496
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 2.4.0
> Environment: Mac OS X High Sierra
>Reporter: Bruce Robbins
>Priority: Minor
>
> This is a bit esoteric and minor, but makes it difficult to run SQL unit 
> tests successfully on High Sierra.
> StreamingInnerJoinSuite."locality preferences of StateStoreAwareZippedRDD" 
> generates a directory name using {{Random.nextString(10)}}, and frequently 
> that directory name is unacceptable to High Sierra.
> For example:
> {noformat}
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 媈ᒢ탊渓뀟?녛ꃲ싢櫦
> dir: java.io.File = /tmp/del_媈ᒢ탊渓뀟?녛ꃲ싢櫦-aff57fc6-ca38-4825-b4f3-473140edd4f6
> res39: Boolean = true // this one was OK
> scala> val prefix = Random.nextString(10); val dir = new File("/tmp", "del_" 
> + prefix + "-" + UUID.randomUUID.toString); dir.mkdirs()
> prefix: String = 窽텘⒘駖ⵚ駢⡞Ρ닋੎
> dir: java.io.File = /tmp/del_窽텘⒘駖ⵚ駢⡞Ρ닋੎-a3f99855-c429-47a0-a108-47bca6905745
> res40: Boolean = false  // nope, didn't like this one
> scala> prefix.foreach(x => printf("%04x ", x.toInt))
> 7abd d158 2498 99d6 2d5a 99e2 285e 03a1 b2cb 0a4e 
> scala> prefix(9)
> res46: Char = ੎
> scala> val prefix = "\u7abd"
> prefix: String = 窽
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_窽-d1c3af34-d34d-43fe-afed-ccef9a800ff4
> res47: Boolean = true // it's OK with \u7abd
> scala> val prefix = "\u0a4e"
> prefix: String = ੎
> scala> val dir = new File("/tmp", "del_" + prefix + "-" + 
> UUID.randomUUID.toString); dir.mkdirs()
> dir: java.io.File = /tmp/del_੎-3654a34c-6f74-4591-85af-a0f28b675a6f
> res50: Boolean = false // doesn't like \u0a4e
> {noformat}
> I thought it might have something to do with my Java 8 version, but Python is 
> equally affected:
> {noformat}
> >>> f = open(u"/tmp/del_\u7abd_file", "wb")
> f = open(u"/tmp/del_\u7abd_file", "wb")
> >>> f.write("hello\n")
> f.write("hello\n")
> # it's OK with \u7abd
> >>> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> f2 = open(u"/tmp/del_\u0a4e_file", "wb")
> Traceback (most recent call last):
>   File "", line 1, in 
> IOError: [Errno 92] Illegal byte sequence: u'/tmp/del_\u0a4e_file'
> # doesn't like \u0a4e
> >>> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> f2 = open(u"/tmp/del_\ufa4e_file", "wb")
> # a little change and it's happy again
> >>> 
> {noformat}
> Mac OS X Sierra is perfectly happy with these characters. This seems to be a 
> limitation introduced by High Sierra.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-26494) [spark sql] Reading Oracle TIMESTAMP(6) WITH LOCAL TIME ZONE with Spark: type not found

2018-12-29 Thread Hyukjin Kwon (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26494?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon resolved SPARK-26494.
--
Resolution: Invalid

Please use English.

> [spark sql] Reading Oracle TIMESTAMP(6) WITH LOCAL TIME ZONE with Spark: type not found
> ---
>
> Key: SPARK-26494
> URL: https://issues.apache.org/jira/browse/SPARK-26494
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 2.4.0
>Reporter: 秦坤
>Priority: Minor
>
> Reading an Oracle TIMESTAMP(6) WITH LOCAL TIME ZONE column with Spark fails because the type is not found.
>  
> When the column type is TIMESTAMP(6) WITH LOCAL TIME ZONE,
> the sqlType value seen by the getCatalystType function in the JdbcUtils class is -102.
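Although this report was closed as invalid, one workaround is a custom JdbcDialect that maps 
Oracle's vendor-specific type code -102 to a Catalyst type. A sketch, assuming that mapping it 
to TimestampType is acceptable for the job at hand:
{code:scala}
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
import org.apache.spark.sql.types.{DataType, MetadataBuilder, TimestampType}

// Sketch: handle Oracle's TIMESTAMP WITH LOCAL TIME ZONE (sqlType -102),
// which the built-in mapping does not recognize.
object OracleLocalTzDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:oracle")

  override def getCatalystType(
      sqlType: Int, typeName: String, size: Int, md: MetadataBuilder): Option[DataType] =
    if (sqlType == -102) Some(TimestampType) else None
}

// Register before reading; registered dialects are consulted ahead of the
// built-in OracleDialect.
JdbcDialects.registerDialect(OracleLocalTzDialect)
{code}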



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-12154) Upgrade to Jersey 2

2018-12-29 Thread Jiatao Tao (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-12154?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16730601#comment-16730601
 ] 

Jiatao Tao commented on SPARK-12154:


Hadoop uses 1.x; it's a nightmare.

> Upgrade to Jersey 2
> ---
>
> Key: SPARK-12154
> URL: https://issues.apache.org/jira/browse/SPARK-12154
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build, Spark Core
>Affects Versions: 1.5.2
>Reporter: Matt Cheah
>Assignee: Matt Cheah
>Priority: Blocker
> Fix For: 2.0.0
>
>
> Fairly self-explanatory: Jersey 1 is a bit old and could use an upgrade. 
> Library conflicts for Jersey are difficult to work around - see the discussion on 
> SPARK-11081. It's easier to upgrade Jersey entirely, but we should target 
> Spark 2.0 since this may be a breaking change for users who were using Jersey 1 in 
> their Spark jobs.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org