[jira] [Commented] (SPARK-40508) Treat unknown partitioning as UnknownPartitioning
[ https://issues.apache.org/jira/browse/SPARK-40508?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17607918#comment-17607918 ]

Dongjoon Hyun commented on SPARK-40508:
---------------------------------------

Previously, you were in the `Contributor` and `Administrator` groups. I have additionally added you to the `Committer` group to make sure.

> Treat unknown partitioning as UnknownPartitioning
> -------------------------------------------------
>
>                 Key: SPARK-40508
>                 URL: https://issues.apache.org/jira/browse/SPARK-40508
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.3.0
>            Reporter: Ted Yu
>            Assignee: Ted Yu
>            Priority: Major
>             Fix For: 3.4.0
>
>
> When running a Spark application against Spark 3.3, I see the following:
> {code}
> java.lang.IllegalArgumentException: Unsupported data source V2 partitioning type: CustomPartitioning
>         at org.apache.spark.sql.execution.datasources.v2.V2ScanPartitioning$$anonfun$apply$1.applyOrElse(V2ScanPartitioning.scala:46)
>         at org.apache.spark.sql.execution.datasources.v2.V2ScanPartitioning$$anonfun$apply$1.applyOrElse(V2ScanPartitioning.scala:34)
>         at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:584)
> {code}
> CustomPartitioning works fine with Spark 3.2.1.
> This PR proposes to relax the code and treat all unknown partitioning the same way as UnknownPartitioning.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
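The proposed relaxation can be sketched as follows. This is a minimal, self-contained model of the pattern match in V2ScanPartitioning, not the actual Spark code: the `Partitioning` types and the `strict`/`relaxed` helpers below are simplified stand-ins introduced only to illustrate the before/after behavior.

```scala
// Simplified stand-ins for the DSv2 partitioning interfaces; these are not
// Spark's real classes, just enough structure to illustrate the match.
sealed trait Partitioning
case class KeyGroupedPartitioning(keys: Seq[String], numPartitions: Int) extends Partitioning
case class CustomPartitioning(numPartitions: Int) extends Partitioning

object V2ScanPartitioningSketch {
  // Behavior reported against 3.3.0: any partitioning type the rule does not
  // recognize throws, failing the whole query.
  def strict(p: Partitioning): Option[Seq[String]] = p match {
    case k: KeyGroupedPartitioning => Some(k.keys)
    case other => throw new IllegalArgumentException(
      s"Unsupported data source V2 partitioning type: ${other.getClass.getSimpleName}")
  }

  // Proposed relaxation: an unrecognized partitioning is treated like
  // UnknownPartitioning, i.e. no grouping keys are reported and planning continues.
  def relaxed(p: Partitioning): Option[Seq[String]] = p match {
    case k: KeyGroupedPartitioning => Some(k.keys)
    case _ => None
  }
}
```

With this shape, a connector returning a custom `Partitioning` simply loses the key-grouping optimization instead of failing the query.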
[jira] [Commented] (SPARK-40508) Treat unknown partitioning as UnknownPartitioning
[ https://issues.apache.org/jira/browse/SPARK-40508?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17607919#comment-17607919 ]

Sun Chao commented on SPARK-40508:
----------------------------------

Great to know. Thanks!
[jira] [Commented] (SPARK-40508) Treat unknown partitioning as UnknownPartitioning
[ https://issues.apache.org/jira/browse/SPARK-40508?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17607917#comment-17607917 ]

Dongjoon Hyun commented on SPARK-40508:
---------------------------------------

Ya, the merge script sometimes hits corner cases. BTW, [~sunchao], you are already in the Apache Spark Admin group, so you can add a user yourself:
- https://issues.apache.org/jira/plugins/servlet/project-config/SPARK/roles
[jira] [Commented] (SPARK-40508) Treat unknown partitioning as UnknownPartitioning
[ https://issues.apache.org/jira/browse/SPARK-40508?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17607902#comment-17607902 ]

Sun Chao commented on SPARK-40508:
----------------------------------

Oh, thanks [~viirya]! For some reason the merge script was throwing an error at me:

{code:java}
response text = {"errorMessages":[],"errors":{"assignee":"User 'yuzhih...@gmail.com' cannot be assigned issues."}}
Error assigning JIRA, try again (or leave blank and fix manually)
JIRA is unassigned, choose assignee
{code}
[jira] [Commented] (SPARK-40508) Treat unknown partitioning as UnknownPartitioning
[ https://issues.apache.org/jira/browse/SPARK-40508?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17607900#comment-17607900 ]

L. C. Hsieh commented on SPARK-40508:
-------------------------------------

[~csun] It seems he is already in the contributor list. I just assigned this ticket to him.
[jira] [Commented] (SPARK-40508) Treat unknown partitioning as UnknownPartitioning
[ https://issues.apache.org/jira/browse/SPARK-40508?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17607895#comment-17607895 ]

Apache Spark commented on SPARK-40508:
--------------------------------------

User 'tedyu' has created a pull request for this issue:
https://github.com/apache/spark/pull/37957
[jira] [Commented] (SPARK-40508) Treat unknown partitioning as UnknownPartitioning
[ https://issues.apache.org/jira/browse/SPARK-40508?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17607893#comment-17607893 ]

Apache Spark commented on SPARK-40508:
--------------------------------------

User 'tedyu' has created a pull request for this issue:
https://github.com/apache/spark/pull/37957
[jira] [Commented] (SPARK-40508) Treat unknown partitioning as UnknownPartitioning
[ https://issues.apache.org/jira/browse/SPARK-40508?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17607869#comment-17607869 ]

Chao Sun commented on SPARK-40508:
----------------------------------

[~dongjoon] [~viirya] Could you add [~yuzhih...@gmail.com] to the contributor list? I can't assign this to him.
[jira] [Commented] (SPARK-40508) Treat unknown partitioning as UnknownPartitioning
[ https://issues.apache.org/jira/browse/SPARK-40508?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17607384#comment-17607384 ]

Apache Spark commented on SPARK-40508:
--------------------------------------

User 'tedyu' has created a pull request for this issue:
https://github.com/apache/spark/pull/37952