Yuming Wang created SPARK-38772:
-----------------------------------
Summary: Formatting the log plan in AdaptiveSparkPlanExec
Key: SPARK-38772
URL: https://issues.apache.org/jira/browse/SPARK-38772
Project: Spark
Issue Type: Improvement
Components: SQL
Affects Versions: 3.4.0
Reporter: Yuming Wang

The plan-change log in AdaptiveSparkPlanExec prints the old and new plans inside a single "Plan changed from ... to ..." message, so the two plan trees run together and the output is hard to read:
{code:java}
12:08:36.876 ERROR org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec: Plan changed from SortMergeJoin [key#13], [a#23], Inner
:- Sort [key#13 ASC NULLS FIRST], false, 0
:  +- ShuffleQueryStage 0
:     +- Exchange hashpartitioning(key#13, 5), ENSURE_REQUIREMENTS, [id=#110]
:        +- *(1) Filter (isnotnull(value#14) AND (value#14 = 1))
:           +- *(1) SerializeFromObject [knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData, true])).key AS key#13, staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData, true])).value, true, false, true) AS value#14]
:              +- Scan[obj#12]
+- Sort [a#23 ASC NULLS FIRST], false, 0
   +- ShuffleQueryStage 1
      +- Exchange hashpartitioning(a#23, 5), ENSURE_REQUIREMENTS, [id=#129]
         +- *(2) SerializeFromObject [knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData2, true])).a AS a#23, knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData2, true])).b AS b#24]
            +- Scan[obj#22]
to BroadcastHashJoin [key#13], [a#23], Inner, BuildLeft, false
:- BroadcastExchange HashedRelationBroadcastMode(List(cast(input[0, int, false] as bigint)),false), [id=#145]
:  +- ShuffleQueryStage 0
:     +- Exchange hashpartitioning(key#13, 5), ENSURE_REQUIREMENTS, [id=#110]
:        +- *(1) Filter (isnotnull(value#14) AND (value#14 = 1))
:           +- *(1) SerializeFromObject [knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData, true])).key AS key#13, staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData, true])).value, true, false, true) AS value#14]
:              +- Scan[obj#12]
+- ShuffleQueryStage 1
   +- Exchange hashpartitioning(a#23, 5), ENSURE_REQUIREMENTS, [id=#129]
      +- *(2) SerializeFromObject [knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData2, true])).a AS a#23, knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData2, true])).b AS b#24]
         +- Scan[obj#22]
{code}
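
One way the message could be made easier to scan (a sketch, not the committed change) is to log each plan as its own labeled block instead of concatenating both trees into one line. Spark's existing {{org.apache.spark.sql.catalyst.util.sideBySide}} helper could also be used to diff the two plans; the {{PlanChangeLogFormatter}} class and the {{===}} labels below are hypothetical names for illustration only:

{code:java}
// Hypothetical sketch (not the actual SPARK-38772 patch): build a
// plan-change message where the old and new plans appear as separate,
// clearly labeled blocks.
public final class PlanChangeLogFormatter {

    private PlanChangeLogFormatter() {}

    /** Builds a readable plan-change message from two plan tree strings. */
    public static String format(String oldPlan, String newPlan) {
        StringBuilder sb = new StringBuilder("Plan changed:\n");
        sb.append("=== Old plan ===\n").append(oldPlan.strip()).append('\n');
        sb.append("=== New plan ===\n").append(newPlan.strip());
        return sb.toString();
    }

    public static void main(String[] args) {
        String oldPlan = "SortMergeJoin [key#13], [a#23], Inner\n"
            + ":- Sort [key#13 ASC NULLS FIRST], false, 0";
        String newPlan = "BroadcastHashJoin [key#13], [a#23], Inner, BuildLeft, false\n"
            + ":- BroadcastExchange HashedRelationBroadcastMode(...)";
        // Each plan tree keeps its own indentation under its own header.
        System.out.println(format(oldPlan, newPlan));
    }
}
{code}

With this shape the "from" and "to" trees no longer run into each other, and each tree's {{+-}} indentation stays aligned under its own header.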
--
This message was sent by Atlassian Jira
(v8.20.1#820001)