[jira] [Resolved] (SPARK-40448) Prototype implementation
[ https://issues.apache.org/jira/browse/SPARK-40448?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-40448.
----------------------------------
Fix Version/s: 3.4.0
Resolution: Fixed

Issue resolved by pull request 37710
[https://github.com/apache/spark/pull/37710]

> Prototype implementation
> ------------------------
>
> Key: SPARK-40448
> URL: https://issues.apache.org/jira/browse/SPARK-40448
> Project: Spark
> Issue Type: Sub-task
> Components: Connect
> Affects Versions: 3.2.2
> Reporter: Martin Grund
> Assignee: Martin Grund
> Priority: Major
> Fix For: 3.4.0
>
> In [https://github.com/apache/spark/pull/37710] we created a prototype that
> shows the end-to-end integration of Spark Connect with the rest of the system.
>
> Since the PR is quite large, we will track follow-up items as children of
> SPARK-39375.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-40448) Prototype implementation
[ https://issues.apache.org/jira/browse/SPARK-40448?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon reassigned SPARK-40448:
------------------------------------
Assignee: Martin Grund

> Prototype implementation
> ------------------------
>
> Key: SPARK-40448
> URL: https://issues.apache.org/jira/browse/SPARK-40448
> Project: Spark
> Issue Type: Sub-task
> Components: Connect
> Affects Versions: 3.2.2
> Reporter: Martin Grund
> Assignee: Martin Grund
> Priority: Major
>
> In [https://github.com/apache/spark/pull/37710] we created a prototype that
> shows the end-to-end integration of Spark Connect with the rest of the system.
>
> Since the PR is quite large, we will track follow-up items as children of
> SPARK-39375.
[jira] [Resolved] (SPARK-40545) SparkSQLEnvSuite failed to clean the `spark_derby` directory after execution
[ https://issues.apache.org/jira/browse/SPARK-40545?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang resolved SPARK-40545.
---------------------------------
Fix Version/s: 3.4.0
Resolution: Fixed

Issue resolved by pull request 37979
[https://github.com/apache/spark/pull/37979]

> SparkSQLEnvSuite failed to clean the `spark_derby` directory after execution
> ----------------------------------------------------------------------------
>
> Key: SPARK-40545
> URL: https://issues.apache.org/jira/browse/SPARK-40545
> Project: Spark
> Issue Type: Improvement
> Components: SQL, Tests
> Affects Versions: 3.4.0
> Reporter: Yang Jie
> Assignee: Yang Jie
> Priority: Minor
> Fix For: 3.4.0
>
> Run
> {code:java}
> mvn clean install -Phive-thriftserver -pl sql/hive-thriftserver -Dtest=none -DwildcardSuites=org.apache.spark.sql.hive.thriftserver.SparkSQLEnvSuite
> git status {code}
> and the stray `sql/hive-thriftserver/spark_derby/` directory will show up as
> untracked.
[jira] [Assigned] (SPARK-40545) SparkSQLEnvSuite failed to clean the `spark_derby` directory after execution
[ https://issues.apache.org/jira/browse/SPARK-40545?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang reassigned SPARK-40545:
-----------------------------------
Assignee: Yang Jie

> SparkSQLEnvSuite failed to clean the `spark_derby` directory after execution
> ----------------------------------------------------------------------------
>
> Key: SPARK-40545
> URL: https://issues.apache.org/jira/browse/SPARK-40545
> Project: Spark
> Issue Type: Improvement
> Components: SQL, Tests
> Affects Versions: 3.4.0
> Reporter: Yang Jie
> Assignee: Yang Jie
> Priority: Minor
>
> Run
> {code:java}
> mvn clean install -Phive-thriftserver -pl sql/hive-thriftserver -Dtest=none -DwildcardSuites=org.apache.spark.sql.hive.thriftserver.SparkSQLEnvSuite
> git status {code}
> and the stray `sql/hive-thriftserver/spark_derby/` directory will show up as
> untracked.
[jira] [Commented] (SPARK-40357) Migrate window type check failures onto error classes
[ https://issues.apache.org/jira/browse/SPARK-40357?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17609023#comment-17609023 ]

Apache Spark commented on SPARK-40357:
--------------------------------------

User 'lvshaokang' has created a pull request for this issue:
https://github.com/apache/spark/pull/37986

> Migrate window type check failures onto error classes
> -----------------------------------------------------
>
> Key: SPARK-40357
> URL: https://issues.apache.org/jira/browse/SPARK-40357
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.4.0
> Reporter: Max Gekk
> Priority: Major
>
> Replace TypeCheckFailure by DataTypeMismatch in type checks in window
> expressions:
> 1. WindowSpecDefinition (4):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L68-L85
> 2. SpecifiedWindowFrame (3):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L216-L231
> 3. checkBoundary (2):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L264-L269
> 4. FrameLessOffsetWindowFunction (1):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L424
[jira] [Assigned] (SPARK-40357) Migrate window type check failures onto error classes
[ https://issues.apache.org/jira/browse/SPARK-40357?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-40357:
------------------------------------
Assignee: (was: Apache Spark)

> Migrate window type check failures onto error classes
> -----------------------------------------------------
>
> Key: SPARK-40357
> URL: https://issues.apache.org/jira/browse/SPARK-40357
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.4.0
> Reporter: Max Gekk
> Priority: Major
>
> Replace TypeCheckFailure by DataTypeMismatch in type checks in window
> expressions:
> 1. WindowSpecDefinition (4):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L68-L85
> 2. SpecifiedWindowFrame (3):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L216-L231
> 3. checkBoundary (2):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L264-L269
> 4. FrameLessOffsetWindowFunction (1):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L424
[jira] [Assigned] (SPARK-40357) Migrate window type check failures onto error classes
[ https://issues.apache.org/jira/browse/SPARK-40357?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-40357:
------------------------------------
Assignee: Apache Spark

> Migrate window type check failures onto error classes
> -----------------------------------------------------
>
> Key: SPARK-40357
> URL: https://issues.apache.org/jira/browse/SPARK-40357
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.4.0
> Reporter: Max Gekk
> Assignee: Apache Spark
> Priority: Major
>
> Replace TypeCheckFailure by DataTypeMismatch in type checks in window
> expressions:
> 1. WindowSpecDefinition (4):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L68-L85
> 2. SpecifiedWindowFrame (3):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L216-L231
> 3. checkBoundary (2):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L264-L269
> 4. FrameLessOffsetWindowFunction (1):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L424
[jira] [Commented] (SPARK-40357) Migrate window type check failures onto error classes
[ https://issues.apache.org/jira/browse/SPARK-40357?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17609022#comment-17609022 ]

Apache Spark commented on SPARK-40357:
--------------------------------------

User 'lvshaokang' has created a pull request for this issue:
https://github.com/apache/spark/pull/37986

> Migrate window type check failures onto error classes
> -----------------------------------------------------
>
> Key: SPARK-40357
> URL: https://issues.apache.org/jira/browse/SPARK-40357
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.4.0
> Reporter: Max Gekk
> Priority: Major
>
> Replace TypeCheckFailure by DataTypeMismatch in type checks in window
> expressions:
> 1. WindowSpecDefinition (4):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L68-L85
> 2. SpecifiedWindowFrame (3):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L216-L231
> 3. checkBoundary (2):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L264-L269
> 4. FrameLessOffsetWindowFunction (1):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/windowExpressions.scala#L424
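For context, the migration the issue describes swaps free-form `TypeCheckResult.TypeCheckFailure` messages for structured `DataTypeMismatch` results that carry an error class and named message parameters. A rough before/after sketch in Scala; the error sub-class name and parameter keys below are illustrative, not necessarily what the eventual PR uses:

```scala
// Before: an unstructured, free-form failure message
TypeCheckResult.TypeCheckFailure(
  s"Window frame bounds $lower and $upper do not have the same data type")

// After: a structured error class with named message parameters
DataTypeMismatch(
  errorSubClass = "SPECIFIED_WINDOW_FRAME_DIFF_TYPES", // illustrative sub-class name
  messageParameters = Map(
    "lower" -> toSQLExpr(lower),
    "upper" -> toSQLExpr(upper)))
```

The structured form lets the error message template live in Spark's error-class JSON registry instead of being interpolated inline, which is the point of the parent umbrella task.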
[jira] [Assigned] (SPARK-40544) The file size of `sql/hive/target/unit-tests.log` is too big
[ https://issues.apache.org/jira/browse/SPARK-40544?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang reassigned SPARK-40544:
-----------------------------------
Assignee: Yang Jie

> The file size of `sql/hive/target/unit-tests.log` is too big
> ------------------------------------------------------------
>
> Key: SPARK-40544
> URL: https://issues.apache.org/jira/browse/SPARK-40544
> Project: Spark
> Issue Type: Improvement
> Components: SQL, Tests
> Affects Versions: 3.4.0
> Reporter: Yang Jie
> Assignee: Yang Jie
> Priority: Minor
>
> SPARK-6908 lowered the file-appender log level threshold of the Hive UTs from
> INFO to DEBUG, but didn't explain why.
>
> When I run
> {code:java}
> mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive {code}
> the size of the whole Spark directory is about 22G, and
> `sql/hive/target/unit-tests.log` alone is 12G. The DEBUG-level logs in the
> file seem worthless, yet they take up a lot of disk space.
>
> {code:java}
> # Set the logger level of File Appender to WARN
> log4j.appender.FA.Threshold = DEBUG {code}
> The original comment on this config says {{Set the logger level of File
> Appender to WARN}}, but the WARN level was never actually used.
[jira] [Resolved] (SPARK-40544) The file size of `sql/hive/target/unit-tests.log` is too big
[ https://issues.apache.org/jira/browse/SPARK-40544?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang resolved SPARK-40544.
---------------------------------
Fix Version/s: 3.4.0
Resolution: Fixed

Issue resolved by pull request 37976
[https://github.com/apache/spark/pull/37976]

> The file size of `sql/hive/target/unit-tests.log` is too big
> ------------------------------------------------------------
>
> Key: SPARK-40544
> URL: https://issues.apache.org/jira/browse/SPARK-40544
> Project: Spark
> Issue Type: Improvement
> Components: SQL, Tests
> Affects Versions: 3.4.0
> Reporter: Yang Jie
> Assignee: Yang Jie
> Priority: Minor
> Fix For: 3.4.0
>
> SPARK-6908 lowered the file-appender log level threshold of the Hive UTs from
> INFO to DEBUG, but didn't explain why.
>
> When I run
> {code:java}
> mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive {code}
> the size of the whole Spark directory is about 22G, and
> `sql/hive/target/unit-tests.log` alone is 12G. The DEBUG-level logs in the
> file seem worthless, yet they take up a lot of disk space.
>
> {code:java}
> # Set the logger level of File Appender to WARN
> log4j.appender.FA.Threshold = DEBUG {code}
> The original comment on this config says {{Set the logger level of File
> Appender to WARN}}, but the WARN level was never actually used.
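Presumably the fix amounts to a one-line threshold change in the test logging configuration, so the file appender matches its own comment. In log4j 1.x-style properties that would look like the following; the exact file touched and the level chosen by the PR may differ:

```properties
# Set the logger level of File Appender to WARN
log4j.appender.FA.Threshold = WARN
```

Raising the threshold from DEBUG to WARN drops the bulk of the 12G of per-test chatter while keeping warnings and errors for post-mortem debugging.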
[jira] [Commented] (SPARK-40548) Upgrade rocksdbjni from 7.5.3 to 7.6.0
[ https://issues.apache.org/jira/browse/SPARK-40548?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17609002#comment-17609002 ]

Apache Spark commented on SPARK-40548:
--------------------------------------

User 'panbingkun' has created a pull request for this issue:
https://github.com/apache/spark/pull/37985

> Upgrade rocksdbjni from 7.5.3 to 7.6.0
> --------------------------------------
>
> Key: SPARK-40548
> URL: https://issues.apache.org/jira/browse/SPARK-40548
> Project: Spark
> Issue Type: Improvement
> Components: Build
> Affects Versions: 3.4.0
> Reporter: BingKun Pan
> Priority: Minor
>
> Release Notes:
> https://github.com/facebook/rocksdb/releases/tag/v7.6.0
[jira] [Assigned] (SPARK-40548) Upgrade rocksdbjni from 7.5.3 to 7.6.0
[ https://issues.apache.org/jira/browse/SPARK-40548?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-40548:
------------------------------------
Assignee: Apache Spark

> Upgrade rocksdbjni from 7.5.3 to 7.6.0
> --------------------------------------
>
> Key: SPARK-40548
> URL: https://issues.apache.org/jira/browse/SPARK-40548
> Project: Spark
> Issue Type: Improvement
> Components: Build
> Affects Versions: 3.4.0
> Reporter: BingKun Pan
> Assignee: Apache Spark
> Priority: Minor
>
> Release Notes:
> https://github.com/facebook/rocksdb/releases/tag/v7.6.0
[jira] [Assigned] (SPARK-40548) Upgrade rocksdbjni from 7.5.3 to 7.6.0
[ https://issues.apache.org/jira/browse/SPARK-40548?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-40548:
------------------------------------
Assignee: (was: Apache Spark)

> Upgrade rocksdbjni from 7.5.3 to 7.6.0
> --------------------------------------
>
> Key: SPARK-40548
> URL: https://issues.apache.org/jira/browse/SPARK-40548
> Project: Spark
> Issue Type: Improvement
> Components: Build
> Affects Versions: 3.4.0
> Reporter: BingKun Pan
> Priority: Minor
>
> Release Notes:
> https://github.com/facebook/rocksdb/releases/tag/v7.6.0
[jira] [Created] (SPARK-40548) Upgrade rocksdbjni from 7.5.3 to 7.6.0
BingKun Pan created SPARK-40548:
-----------------------------------

Summary: Upgrade rocksdbjni from 7.5.3 to 7.6.0
Key: SPARK-40548
URL: https://issues.apache.org/jira/browse/SPARK-40548
Project: Spark
Issue Type: Improvement
Components: Build
Affects Versions: 3.4.0
Reporter: BingKun Pan

Release Notes:
https://github.com/facebook/rocksdb/releases/tag/v7.6.0
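Mechanically, a dependency bump like this is a one-line version change in the Maven build. The Maven coordinates below are the library's standard ones; where exactly the version is declared in Spark's build (parent pom property vs. inline) may differ:

```xml
<!-- Bump the RocksDB JNI dependency from 7.5.3 to 7.6.0 -->
<dependency>
  <groupId>org.rocksdb</groupId>
  <artifactId>rocksdbjni</artifactId>
  <version>7.6.0</version>
</dependency>
```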
[jira] [Assigned] (SPARK-40322) Fix all dead links
[ https://issues.apache.org/jira/browse/SPARK-40322?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon reassigned SPARK-40322:
------------------------------------
Assignee: Yuming Wang

> Fix all dead links
> ------------------
>
> Key: SPARK-40322
> URL: https://issues.apache.org/jira/browse/SPARK-40322
> Project: Spark
> Issue Type: Bug
> Components: Documentation
> Affects Versions: 3.4.0
> Reporter: Yuming Wang
> Assignee: Yuming Wang
> Priority: Major
>
> [https://www.deadlinkchecker.com/website-dead-link-checker.asp]
>
> ||Status||URL||Source link text||
> |-1 Not found: The server name or address could not be resolved|[http://engineering.ooyala.com/blog/using-parquet-and-scrooge-spark]|[Using Parquet and Scrooge with Spark|https://spark.apache.org/documentation.html]|
> |-1 Not found: The server name or address could not be resolved|[http://blinkdb.org/]|[BlinkDB|https://spark.apache.org/third-party-projects.html]|
> |404 Not Found|[https://github.com/AyasdiOpenSource/df]|[DF|https://spark.apache.org/third-party-projects.html]|
> |-1 Timeout|[https://atp.io/]|[atp|https://spark.apache.org/powered-by.html]|
> |-1 Not found: The server name or address could not be resolved|[http://www.sehir.edu.tr/en/]|[Istanbul Sehir University|https://spark.apache.org/powered-by.html]|
> |404 Not Found|[http://nsn.com/]|[Nokia Solutions and Networks|https://spark.apache.org/powered-by.html]|
> |-1 Not found: The server name or address could not be resolved|[http://www.nubetech.co/]|[Nube Technologies|https://spark.apache.org/powered-by.html]|
> |-1 Timeout|[http://ooyala.com/]|[Ooyala, Inc.|https://spark.apache.org/powered-by.html]|
> |-1 Not found: The server name or address could not be resolved|[http://engineering.ooyala.com/blog/fast-spark-queries-memory-datasets]|[Spark for Fast Queries|https://spark.apache.org/powered-by.html]|
> |-1 Not found: The server name or address could not be resolved|[http://www.sisa.samsung.com/]|[Samsung Research America|https://spark.apache.org/powered-by.html]|
> |-1 Timeout|[https://checker.apache.org/projs/spark.html]|[https://checker.apache.org/projs/spark.html|https://spark.apache.org/release-process.html]|
> |404 Not Found|[https://ampcamp.berkeley.edu/amp-camp-two-strata-2013/]|[AMP Camp 2 [302 from http://ampcamp.berkeley.edu/amp-camp-two-strata-2013/]|https://spark.apache.org/documentation.html]|
> |404 Not Found|[https://ampcamp.berkeley.edu/agenda-2012/]|[AMP Camp 1 [302 from http://ampcamp.berkeley.edu/agenda-2012/]|https://spark.apache.org/documentation.html]|
> |404 Not Found|[https://ampcamp.berkeley.edu/4/]|[AMP Camp 4 [302 from http://ampcamp.berkeley.edu/4/]|https://spark.apache.org/documentation.html]|
> |404 Not Found|[https://ampcamp.berkeley.edu/3/]|[AMP Camp 3 [302 from http://ampcamp.berkeley.edu/3/]|https://spark.apache.org/documentation.html]|
> |-500 Internal Server Error-|-[https://www.packtpub.com/product/spark-cookbook/9781783987061]-|-[Spark Cookbook [301 from https://www.packtpub.com/big-data-and-business-intelligence/spark-cookbook]|https://spark.apache.org/documentation.html]-|
> |-500 Internal Server Error-|-[https://www.packtpub.com/product/apache-spark-graph-processing/9781784391805]-|-[Apache Spark Graph Processing [301 from https://www.packtpub.com/big-data-and-business-intelligence/apache-spark-graph-processing]|https://spark.apache.org/documentation.html]-|
> |500 Internal Server Error|[https://prevalentdesignevents.com/sparksummit/eu17/]|[register|https://spark.apache.org/news/]|
> |500 Internal Server Error|[https://prevalentdesignevents.com/sparksummit/ss17/?_ga=1.211902866.780052874.1433437196]|[register|https://spark.apache.org/news/]|
> |500 Internal Server Error|[https://www.prevalentdesignevents.com/sparksummit2015/europe/registration.aspx?source=header]|[register|https://spark.apache.org/news/]|
> |500 Internal Server Error|[https://www.prevalentdesignevents.com/sparksummit2015/europe/speaker/]|[Spark Summit Europe|https://spark.apache.org/news/]|
> |-1 Timeout|[http://strataconf.com/strata2013]|[Strata|https://spark.apache.org/news/]|
> |-1 Not found: The server name or address could not be resolved|[http://blog.quantifind.com/posts/spark-unit-test/]|[Unit testing with Spark|https://spark.apache.org/news/]|
> |-1 Not found: The server name or address could not be resolved|[http://blog.quantifind.com/posts/logging-post/]|[Configuring Spark's logs|https://spark.apache.org/news/]|
> |-1 Timeout|[http://strata.oreilly.com/2012/08/seven-reasons-why-i-like-spark.html]|[Spark|https://spark.apache.org/news/]|
> |-1 Timeout|[http://strata.oreilly.com/2012/11/shark-real-time-queries-and-analytics-for-big-data.html]|[Shark|https://spark.apache.org/news/]|
> |-1 Timeout|[htt
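The table above comes from an online checker, but the same three-way classification it reports (an HTTP status code, `-1 Timeout`, or `-1 Not found` for DNS failures) can be approximated with the Python standard library. A sketch; a real checker would also want redirect handling, retries, and rate limiting:

```python
import socket
import urllib.error
import urllib.request


def check_link(url: str, timeout: float = 10.0) -> str:
    """Classify a URL with the same status labels as the dead-link report."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return f"{resp.status} OK"
    except urllib.error.HTTPError as e:
        # The server answered, but with an error code (404, 500, ...).
        return f"{e.code} {e.reason}"
    except urllib.error.URLError as e:
        # No HTTP response at all: timeout, DNS failure, refused connection.
        if isinstance(e.reason, (socket.timeout, TimeoutError)):
            return "-1 Timeout"
        return "-1 Not found: The server name or address could not be resolved"
```

For example, `check_link("http://blinkdb.org/")` would reproduce the `-1 Not found` row above as long as that hostname still fails to resolve.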
[jira] [Resolved] (SPARK-40322) Fix all dead links
[ https://issues.apache.org/jira/browse/SPARK-40322?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-40322.
----------------------------------
Fix Version/s: 3.3.1
Resolution: Fixed

Issue resolved by pull request 37984
[https://github.com/apache/spark/pull/37984]

> Fix all dead links
> ------------------
>
> Key: SPARK-40322
> URL: https://issues.apache.org/jira/browse/SPARK-40322
> Project: Spark
> Issue Type: Bug
> Components: Documentation
> Affects Versions: 3.4.0
> Reporter: Yuming Wang
> Assignee: Yuming Wang
> Priority: Major
> Fix For: 3.3.1
>
> [https://www.deadlinkchecker.com/website-dead-link-checker.asp]
>
> ||Status||URL||Source link text||
> |-1 Not found: The server name or address could not be resolved|[http://engineering.ooyala.com/blog/using-parquet-and-scrooge-spark]|[Using Parquet and Scrooge with Spark|https://spark.apache.org/documentation.html]|
> |-1 Not found: The server name or address could not be resolved|[http://blinkdb.org/]|[BlinkDB|https://spark.apache.org/third-party-projects.html]|
> |404 Not Found|[https://github.com/AyasdiOpenSource/df]|[DF|https://spark.apache.org/third-party-projects.html]|
> |-1 Timeout|[https://atp.io/]|[atp|https://spark.apache.org/powered-by.html]|
> |-1 Not found: The server name or address could not be resolved|[http://www.sehir.edu.tr/en/]|[Istanbul Sehir University|https://spark.apache.org/powered-by.html]|
> |404 Not Found|[http://nsn.com/]|[Nokia Solutions and Networks|https://spark.apache.org/powered-by.html]|
> |-1 Not found: The server name or address could not be resolved|[http://www.nubetech.co/]|[Nube Technologies|https://spark.apache.org/powered-by.html]|
> |-1 Timeout|[http://ooyala.com/]|[Ooyala, Inc.|https://spark.apache.org/powered-by.html]|
> |-1 Not found: The server name or address could not be resolved|[http://engineering.ooyala.com/blog/fast-spark-queries-memory-datasets]|[Spark for Fast Queries|https://spark.apache.org/powered-by.html]|
> |-1 Not found: The server name or address could not be resolved|[http://www.sisa.samsung.com/]|[Samsung Research America|https://spark.apache.org/powered-by.html]|
> |-1 Timeout|[https://checker.apache.org/projs/spark.html]|[https://checker.apache.org/projs/spark.html|https://spark.apache.org/release-process.html]|
> |404 Not Found|[https://ampcamp.berkeley.edu/amp-camp-two-strata-2013/]|[AMP Camp 2 [302 from http://ampcamp.berkeley.edu/amp-camp-two-strata-2013/]|https://spark.apache.org/documentation.html]|
> |404 Not Found|[https://ampcamp.berkeley.edu/agenda-2012/]|[AMP Camp 1 [302 from http://ampcamp.berkeley.edu/agenda-2012/]|https://spark.apache.org/documentation.html]|
> |404 Not Found|[https://ampcamp.berkeley.edu/4/]|[AMP Camp 4 [302 from http://ampcamp.berkeley.edu/4/]|https://spark.apache.org/documentation.html]|
> |404 Not Found|[https://ampcamp.berkeley.edu/3/]|[AMP Camp 3 [302 from http://ampcamp.berkeley.edu/3/]|https://spark.apache.org/documentation.html]|
> |-500 Internal Server Error-|-[https://www.packtpub.com/product/spark-cookbook/9781783987061]-|-[Spark Cookbook [301 from https://www.packtpub.com/big-data-and-business-intelligence/spark-cookbook]|https://spark.apache.org/documentation.html]-|
> |-500 Internal Server Error-|-[https://www.packtpub.com/product/apache-spark-graph-processing/9781784391805]-|-[Apache Spark Graph Processing [301 from https://www.packtpub.com/big-data-and-business-intelligence/apache-spark-graph-processing]|https://spark.apache.org/documentation.html]-|
> |500 Internal Server Error|[https://prevalentdesignevents.com/sparksummit/eu17/]|[register|https://spark.apache.org/news/]|
> |500 Internal Server Error|[https://prevalentdesignevents.com/sparksummit/ss17/?_ga=1.211902866.780052874.1433437196]|[register|https://spark.apache.org/news/]|
> |500 Internal Server Error|[https://www.prevalentdesignevents.com/sparksummit2015/europe/registration.aspx?source=header]|[register|https://spark.apache.org/news/]|
> |500 Internal Server Error|[https://www.prevalentdesignevents.com/sparksummit2015/europe/speaker/]|[Spark Summit Europe|https://spark.apache.org/news/]|
> |-1 Timeout|[http://strataconf.com/strata2013]|[Strata|https://spark.apache.org/news/]|
> |-1 Not found: The server name or address could not be resolved|[http://blog.quantifind.com/posts/spark-unit-test/]|[Unit testing with Spark|https://spark.apache.org/news/]|
> |-1 Not found: The server name or address could not be resolved|[http://blog.quantifind.com/posts/logging-post/]|[Configuring Spark's logs|https://spark.apache.org/news/]|
> |-1 Timeout|[http://strata.oreilly.com/2012/08/seven-reasons-why-i-like-spark.html]|[Spark|https://spark.apache.org/news/]|
> |-1 Timeout|[http://strata.oreil