GitHub user shaolinliu opened a pull request: https://github.com/apache/spark/pull/17581
[SPARK-20248][SQL] Spark SQL: add a limit parameter to enhance reliability.

## What changes were proposed in this pull request?

Add a parameter "spark.sql.thriftServer.retainedResults" with a default value of 200. When a user runs a query without a LIMIT clause, this implicitly adds a limit to the query. When the user runs a query that already has a LIMIT, nothing is changed. If this parameter is set to 0, nothing is changed either.

## How was this patch tested?

Manual tests.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/shaolinliu/spark SPARK-20248

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/17581.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #17581

----

commit b47cf92fee79a57fbec37bbf9c7d35753f5c7d75
Author: liu shaolin <liu.shaol...@zte.com.cn>
Date:   2017-04-07T05:25:21Z

    modified: sql/core/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
    modified: sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala

commit f556bf64b46b40beb58872ebb2fafa69a97c43ec
Author: liu shaolin <liu.shaol...@zte.com.cn>
Date:   2017-04-07T08:19:04Z

    repair the compile error.

commit 81cd6333164b0915a0fb492a34ab84b973f7cd0d
Author: liu shaolin <liu.shaol...@zte.com.cn>
Date:   2017-04-07T09:22:30Z

    modify the doc.

commit 59a1b1ae96809076916849c9a1d396dc7d40251d
Author: liu shaolin <liu.shaol...@zte.com.cn>
Date:   2017-04-10T02:01:03Z

    Merge branch 'liu' into SPARK-20248

----
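The proposed behavior can be sketched as a small Scala decision rule. This is a hypothetical illustration of the semantics described in the PR text (implicit cap only when no user LIMIT exists and the parameter is non-zero), not the actual patch; the object and method names here are invented for the example.

```scala
// Hypothetical sketch of the described semantics; not code from the patch itself.
object ImplicitLimit {
  // retainedResults: the assumed value of spark.sql.thriftServer.retainedResults
  // (default 200 per the PR description; 0 disables the feature).
  def applyLimit[T](hasUserLimit: Boolean,
                    retainedResults: Int,
                    rows: Iterator[T]): Iterator[T] = {
    if (hasUserLimit || retainedResults == 0) {
      rows // the user set a LIMIT, or the feature is disabled: do nothing
    } else {
      rows.take(retainedResults) // otherwise cap the result set implicitly
    }
  }
}
```

For example, with the default of 200 a 500-row result without a user LIMIT would be truncated to 200 rows, while the same query with an explicit LIMIT, or with the parameter set to 0, would pass through unchanged.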