Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/23104
@guoxiaolongzte good job
---
-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/19887
Can one of the admins verify this patch?
---
-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/19919
@srowen
If the default is not correct, I can fix it in this PR as well.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/19919
@srowen
Where is the default value "2" of spark.executor.instances used?
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/19919
[SPARK-22727] spark.executor.instances's default value should be 2
[https://issues.apache.org/jira/browse/SPARK-22727](https://issues.apache.org/jira/browse/SPARK-22727)
## What changes
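The question in this thread is where a configured default actually applies. As a hedged, self-contained sketch (a hypothetical helper, not Spark's actual SparkConf class, though SparkConf exposes a similar getInt), this is the usual fallback pattern behind a setting like spark.executor.instances defaulting to 2:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical illustration of default-value lookup, not Spark's real code.
public class ConfDefaultSketch {
    private final Map<String, String> settings = new HashMap<>();

    public void set(String key, String value) {
        settings.put(key, value);
    }

    // Returns the configured value, or the supplied default when the key is unset.
    public int getInt(String key, int defaultValue) {
        String v = settings.get(key);
        return v == null ? defaultValue : Integer.parseInt(v);
    }

    public static void main(String[] args) {
        ConfDefaultSketch conf = new ConfDefaultSketch();
        System.out.println(conf.getInt("spark.executor.instances", 2)); // unset: 2
        conf.set("spark.executor.instances", "5");
        System.out.println(conf.getInt("spark.executor.instances", 2)); // set: 5
    }
}
```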
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/19887
[SPARK-21168] KafkaRDD should always set kafka clientId.
There are a number of other places where a client ID should be set, and I
think we should use consumer.clientId in the clientId method
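The idea can be sketched as defaulting the standard Kafka "client.id" property only when the caller has not supplied one. The helper name and default id below are illustrative assumptions, not the PR's actual code:

```java
import java.util.Properties;

// Illustrative helper (not the PR's actual code): ensure a Kafka client id is
// always present, filling in a default only when the caller has not set one.
public class ClientIdSketch {
    public static Properties withClientId(Properties props, String defaultId) {
        if (props.getProperty("client.id") == null) {
            props.setProperty("client.id", defaultId);
        }
        return props;
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        System.out.println(withClientId(props, "spark-kafka").getProperty("client.id"));
    }
}
```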
Github user liu-zhaokun closed the pull request at:
https://github.com/apache/spark/pull/19856
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/19856
@jerryshao
Yes, you are right, but this log is not accurate. I think it should log like
this: "consumerconnector has been created"; it is too amb
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/19856
@jerryshao
I think the log can't reflect the behavior of the consumer connection, because
consumer.create doesn't do any connecting; it only constructs a
ZookeeperConsumerConnector instance, so
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/19856
@srowen
Please help merge this PR as it has passed all tests.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/19856
@srowen
Thanks for your reply. Could you help me review it?
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/19856
[SPARK-22664] The logs about "Connected to Zookeeper" in
ReliableKafkaReceiver.scala are in the wrong position
[https://issues.apache.org/jira/browse/SPARK-22664](https://issues.apach
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/19377
@srowen
This hook is only for DriverRunner. I think there should be a log like this for
a normal executor when the worker shuts down
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/19377
@srowen
Sorry, I do not understand what you mean. Could you explain it again?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/19377
It says "It would be nice to add a shutdown hook here that explains
why the output is terminating. Otherwise if the worker dies the executor logs
will silently stop" in
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/19377
[SPARK-22154] add a shutdown hook that explains why the output is
terminating
It would be nice to add a shutdown hook here that explains why the output
is terminating. Otherwise if the worker
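The proposal above can be sketched with a plain JVM shutdown hook. The class name and message text here are illustrative assumptions, not the PR's actual code:

```java
// Illustrative sketch of SPARK-22154's idea: register a JVM shutdown hook that
// explains why log output is terminating, so logs do not just stop silently.
public class ShutdownHookSketch {
    static final String MESSAGE =
        "Worker is shutting down; executor log output will now stop.";

    public static void register() {
        Runtime.getRuntime().addShutdownHook(
            new Thread(() -> System.err.println(MESSAGE)));
    }

    public static void main(String[] args) {
        register();
        System.out.println("working...");
        // On JVM exit, the registered hook prints MESSAGE to stderr.
    }
}
```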
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18879
@jiangxb1987
BTW, how can I find the release branch of Hive-0.13.1a that Spark 1.4.1
was compiled with?
---
If your project is set up for it, you can reply to this email and have your
reply
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18879
Thanks for your patience. @srowen, do you agree with @jiangxb1987?
Github user liu-zhaokun commented on a diff in the pull request:
https://github.com/apache/spark/pull/18879#discussion_r132126001
--- Diff:
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLEnv.scala
---
@@ -35,7 +35,7 @@ private[hive] object
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18879
@srowen
I have modified the comment; would you like to review it?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18879
@srowen @jiangxb1987
If we can't change this logic, I think we should change the
comment, because it says "If user doesn't specify the appName, we want to get
[SparkSQL::localHos
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18879
Can one of the admins verify this patch?
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/18879
[SPARK-21662] modify the appName to [SparkSQL::localHostName] instead of
[SparkSQL::IP]
[https://issues.apache.org/jira/browse/SPARK-21662](https://issues.apache.org/jira/browse/SPARK
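The intent can be sketched with the standard java.net API: build the app name from the local host name rather than its IP. The helper method and fallback value are illustrative assumptions, not SparkSQLEnv's actual code:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

// Illustrative sketch of SPARK-21662's intent: use the local host *name*,
// not its IP address, when building the default app name.
public class AppNameSketch {
    public static String defaultAppName() {
        try {
            return "SparkSQL::" + InetAddress.getLocalHost().getHostName();
        } catch (UnknownHostException e) {
            return "SparkSQL::localhost"; // hypothetical fallback
        }
    }

    public static void main(String[] args) {
        System.out.println(defaultAppName());
    }
}
```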
Github user liu-zhaokun closed the pull request at:
https://github.com/apache/spark/pull/18842
Github user liu-zhaokun closed the pull request at:
https://github.com/apache/spark/pull/18838
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18838
@jerryshao
Thanks for your reply.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18842
@jerryshao
OK. Thanks.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18838
@srowen
Yes, I changed the logic because I think it is meaningless to attempt to create a
dir when a dir with the same name already exists.
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/18842
[SPARK-21636] Several configurations which are only used in unit tests
should be removed
[https://issues.apache.org/jira/browse/SPARK-21636](https://issues.apache.org/jira/browse/SPARK-21636
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/18838
[SPARK-21632] There is no need to make attempts for createDirectory if the
dir already exists
[https://issues.apache.org/jira/browse/SPARK-21632](https://issues.apache.org/jira/browse/SPARK
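A hedged sketch of the PR's point: return early when the directory already exists instead of burning creation attempts. The attempt limit and method shape are illustrative, not Spark's actual Utils code:

```java
import java.io.File;

// Illustrative sketch of SPARK-21632's point: skip the attempt loop entirely
// when the directory already exists.
public class CreateDirSketch {
    public static File createDirectory(File dir, int maxAttempts) {
        if (dir.isDirectory()) {
            return dir; // already exists: no attempts needed
        }
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            if (dir.mkdirs()) {
                return dir;
            }
        }
        throw new IllegalStateException(
            "Failed to create " + dir + " after " + maxAttempts + " attempts");
    }

    public static void main(String[] args) {
        File tmp = new File(System.getProperty("java.io.tmpdir"), "create-dir-sketch");
        System.out.println(createDirectory(tmp, 10).isDirectory());
    }
}
```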
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18596
OK, I will close this PR.
Github user liu-zhaokun closed the pull request at:
https://github.com/apache/spark/pull/18596
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18599
@srowen
The param "num" defined in the function spark_rotate_log() isn't used; if
users don't modify the script, it will never be invoked, so could I create a new
PR to delete
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18606
@rxin
Thx!
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18606
@rxin
Hello, did you merge this PR? I found my PR was only closed but
not merged.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18606
@srowen
But now Spark 2.2.0 has been released. We know that Scala 2.10 isn't removed
in Spark 2.2.0, so we shouldn't give the user an inaccurate message.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18596
@srowen @jiangxb1987
Hello, I have modified the PR according to your suggestions. Could you
review it again?
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/18606
[SPARK-21382] The note about Scala 2.10 in building-spark.md is wrong.
[https://issues.apache.org/jira/browse/SPARK-21382](https://issues.apache.org/jira/browse/SPARK-21382)
There should
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18599
@srowen
I think Spark provides the param, so I can pass it in the script.
Github user liu-zhaokun commented on a diff in the pull request:
https://github.com/apache/spark/pull/18599#discussion_r126847486
--- Diff: sbin/spark-daemon.sh ---
@@ -78,6 +78,12 @@ spark_rotate_log ()
if [ -n "$2" ]; then
num=$2
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18596
@srowen
I will modify it. What do you think of the other changes?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18596
@srowen
Sorry, I don't understand. Could you explain it?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18599
@srowen
If I set num to 0, no logfile should be created, or we should handle 0
specially; and if I set num to some invalid symbol, the function doesn't work
normally, right? I am sorry
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18596
@jiangxb1987
Thanks for your reply. I can make some changes according to your
suggestions. There are also other changes in this PR; do you support them?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18599
@jiangxb1987
I want to change the number of log files, so I passed the num param manually by
modifying the code in this file. In fact, this bug is there even if I don't
trigger it, right? Shouldn't we
Github user liu-zhaokun commented on a diff in the pull request:
https://github.com/apache/spark/pull/18596#discussion_r126669564
--- Diff: dev/make-distribution.sh ---
@@ -163,7 +163,7 @@ echo -e "\$ ${BUILD_COMMAND[@]}\n"
rm -rf "$DISTDIR"
mkd
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18599
@jiangxb1987
When I wanted to change the number of log files, I tested the scenario
num=0, and Spark still writes one log file. When I tested setting num to an
invalid symbol, the logic
Github user liu-zhaokun commented on a diff in the pull request:
https://github.com/apache/spark/pull/18599#discussion_r126639515
--- Diff: sbin/spark-daemon.sh ---
@@ -78,6 +78,12 @@ spark_rotate_log ()
if [ -n "$2" ]; then
num=$2
fi
+
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/18599
[SPARK-21372] spark writes one log file even if I set the number of
spark_rotate_log to 0
[https://issues.apache.org/jira/browse/SPARK-21372](https://issues.apache.org/jira/browse/SPARK-21372
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/18596
[SPARK-21371] dev/make-distribution.sh scripts use of $@ without ""
[https://issues.apache.org/jira/browse/SPARK-21371](https://issues.apache.org/jira/browse/SPARK-21371)
Github user liu-zhaokun closed the pull request at:
https://github.com/apache/spark/pull/18385
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18385
@jerryshao
Ok, I get it.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18385
@jerryshao
Sorry, my English is poor. I mean: should we change the configuration
named "spark.ssl.[namespace].port" to "spark.ssl.port"?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18385
@jerryshao
Thanks for your reply. I didn't notice "spark.ssl" had been replaced by $ns
in SSLOptions; it was my fault. But I still have a question: whether
"spark.ssl.[namespac
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/18385
[SPARK-21173] There are several configurations about SSL displayed in
configuration.md that are never used.
[https://issues.apache.org/jira/browse/SPARK-21173](https://issues.apache.org/jira
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18333
@srowen
Thanks for your support. Could you help me merge it soon?
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/18333
[SPARK-21126] The configuration named
"spark.core.connection.auth.wait.timeout" isn't used in Spark
[https://issues.apache.org/jira/browse/SPARK-21126](https://issues.
Github user liu-zhaokun closed the pull request at:
https://github.com/apache/spark/pull/18274
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/18274
[SPARK-21062] There are some minor mistakes in log
[https://issues.apache.org/jira/browse/SPARK-21062](https://issues.apache.org/jira/browse/SPARK-21062)
There are some minor mistakes
Github user liu-zhaokun closed the pull request at:
https://github.com/apache/spark/pull/18108
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18108
@HyukjinKwon
Yes, I didn't find any problems when I compiled and used it locally.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18108
@srowen
First, I think the Hive-related test failures aren't caused by my
change, right? And then, the test
"org.apache.spark.deploy.master.PersistenceEngineSuite
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18108
@srowen
Could you help me review this PR?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18108
@srowen
Hello, I have run "./test-dependencies.sh --replace-manifest" to update
the deps/* files; please review the new PR.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18108
@srowen
I have compared curator-2.6, curator-2.7, and curator-2.8 by studying the source
code and [Version evolution from 2.7 to
2.8](https://issues.apache.org/jira/secure/ReleaseNote.jspa
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18108
@srowen
I think you can test and verify your points via SparkQA's test.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18108
@srowen
I have tested it in my cluster after modifying the curator version; Spark
works normally, and the problem I encountered was resolved.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18108
@HyukjinKwon
Hello, could you help me review this PR?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18108
@srowen
Hello, could you help me review this PR?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18102
@HyukjinKwon
Hello, the test reported some failures; is it my problem?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18102
@srowen
Hello, the test reported some failures; is it my problem?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18102
@HyukjinKwon
Thanks for your reply. I made changes based on your suggestions. Please review
it soon.
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/18108
[SPARK-20884] Spark's masters will both be standby due to a bug in curator
[https://issues.apache.org/jira/browse/SPARK-20884](https://issues.apache.org/jira/browse/SPARK-20884)
I built
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18102
I found there were some mistakes, so I modified the PR again. Please review it.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18102
@srowen
Thanks for your patience. I modified it again. Please review it.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18102
@srowen
Thanks for your reply. I made changes based on your suggestions. Please review
it soon.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18102
@jerryshao
Hello, thanks for your attention. Do you have any suggestions?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18102
@jerryshao
Yes, I only want to provide a way to judge whether spark.work.dir
is being removed or has already been cleaned.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18102
@jerryshao
Hi, I think changing the log level can also make the logfile more
concise. Would this change address your concerns?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18102
@sameeragarwal
Thanks for your reply. I will make changes based on your suggestions.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18102
@jerryshao
Thanks for your reply. Do you mean that I need to streamline the content?
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/18102
[SPARK-20875] Spark should print the log when the directory has been deleted
[https://issues.apache.org/jira/browse/SPARK-20875](https://issues.apache.org/jira/browse/SPARK-20875)
When
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/17992
@srowen
Hi, do you know why this PR can't pass the tests? I don't think it's my
problem.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/17992
@srowen
Hello, do you know how to get the tests to finish?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18027
@srowen
I will try my best to solve this type of problem, but I only found one this
time. Please help me merge it. Thanks.
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/18027
[SPARK-20796] the location of start-master.sh in spark-standalone.md is
wrong
[https://issues.apache.org/jira/browse/SPARK-20796](https://issues.apache.org/jira/browse/SPARK-20796
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/17992
@srowen
The test hasn't finished; do I need to do anything?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/18013
@srowen
Hi, could you help me merge this PR?
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/17992
@srowen
Hi, could you help me merge this PR?
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/18013
[SPARK-20781] the location of the Dockerfile in docker.properties.template is
wrong
[https://issues.apache.org/jira/browse/SPARK-20781](https://issues.apache.org/jira/browse/SPARK-20781
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/17992
@srowen
Thanks for your reply. I modified the files again. Please review it.
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/17992
@srowen
Hello, I have made some changes according to your advice. Please review the
new PR.
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/17992
[SPARK-20759] SCALA_VERSION in _config.yml should be consistent with
pom.xml
[https://issues.apache.org/jira/browse/SPARK-20759](https://issues.apache.org/jira/browse/SPARK-20759
Github user liu-zhaokun closed the pull request at:
https://github.com/apache/spark/pull/17769
Github user liu-zhaokun commented on the issue:
https://github.com/apache/spark/pull/17769
@srowen
Thanks for your reply. I will close this PR.
GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/17769
[SPARK-20467] sbt-launch-lib.bash lacks the ASF header.
[https://issues.apache.org/jira/browse/SPARK-20467](https://issues.apache.org/jira/browse/SPARK-20467)
When I use this script, I