Github user srowen commented on the issue:
https://github.com/apache/spark/pull/20516
Now this no longer changes `spark.testing`, which was the main point here.
I think this should just be closed; its purpose keeps changing and is now unclear.
---
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/20516
@heary-cao I'm not sure how that addresses the question here, which is just
about cleaning up opened resources in the test code. It's not directly related
to your change, but it affects code you are changing.
Github user heary-cao commented on the issue:
https://github.com/apache/spark/pull/20516
@srowen, I think this is a facility Spark provides for port handling.
One point is that a Spark user only needs to specify the start port and the
maximum offset to try (the `spark.port.maxRetries` setting); the por…
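The start-port-plus-offset retry scheme described above can be sketched as
follows. `bindWithRetries` is a hypothetical helper for illustration only, not
Spark's actual `Utils.startServiceOnPort`:

```scala
import java.net.ServerSocket

// Sketch of the retry scheme: try startPort, then startPort + 1, ...,
// up to maxRetries offsets, and bind the first free port.
// Hypothetical helper for illustration, not Spark's real implementation.
def bindWithRetries(startPort: Int, maxRetries: Int): (ServerSocket, Int) = {
  for (offset <- 0 to maxRetries) {
    val tryPort = startPort + offset
    try {
      val socket = new ServerSocket(tryPort)
      return (socket, tryPort)
    } catch {
      case _: java.io.IOException => // port in use; try the next offset
    }
  }
  throw new java.io.IOException(
    s"Could not bind to any port in [$startPort, ${startPort + maxRetries}]")
}
```

With this shape, a second caller asking for the same start port simply lands on
the next free offset instead of failing.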
Github user heary-cao commented on the issue:
https://github.com/apache/spark/pull/20516
I tried the test case with Maven (using the command line), and it passes. Thanks.
Then, should we add System.setProperty("spark.testing", "true") in
SparkFunSuite to solve the IDE test tool problem? Be…
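The proposal above — setting `spark.testing` from the test harness itself so IDE
runs behave like SBT/Maven runs — could be sketched like this. `TestSettings` and
`ensureSparkTesting` are hypothetical names for illustration; the real
SparkFunSuite does much more:

```scala
// Sketch of the proposal: set the spark.testing system property once,
// before any suite runs, so IDE-launched tests see the same flag that
// SBT/Maven pass via -Dspark.testing=1. Hypothetical helper names.
object TestSettings {
  def ensureSparkTesting(): Unit = {
    if (System.getProperty("spark.testing") == null) {
      System.setProperty("spark.testing", "true")
    }
  }
}
```

A base test class would call `TestSettings.ensureSparkTesting()` in its setup so
every suite, however launched, sees a consistent value.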
Github user cloud-fan commented on the issue:
https://github.com/apache/spark/pull/20516
Can you try with SBT (using the command line)? Usually we don't trust test
results from the IDE.
---
-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
Github user heary-cao commented on the issue:
https://github.com/apache/spark/pull/20516
Sure.
Operating environment: the IDEA test tool.
Test case: test("can bind to a specific port")
Test code:
val maxRetries = portMaxRetries(conf)
println("maxRetries:" + maxRetries) …
Github user cloud-fan commented on the issue:
https://github.com/apache/spark/pull/20516
Are you sure `./project/SparkBuild.scala:795: javaOptions in Test +=
"-Dspark.testing=1"` only affects the non-test code path? If so, we have a lot
of places to fix.
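For context on the setting being quoted: `javaOptions in Test` is an SBT key
whose values are passed only to the JVMs SBT forks for the `test` task, which is
why a test launched directly from an IDE would not see the property. A minimal
build.sbt sketch of the mechanism (not Spark's actual build file):

```scala
// build.sbt sketch: the -D property reaches only JVMs that SBT forks
// for `test`, so IDE-launched test runs never see spark.testing.
fork in Test := true
javaOptions in Test += "-Dspark.testing=1"
```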
---
Github user heary-cao commented on the issue:
https://github.com/apache/spark/pull/20516
@cloud-fan thank you for the suggestion.
`./project/SparkBuild.scala:795: javaOptions in Test +=
"-Dspark.testing=1"` seems to take effect only when tests are run through
Spark's build; it has no effect on SparkFunSuite …
Github user heary-cao commented on the issue:
https://github.com/apache/spark/pull/20516
@jiangxb1987 thank you for reviewing it.
First, the `100` in the revised `(expectedPort + 100)` is only valid after
spark.testing has been set.
Second, it adds some boundary tests, su…
Github user jiangxb1987 commented on the issue:
https://github.com/apache/spark/pull/20516
Does this PR add any value or fix any bugs?
---
Github user heary-cao commented on the issue:
https://github.com/apache/spark/pull/20516
@srowen,
if you don't set spark.testing to true, the default value of
spark.port.maxRetries is not 100 but 16,
so in the verifyServicePort function,
actualPort should be <= (expectedPort + 16)…
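The 16-versus-100 default described above can be sketched as follows. Here
`conf` is a plain `Map` standing in for SparkConf, so this is an illustration of
the described behavior rather than Spark's actual code:

```scala
// Sketch of the maxRetries resolution described in the comment above:
// an explicit spark.port.maxRetries wins; otherwise the default is 100
// when spark.testing is set, and 16 in normal runs. A plain Map stands
// in for SparkConf.
def portMaxRetries(conf: Map[String, String]): Int = {
  val explicit = conf.get("spark.port.maxRetries").map(_.toInt)
  if (conf.contains("spark.testing")) explicit.getOrElse(100)
  else explicit.getOrElse(16)
}
```

Under this logic, a test that asserts `actualPort <= expectedPort + 100` only
holds when spark.testing is set; without it, the bound is expectedPort + 16.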
Github user heary-cao commented on the issue:
https://github.com/apache/spark/pull/20516
@srowen please review it again. Thanks.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/20516
Can one of the admins verify this patch?
---