AngersZhuuuu commented on a change in pull request #26594: 
[SPARK-29957][TEST][test-java11][test-hadoop3.2] Reset MiniKDC's default 
enctypes to fit jdk8/jdk11
URL: https://github.com/apache/spark/pull/26594#discussion_r352077060
 
 

 ##########
 File path: 
external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala
 ##########
 @@ -136,6 +137,20 @@ class KafkaTestUtils(
     kdcConf.setProperty(MiniKdc.DEBUG, "true")
     kdc = new MiniKdc(kdcConf, kdcDir)
     kdc.start()
+    val krb5Conf = Source.fromFile(kdc.getKrb5conf, "UTF-8").getLines()
+    val rewriteKrb5Conf = krb5Conf.map(s => if (s.contains("libdefaults")) {
 
 Review comment:
  > OK, so let's make things simpler. I've double-checked the 2.7.4 code and the 
template is in the Hadoop library. If the Hadoop lib removes the `libdefaults` 
section (you haven't pointed to any guarantee that it will remain), will this 
test fail in the current state of the PR? If it fails, whoever has to fix it 
will spend quite some time understanding what's going on...
   
  Yeah... building Spark's own MiniKDC would give us much clearer control over 
the Kerberos config...
  Understanding what we have shown in this PR really requires knowing a lot 
about Kerberos configuration...
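  For context, the rewrite under discussion splices extra enctype settings into 
MiniKdc's generated krb5.conf right after the `[libdefaults]` section header. A 
minimal, hypothetical sketch of that approach follows; the object name, the 
helper, and the enctype values are illustrative, not the exact PR code:

```scala
// Hypothetical sketch: insert enctype settings after the [libdefaults]
// header so both JDK 8 and JDK 11 accept the tickets MiniKdc issues.
// The specific enctype values below are illustrative.
object Krb5ConfRewriteSketch {
  // Settings to splice in right after the [libdefaults] section header.
  val additionalSettings: Seq[String] = Seq(
    "    default_tkt_enctypes = aes128-cts-hmac-sha1-96",
    "    default_tgs_enctypes = aes128-cts-hmac-sha1-96"
  )

  // Note the fragility the reviewer points out: this silently does nothing
  // if no line contains "libdefaults" in the generated template.
  def rewrite(lines: Seq[String]): Seq[String] =
    lines.flatMap { line =>
      if (line.contains("libdefaults")) line +: additionalSettings
      else Seq(line)
    }

  def main(args: Array[String]): Unit = {
    val sample = Seq("[libdefaults]", "    dns_lookup_kdc = false", "[realms]")
    rewrite(sample).foreach(println)
  }
}
```

  In the real test the input would come from `Source.fromFile(kdc.getKrb5conf, 
"UTF-8")` and be written back before clients read the file, as the diff above 
suggests.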

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
