bowenliang123 commented on code in PR #3838:
URL: https://github.com/apache/incubator-kyuubi/pull/3838#discussion_r1031935223


##########
docs/deployment/settings.md:
##########
@@ -497,6 +497,7 @@ kyuubi.session.idle.timeout|PT6H|session idle timeout, it will be closed when it
 kyuubi.session.local.dir.allow.list||The local dir list that are allowed to access by the kyuubi session application. User might set some parameters such as `spark.files` and it will upload some local files when launching the kyuubi engine, if the local dir allow list is defined, kyuubi will check whether the path to upload is in the allow list. Note that, if it is empty, there is no limitation for that and please use absolute path list.|seq|1.6.0
 kyuubi.session.name|<undefined>|A human readable name of session and we use empty string by default. This name will be recorded in event. Note that, we only apply this value from session conf.|string|1.4.0
 kyuubi.session.timeout|PT6H|(deprecated)session timeout, it will be closed when it's not accessed for this duration|duration|1.0.0
+kyuubi.session.user.verify.enabled|true|Whether to verify the integrity of session user name in Spark engine within Authz plugin.|boolean|1.7.0

Review Comment:
   OK, changed the config name to `kyuubi.session.user.sign.enabled`.
   
   As for the other suggested configs, I don't think they are suitable to expose as configs for now:
   - algorithm
     - not suitable as a single config option, since it covers several related details, including the KeyPair generation algorithm (e.g. "EC") and the signing algorithm (e.g. "SHA256withECDSA"); see the sketch after the snippet below
     - it is used by both the server side and the engine-side Authz plugin
   - public.key / private.key
     - if put into the Spark conf, they may be shown on the Spark UI, even if redacted
     - if passed into the Spark conf via `SparkProcBuilder`, the private key could easily leak to user scripts that forge a signature for a fake session user and pass the user sign verification in the Authz plugin, for example:
   ```
   # illustration only: if the private key were readable from the Spark conf,
   # a user script could forge the signature for a fake session user
   # (the conf key and the sign() helper below are hypothetical names)
   private_key = spark.sparkContext.getConf().get("kyuubi.session.user.sign.privateKey")

   fake_user = "someone_else"
   new_sign = sign(fake_user, private_key)
   spark.sparkContext.setLocalProperty("kyuubi.session.user", fake_user)
   spark.sparkContext.setLocalProperty("kyuubi.session.user.sign", new_sign)

   # invalid access with a fake session user
   df = spark.sql("select * from confidential_table")
   ```
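
   For the algorithm point, here is a minimal sketch (plain JCA code, not Kyuubi's actual implementation) of why the key pair algorithm and the signing algorithm have to be chosen together and agreed on by both the server and the engine-side Authz plugin:

   ```
   import java.security.{KeyPairGenerator, Signature}
   import java.util.Base64

   // the key pair algorithm and the signing algorithm must match:
   // an "EC" key pair only works with an EC-based signature such as "SHA256withECDSA"
   val keyPair = KeyPairGenerator.getInstance("EC").generateKeyPair()

   // server side: sign the session user name with the private key
   val signer = Signature.getInstance("SHA256withECDSA")
   signer.initSign(keyPair.getPrivate)
   signer.update("session_user".getBytes("UTF-8"))
   val signature = Base64.getEncoder.encodeToString(signer.sign())

   // engine side (Authz plugin): verify with the matching public key
   val verifier = Signature.getInstance("SHA256withECDSA")
   verifier.initVerify(keyPair.getPublic)
   verifier.update("session_user".getBytes("UTF-8"))
   println(verifier.verify(Base64.getDecoder.decode(signature))) // true
   ```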



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

