Github user ArtRand commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18837#discussion_r131986331
  
    --- Diff: docs/running-on-mesos.md ---
    @@ -479,6 +479,35 @@ See the [configuration page](configuration.html) for information on Spark config
     </tr>
     
     <tr>
    +  <td><code>spark.mesos.driver.secret.envkey</code></td>
    +  <td><code>(none)</code></td>
    +  <td>
    +    If set, the contents of the secret referenced by
    +    spark.mesos.driver.secret.name will be written to the provided
    +    environment variable in the driver's process.
    +  </td>
    +</tr>
    +<tr>
    +  <td><code>spark.mesos.driver.secret.filename</code></td>
    +  <td><code>(none)</code></td>
    +  <td>
    +    If set, the contents of the secret referenced by
    +    spark.mesos.driver.secret.name will be written to the provided
    +    file.  Relative paths are relative to the container's work
    +    directory.  Absolute paths must already exist.  Consult the Mesos Secret
    +    protobuf for more information.
    +  </td>
    +</tr>
    +<tr>
    +  <td><code>spark.mesos.driver.secret.name</code></td>
    --- End diff --
    
    Hey @susanxhuynh, so the way it works now is that you can specify a secret as a `REFERENCE` or a `VALUE` type by using the `spark.mesos.driver.secret.name` or `spark.mesos.driver.secret.value` configs, respectively. These secrets are then made file-based and/or env-based depending on the contents of `spark.mesos.driver.secret.filename` and `spark.mesos.driver.secret.envkey`. I allow for multiple secrets (but not multiple types) as comma-separated lists (like Mesos URIs).
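    For illustration, a hypothetical submission using these configs might look like the sketch below (the secret names, paths, and jar are made up; only the config keys come from the diff above):

    ```shell
    # Hypothetical example: two REFERENCE-type secrets, each exposed both as a
    # file in the container's work directory and as an environment variable.
    # The comma-separated lists are positional: the Nth name pairs with the
    # Nth filename and the Nth envkey.
    ./bin/spark-submit \
      --master mesos://master:7077 \
      --deploy-mode cluster \
      --conf spark.mesos.driver.secret.name=/db/password,/api/token \
      --conf spark.mesos.driver.secret.filename=db_password,api_token \
      --conf spark.mesos.driver.secret.envkey=DB_PASSWORD,API_TOKEN \
      my-app.jar
    ```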

