[ https://issues.apache.org/jira/browse/SPARK-19079?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Derek M Miller updated SPARK-19079:
-----------------------------------
    Description: 
Currently, there seems to be an issue when using SASL in Spark on YARN, at least 
with 1.6.1. I wrote up a Stack Overflow question with the exact details of my 
configuration here: 
http://stackoverflow.com/questions/41453588/spark-sasl-not-working-on-the-emr-with-yarn

In short, I have added the spark.authenticate parameter to both the Hadoop and 
Spark configurations, and I have also set these parameters:

```
      "spark.authenticate.enableSaslEncryption": "true",
      "spark.network.sasl.serverAlwaysEncrypt": "true"
```
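
For reference, here is a minimal sketch of what I mean by setting it on both 
sides. The property names come from the Spark 1.6 security docs; the choice of 
files (spark-defaults.conf for the Spark side, yarn-site.xml on the NodeManagers 
for the Hadoop side, so the external shuffle service picks it up) is my reading 
of the docs, not something I have verified:

```
# spark-defaults.conf (Spark side)
spark.authenticate                       true
spark.authenticate.enableSaslEncryption  true
spark.network.sasl.serverAlwaysEncrypt   true
```

```
<!-- yarn-site.xml on the NodeManagers (Hadoop side) -- my assumption about
     where the Hadoop-side spark.authenticate setting belongs -->
<property>
  <name>spark.authenticate</name>
  <value>true</value>
</property>
```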

However, I am consistently getting this error message:

```
java.lang.RuntimeException: java.lang.IllegalArgumentException: Unknown message type: -22
```

Further debugging has not been helpful. I think it is worth noting that this is 
all on Amazon's EMR as well. As a side note, even if this is not a bug, I think 
it would at the very least be worth updating the docs. The docs make it seem 
like you only need to add spark.authenticate to the Spark config, when it 
apparently needs to be set in the Hadoop configuration as well.


  was:
Currently, there seems to be an issue when using SASL in Spark on YARN, at least 
with 1.6.1. I wrote up a Stack Overflow question with the exact details of my 
configuration here: 
http://stackoverflow.com/questions/41453588/spark-sasl-not-working-on-the-emr-with-yarn

In short, I have added the spark.authenticate parameter to both the Hadoop and 
Spark configurations, and I have also set these parameters:

```
      "spark.authenticate.enableSaslEncryption": "true",
      "spark.network.sasl.serverAlwaysEncrypt": "true"
```

However, I am consistently getting this error message:

```
java.lang.RuntimeException: java.lang.IllegalArgumentException: Unknown message type: -22
```

Further debugging has not been helpful. I think it is worth noting that this is 
all on Amazon's EMR as well.



> Spark 1.6.1 SASL Error with Yarn
> --------------------------------
>
>                 Key: SPARK-19079
>                 URL: https://issues.apache.org/jira/browse/SPARK-19079
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Derek M Miller



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
