[ https://issues.apache.org/jira/browse/SPARK-20922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16112600#comment-16112600 ]

Aditya Sharad commented on SPARK-20922:
---------------------------------------

Apologies for the delay in getting back to you. I believe we first got in touch 
privately to report this, but in future we'll discuss the details and fix on 
private@ first if that fits better into your workflow.

The scope is indeed limited to attacks from local users, and the issue has now 
been publicly disclosed. However, I would argue that neither of these points 
disqualifies the vulnerability reported here from having a CVE assigned.

Depending on the configuration and the intentions of an attacker, the 
repercussions of this vulnerability are potentially extremely severe despite 
the limited scope:
- The worst case is obviously when Spark runs as an administrative user.
- In the more common case, Spark runs under a user account that is also 
responsible for other services (such as Hadoop HDFS), and the repercussions can 
be very severe. This is the situation in the default Cloudera setup, for 
example. In that particular scenario, an attacker can cause a widespread outage 
simply by wiping all data belonging to the 'hdfs' user. The repercussions reach 
far beyond Spark itself.
- In the 'best' case, Spark is set up with a dedicated user account. Here we're 
looking at a denial of service against Spark specifically, with a severe risk 
of data loss: an attacker can stop the service and wipe all of Spark's data.

We have seen CVEs assigned for significantly less severe vulnerabilities. The 
prime reasons for doing so are to advise users and to maintain a 
visible record of the issue that isn't project-specific, which I think would be 
appropriate in this case.

Please let me know if there's anything I can help with. I am willing to file 
separately for the CVE if that is easier, but I do not wish to do so without 
first having your agreement and finding out if Spark has a preferred CVE route. 
If you'd like to discuss this further off-list, please feel free to contact me 
on [email protected].

> Unsafe deserialization in Spark LauncherConnection
> --------------------------------------------------
>
>                 Key: SPARK-20922
>                 URL: https://issues.apache.org/jira/browse/SPARK-20922
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.1.1
>            Reporter: Aditya Sharad
>            Assignee: Marcelo Vanzin
>              Labels: security
>             Fix For: 2.0.3, 2.1.2, 2.2.0, 2.3.0
>
>         Attachments: spark-deserialize-master.zip
>
>
> The {{run()}} method of the class 
> {{org.apache.spark.launcher.LauncherConnection}} performs unsafe 
> deserialization of data received by its socket. This makes Spark applications 
> launched programmatically using the {{SparkLauncher}} framework potentially 
> vulnerable to remote code execution by an attacker with access to any user 
> account on the local machine. Such an attacker could send a malicious 
> serialized Java object to a range of ports on the local machine; if one of 
> those ports matches the one chosen (at random) by the Spark launcher, the 
> malicious object will be deserialized. By making use of gadget chains in code present 
> on the Spark application classpath, the deserialization process can lead to 
> RCE or privilege escalation.
> This vulnerability is identified by the “Unsafe deserialization” rule on 
> lgtm.com:
> https://lgtm.com/projects/g/apache/spark/snapshot/80fdc2c9d1693f5b3402a79ca4ec76f6e422ff13/files/launcher/src/main/java/org/apache/spark/launcher/LauncherConnection.java#V58
>  
> Attached is a proof-of-concept exploit involving a simple 
> {{SparkLauncher}}-based application and a known gadget chain in the Apache 
> Commons Beanutils library referenced by Spark.
> See the readme file for demonstration instructions.
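For readers unfamiliar with the pattern described above: the root issue is calling ObjectInputStream.readObject() on untrusted socket data, which lets any class on the classpath be instantiated during deserialization. A common mitigation is to override resolveClass() and reject any class not on an explicit allow-list, so gadget-chain classes are refused before instantiation. The sketch below is a generic illustration of that technique under assumed names (AllowListObjectInputStream is hypothetical), not the actual patch applied to Spark's LauncherConnection:

```java
import java.io.*;
import java.util.Set;

// Hypothetical sketch: an ObjectInputStream that only resolves classes
// on an explicit allow-list, rejecting everything else before it can
// be instantiated (and thus before any gadget chain can fire).
class AllowListObjectInputStream extends ObjectInputStream {
    private final Set<String> allowed;

    AllowListObjectInputStream(InputStream in, Set<String> allowed) throws IOException {
        super(in);
        this.allowed = allowed;
    }

    @Override
    protected Class<?> resolveClass(ObjectStreamClass desc)
            throws IOException, ClassNotFoundException {
        if (!allowed.contains(desc.getName())) {
            throw new InvalidClassException("Unexpected class in stream: " + desc.getName());
        }
        return super.resolveClass(desc);
    }
}

public class Demo {
    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        Set<String> allowed = Set.of("java.lang.Integer", "java.lang.Number");

        // An allowed class deserializes normally...
        Object ok = new AllowListObjectInputStream(
                new ByteArrayInputStream(serialize(42)), allowed).readObject();
        System.out.println("accepted: " + ok);

        // ...while any class not on the list is rejected up front.
        try {
            new AllowListObjectInputStream(
                    new ByteArrayInputStream(serialize(new java.util.Date())), allowed)
                    .readObject();
        } catch (InvalidClassException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

Since Java 9, the built-in ObjectInputFilter mechanism (JEP 290) provides the same protection without subclassing, but the allow-list idea is the same.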



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
