[ 
https://issues.apache.org/jira/browse/SPARK-1518?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14010937#comment-14010937
 ] 

Sean Owen commented on SPARK-1518:
----------------------------------

Re: versioning one more time: really supporting a bunch of versions may get 
costly. It's already tricky to manage two builds, times YARN-or-not, times 
Hive-or-not, times four flavors of Hadoop. I doubt the assemblies are 
problem-free in all cases yet. 

In practice it looks like one generic Hadoop 1, Hadoop 2, and CDH 4 release is 
produced, and one set of Maven artifacts. (PS: again, I am not sure Spark 
should contain a CDH-specific distribution, realizing it's really a proxy for a 
particular Hadoop combo. The same goes for a MapR profile, which is really for 
vendors to maintain.) That means that right now you can't build a Spark app 
against anything but Hadoop 1.x with Maven without installing it yourself, and 
there's no official distro for anything but two major Hadoop versions. Support 
for niche versions isn't really there or promised anyway, and fleshing out 
"support" may make doing so pretty burdensome. 

There is no suggested action here; if anything, I suggest that the right thing 
is to add Maven artifacts with classifiers, add a few binary artifacts, and 
subtract a few vendor artifacts, but that is a different action.
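To make the classifier idea concrete, a dependency on such an artifact might look like the sketch below. This is purely hypothetical: Spark does not actually publish Hadoop-specific classifiers, and the coordinates and classifier name are illustrative only.

```xml
<!-- Hypothetical: a Hadoop-2-flavored spark-core selected via a Maven
     classifier. Spark publishes no such classifier today; this only
     illustrates what the suggestion above could look like. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.0.0</version>
  <classifier>hadoop2</classifier>
</dependency>
```

A classifier lets one groupId/artifactId/version publish several variant jars side by side, which is how users could pick a Hadoop flavor without Spark multiplying its artifact coordinates.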

> Spark master doesn't compile against hadoop-common trunk
> --------------------------------------------------------
>
>                 Key: SPARK-1518
>                 URL: https://issues.apache.org/jira/browse/SPARK-1518
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Marcelo Vanzin
>            Assignee: Colin Patrick McCabe
>            Priority: Critical
>
> FSDataOutputStream::sync() has disappeared from trunk in Hadoop; 
> FileLogger.scala is calling it.
> I've changed it locally to hsync() so I can compile the code, but haven't 
> checked yet whether those are equivalent. hsync() seems to have been there 
> forever, so it hopefully works with all versions Spark cares about.
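For concreteness, here is a hedged, self-contained sketch of the sync() -> hsync() substitution described in the issue. The real types live in org.apache.hadoop.fs; the trait and stub class below are illustrative stand-ins that only mirror the shape of Hadoop's Syncable interface, not the real API.

```scala
// Self-contained sketch of the SPARK-1518 change. Hadoop's Syncable
// interface has long provided hsync(), which is the replacement for the
// sync() method that disappeared from FSDataOutputStream on trunk.
trait Syncable {
  def hflush(): Unit // flush client-side buffers out to the pipeline
  def hsync(): Unit  // like hflush(), but also requests an on-disk sync
}

// Illustrative stand-in for FSDataOutputStream, just for this sketch.
class StubOutputStream extends Syncable {
  var synced = false
  override def hflush(): Unit = ()
  override def hsync(): Unit = { synced = true }
}

// A FileLogger-style flush that calls hsync() instead of the removed sync().
def flushAndSync(stream: Syncable): Unit = stream.hsync()
```

Under this sketch, FileLogger.scala's change amounts to swapping the one method call, which is why it compiles against both old and new hadoop-common.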



--
This message was sent by Atlassian JIRA
(v6.2#6252)
