On Wed, Jul 1, 2009 at 10:10 PM, Raghu Angadi <rang...@yahoo-inc.com> wrote:

>
> -1 for committing the jar.
>
> Most of the other options proposed certainly sound better.
>
> Can build.xml be updated such that Ivy fetches recent (nightly) build?
>

This seems slightly better than actually committing the jars. However, what
should we do when the nightly build has failed its Hudson tests? We sometimes
seem to go weeks at a time without a "green" build out of Hudson.
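
For concreteness, the Ivy side of that might look roughly like this (just a
sketch; the repository URL is made up and the module/conf names are guesses):

    <!-- ivysettings.xml: resolver pointing at wherever the nightly
         artifacts would get published -->
    <resolvers>
      <url name="hadoop-nightly" checkmodified="true">
        <artifact pattern="http://repository.example.org/nightly/[organisation]/[module]/[revision]/[artifact]-[revision].[ext]"/>
      </url>
    </resolvers>

    <!-- hdfs's ivy.xml: always pull the newest published common -->
    <dependency org="org.apache.hadoop" name="hadoop-common"
                rev="latest.integration" conf="common->default"/>

Note that rev="latest.integration" just takes the newest revision Ivy can
find, which is exactly the failure mode above: if the newest nightly is
broken, everyone downstream gets the breakage.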


>
> HDFS could have a build target that builds the common jar from a
> specified source location.
>

This is still my preferred option. Whether it's done with a <javac> task
or with some kind of <subant> or even <exec>, I think having the source
trees "loosely" tied together for developers is a must.
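
As a sketch (the property and target names here are invented), hdfs's
build.xml could carry something like:

    <target name="compile-common">
      <!-- default to a sibling checkout; override with
           -Dcommon.src.dir=/path/to/hadoop-common -->
      <property name="common.src.dir" location="${basedir}/../hadoop-common"/>
      <!-- delegate to common's own build file -->
      <subant target="jar">
        <fileset dir="${common.src.dir}" includes="build.xml"/>
      </subant>
      <!-- pick up the freshly built jar for our classpath -->
      <copy todir="${build.dir}/lib">
        <fileset dir="${common.src.dir}/build" includes="*.jar"/>
      </copy>
    </target>

Anyone without a common checkout could still fall back to building against
a downloaded jar instead.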


-Todd
Todd Lipcon wrote:

> On Wed, Jul 1, 2009 at 2:10 PM, Philip Zeyliger <phi...@cloudera.com>
> wrote:
>
>> -1 to checking in jars.  It's quite a bit of bloat in the repository
>> (which
>> admittedly affects the git.apache folks more than the svn folks), but it's
>> also cumbersome to develop.
>>
>> It'd be nice to have a one-liner that builds the equivalent of the tarball
>> built by "ant binary" in the old world.  When you're working on something
>> that affects both common and hdfs, it'll be pretty painful to make the
>> jars
>> in common, move them over to hdfs, and then compile hdfs.
>>
>> Could the build.xml in hdfs call into common's build.xml and build common
>> as
>> part of building hdfs?  Or perhaps have a separate "top-level" build file
>> that builds everything?
>>
>>
> Agree with Philip here. Requiring a new jar to be checked in anywhere
> after every common commit seems neither scalable nor performant. For git
> users this will make the repository size balloon like crazy (the jar is
> 400KB and we have around 5300 commits so far, which works out to roughly
> 2GB!). For svn users it will still mean that every "svn update" requires
> a download of a new jar. Using svn externals to manage them also
> complicates things when trying to work on a cross-component patch with
> two dirty directories - you really need a symlink between your working
> directories, not a link routed through the SVN tree.
>
> I think it would be reasonable to require that developers check out a
> structure like:
>
> working-dir/
>  hadoop-common/
>  hadoop-mapred/
>  hadoop-hdfs/
>
> We can then use relative paths for the mapred->common and hdfs->common
> dependencies. Those who only work on HDFS or only work on mapred will not
> have to check out the other, but everyone will check out common.
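>
> The builds can then default to the sibling location, e.g. (property and
> jar layout here are hypothetical):
>
>   <property name="common.dir" location="${basedir}/../hadoop-common"/>
>   <path id="common.classpath">
>     <fileset dir="${common.dir}/build" includes="*.jar"/>
>   </path>
>
> with -Dcommon.dir=... as an escape hatch for anyone who keeps the trees
> somewhere else.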
>
> Whether there should also be a fourth repository (e.g. hadoop-build)
> whose build.xml ties the other build.xmls together is another open
> question IMO.
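>
> If we go that route, the tie-together build.xml could be as simple as
> (sketch only):
>
>   <project name="hadoop-all" default="all" basedir=".">
>     <target name="all">
>       <!-- order matters: build common before the projects that use it -->
>       <subant target="jar">
>         <filelist dir="..">
>           <file name="hadoop-common/build.xml"/>
>           <file name="hadoop-hdfs/build.xml"/>
>           <file name="hadoop-mapred/build.xml"/>
>         </filelist>
>       </subant>
>     </target>
>   </project>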
>
> -Todd
>
>
