On Jan 31, 2011, at 10:19 PM, Konstantin Boudnik wrote:
> On Sun, Jan 30, 2011 at 23:19, Owen O'Malley <[email protected]> wrote:
>> On Jan 30, 2011, at 7:42 PM, Nigel Daley wrote:
>>> Now that http://apache-extras.org is launched
>>> (https://blogs.apache.org/foundation/entry/the_apache_software_foundation_launches)
>>> I'd like to start a discussion on moving contrib components out of
>>> common, mapreduce, and hdfs.
>>
>> The PMC can't "move" code to Apache Extras. It can only choose to
>> abandon code that it doesn't want to support any longer. As a
>> separate action, some group of developers may create projects in
>> Apache Extras based on the code from Hadoop.
>>
>> Therefore the question is really what, if any, code Hadoop wants to
>> abandon. That is a good question and one that we should ask
>> ourselves occasionally. After a quick consideration, my personal
>> list would look like:
>>
>> failmon
>> fault injection
>
> This is the best way to kill a project as tightly coupled with the
> core code as fault injection. So, if you really want to kill it -
> then move it.

Nigel/Owen did not say "kill it". Folks were simply listing potential
projects to move out. If you feel that it should stay in, then simply
say so and give the reasons -- looks like your reason is "tight
coupling".

sanjay

>> fuse-dfs
>> hod
>> kfs
>>
>> Also note that pushing code out of Hadoop has a high cost. There are
>> at least 3 forks of the hadoop-gpl-compression code. That creates a
>> lot of confusion for the users. A lot of users never put in the work
>> to figure out which fork and branch of hadoop-gpl-compression works
>> with the version of Hadoop they installed.
>>
>> -- Owen
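
For readers outside the thread: the "tight coupling" mentioned above
comes from the way the fault injection framework is implemented, as
AspectJ aspects whose pointcuts name concrete classes and methods in
core Hadoop and are woven into them at build time. Below is a minimal
sketch of that pattern; the target (BlockReceiver.receiveBlock) and the
hard-coded probability are illustrative assumptions, since the real
framework drives injection from configuration rather than a constant.

  package org.apache.hadoop.fi;

  import java.io.IOException;
  import java.util.Random;

  // Sketch of a build-time-woven fault injection aspect. Because the
  // pointcut names a concrete method in the core HDFS source tree, the
  // aspect must be kept in lockstep with core: renaming or moving the
  // target method breaks the weave.
  public aspect BlockReceiverAspects {

    private static final Random RANDOM = new Random();

    // Chance of injecting a fault on each call. The real framework
    // reads this from configuration; a constant keeps the sketch
    // self-contained.
    private static final float FAULT_PROBABILITY = 0.01f;

    // Match executions of the (assumed) core datanode method
    // BlockReceiver.receiveBlock(..), which declares IOException.
    pointcut receiveBlock():
      execution(* org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(..));

    // Before the real method runs, fail the call with the configured
    // probability by throwing an injected IOException.
    before() throws IOException : receiveBlock() {
      if (RANDOM.nextFloat() < FAULT_PROBABILITY) {
        throw new IOException("FI: injected fault before receiveBlock");
      }
    }
  }

Compiled with ajc against the HDFS sources, an aspect like this only
works if it tracks every change to the methods it intercepts, which is
why splitting the framework into a separately-released repository is
hard.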