Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change 
notification.

The following page has been changed by OwenOMalley:
http://wiki.apache.org/hadoop/Release1%2e0Requirements

------------------------------------------------------------------------------
- = Requirements for Hadoop Release 1.0 =
+ = Hadoop Release 1.0 =
  
  This page is a place to collect potential requirements for a 1.0 release of 
Hadoop.
+ 
+ == What does Hadoop 1.0 include? ==
+ 
+ Question: Do we assume that all of these subprojects will go 1.0 at the same 
time? -- !DougCutting
+ 
+ == What does Hadoop 1.0 mean? ==
+ 
+    * No need for client recompilation when upgrading from 1.x to 1.y, where x 
<= y
+       * Can't remove deprecated classes or methods until 2.0
+    * Old 1.x clients can connect to new 1.y servers, where x <= y
+    * Only bug fixes in 1.x.y releases; new features only in 1.x.0 releases.
+    * New !FileSystem clients must be able to call old methods when talking to 
old servers. This will generally be done by having old methods continue to use 
old RPC methods. However, it is legal to have new implementations of old 
methods call new RPC methods, as long as the library transparently handles the 
fallback case for old servers (see the sketch below).
+ 
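+ A minimal sketch of the fallback rule in the last item above. The names 
(ClientProtocol, listPaths, listPathsDetailed) are hypothetical, not the real 
Hadoop RPC interfaces; the point is only the pattern of preferring the new RPC 
and falling back transparently on old servers:
+ 
+ {{{#!java
+ import java.io.IOException;
+ 
+ // Sketch only: ClientProtocol and its methods are illustrative names,
+ // not the actual Hadoop client/RPC interfaces.
+ interface ClientProtocol {
+   String[] listPaths(String dir) throws IOException;          // old RPC: every 1.x server has it
+   String[] listPathsDetailed(String dir) throws IOException;  // new RPC: newer servers only
+ }
+ 
+ class BackCompatibleClient {
+   private final ClientProtocol namenode;
+ 
+   BackCompatibleClient(ClientProtocol namenode) {
+     this.namenode = namenode;
+   }
+ 
+   // The old public method keeps its signature, so existing 1.x clients need
+   // no recompile. It prefers the new RPC and transparently falls back to the
+   // old RPC when the server does not implement the new one (here the missing
+   // method is assumed to surface as an IOException).
+   public String[] listPaths(String dir) throws IOException {
+     try {
+       return namenode.listPathsDetailed(dir);
+     } catch (IOException unsupportedByOldServer) {
+       return namenode.listPaths(dir);
+     }
+   }
+ }
+ }}}
+ 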
+ Question: Does the release number really matter?  Should we just keep adding 
features, improving back-compatibility, etc?  Our ["Roadmap"] currently defines 
what a major release means.  Does this need updating?  -- !DougCutting
+ 
+ == Prerequisites for Hadoop 1.0 ==
  
  Please add requirements as sections below.
  
  If you comment on someone else's requirement, please add your name next to 
your comments.
  
- Question: Does the release number really matter?  Should we just keep adding 
features, improving back-compatibility, etc?  Our ["Roadmap"] currently defines 
what a major release means.  Does this need updating?  --DougCutting
+ === New MapReduce APIs ===
  
- == New MapReduce APIs ==
+ This is [https://issues.apache.org/jira/browse/HADOOP-1230]. The new API will 
be much more future-proof; the current Map/Reduce interface needs to be 
removed.
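+ 
+ A minimal sketch of the context-object style proposed in HADOOP-1230 (the 
mapper below is illustrative only): output and status reporting go through a 
single Context parameter instead of separate OutputCollector and Reporter 
arguments, so the framework can add facilities without changing user-visible 
method signatures again.
+ 
+ {{{#!java
+ import java.io.IOException;
+ import org.apache.hadoop.io.IntWritable;
+ import org.apache.hadoop.io.LongWritable;
+ import org.apache.hadoop.io.Text;
+ import org.apache.hadoop.mapreduce.Mapper;
+ 
+ // Illustrative mapper in the new-API style: one Context object replaces the
+ // separate OutputCollector and Reporter parameters of the old interface.
+ public class WordLengthMapper
+     extends Mapper<LongWritable, Text, Text, IntWritable> {
+ 
+   @Override
+   protected void map(LongWritable key, Text value, Context context)
+       throws IOException, InterruptedException {
+     for (String word : value.toString().split("\\s+")) {
+       context.write(new Text(word), new IntWritable(word.length()));
+     }
+   }
+ }
+ }}}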
  
- This is [https://issues.apache.org/jira/browse/HADOOP-1230].
- 
- == HDFS and MapReduce split into separate projects ==
+ === HDFS and MapReduce split into separate projects ===
  
  This was agreed to in a [http://markmail.org/message/flljdfpgj65zngsf thread 
on [EMAIL PROTECTED]]. There will be three separate projects: mapred, hdfs, and 
a library of common APIs and utilities (the fs, io, conf, etc. packages).
  
- Question: Do we assume that all of these subprojects will go 1.0 at the same 
time? --DougCutting
  
- = Release 2.0 Requirements =
+ == Release 2.0 Requirements ==
  
  List here longer-range items that we'd like to do someday, but not in the 
1.0 timeframe.
  
