Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change 
notification.

The following page has been changed by DougCutting:
http://wiki.apache.org/hadoop/Release1%2e0Requirements

------------------------------------------------------------------------------
  
  ''Language Neutral - yes, it will mean duplicated client-side code and hence 
more work. From what I have observed in the design discussion, keeping the 
client side small was a criterion because we were expecting language-neutral 
protocols down the road. Do you feel that we should not bother with 
language-neutral protocols at all? --SanjayRadia''
  
+ ''I think we should be very careful about which network protocols we publicly 
expose.  Currently we expose none.  I do not think we should attempt to expose 
them all soon.  A first obvious candidate to expose might be the job submission 
protocol.  Before we do so we should closely revisit its design, since it was 
not designed as an API with long-term, multi-language access in mind.  Any 
logic that we can easily move server-side, we should, to minimize duplicated 
code.  Etc.  The HDFS protocols will require more scrutiny, since they involve 
more client-side logic.  It would be simpler if all of HDFS were implemented 
using RPC, not a mix of RPC and raw sockets.  So we might decide to delay 
publicly exposing the HDFS protocols until we have made that switch, should it 
prove feasible.  I think we could reasonably have a 1.0 release that exposed 
none.  I do not see this as a gating issue for a 1.0 release.  We could 
reasonably expose such protocols in the course of 1.x releases, no?  
 --DougCutting''
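  
  To make the "move logic server-side" point concrete, here is a minimal Java 
sketch of a thin job-submission interface. The names (JobSubmissionProtocol, 
submitJob, getJobStatus) are illustrative assumptions, not the actual Hadoop 
protocol: the client only marshals a request over RPC, while validation, input 
splitting, and scheduling stay on the server, so a second-language client has 
little logic to duplicate.
  {{{
import java.io.IOException;

/**
 * Sketch of a thin, RPC-only job submission interface.  All names
 * here are hypothetical, for illustration; only the shape matters:
 * the client sends a reference to its job configuration, and every
 * decision about the job is made server-side.
 */
public interface JobSubmissionProtocol {
  /** Submit a job; returns an opaque, server-assigned job id. */
  String submitJob(String jobFilePath) throws IOException;

  /** Poll for the state ("running", "succeeded", ...) of a job. */
  String getJobStatus(String jobId) throws IOException;
}
}}}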
  
  === Versioning Scheme - Manual or Automated ===
  Hadoop is likely to see fairly significant changes between 1.0 and 2.0. Given 
the compatibility requirements, we need some scheme (manual or automated) for 
versioning the RPC interfaces and also for versioning the data types that are 
passed as parameters over RPC.
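  
  As one possible shape for a manual scheme, the Java sketch below pairs a 
hand-maintained version number on the RPC interface with a per-record version 
byte on a parameter type. The interface, class, and field names are assumptions 
for illustration; only the Writable interface itself is Hadoop's.
  {{{
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.Writable;

// Manual versioning of the RPC interface itself: a hand-maintained
// number that client and server compare at connect time.  The name
// InterTrackerProtocol is illustrative, not a real interface here.
interface InterTrackerProtocol {
  long VERSION_ID = 5L;  // bumped by hand on any incompatible change
}

// Manual versioning of an RPC parameter type: each record writes a
// version byte first, so a newer reader can still decode records
// produced by an older writer.  TaskReport and its fields are made up.
class TaskReport implements Writable {
  private static final byte CURRENT_VERSION = 2;

  private String taskId = "";
  private float progress = 0.0f;   // field added in version 2

  public void write(DataOutput out) throws IOException {
    out.writeByte(CURRENT_VERSION);
    out.writeUTF(taskId);
    out.writeFloat(progress);
  }

  public void readFields(DataInput in) throws IOException {
    byte version = in.readByte();
    taskId = in.readUTF();
    // Read fields added later only if the writer knew about them.
    progress = (version >= 2) ? in.readFloat() : 0.0f;
  }
}
}}}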
