(I had just finished writing this when I noticed the email from Marcos 
bringing up similar issues with the Yahoo tutorials)

I've read through the following:

http://www.mail-archive.com/mapreduce-dev@hadoop.apache.org/msg01833.html
http://www.mail-archive.com/general@hadoop.apache.org/msg04621.html
https://issues.apache.org/jira/browse/MAPREDUCE-1734
https://issues.apache.org/jira/browse/MAPREDUCE-1735
https://issues.apache.org/jira/browse/MAPREDUCE-3771

And it looks like every branch has un-deprecated the old APIs (except for 
0.22?). But from the first discussion above, it seems that several of the 
people who were okay with un-deprecating the old APIs expected them to be 
re-deprecated (and even removed) before a 1.0 release, which obviously has 
not happened. Is re-deprecating the old APIs on the horizon at all for any of 
the branches?

I started writing MapReduce code using the old APIs because the vast majority 
of examples and tutorials used them. But once I started hitting the 
deprecation warnings and found the new APIs, I began porting my code. The 
warnings were a good indication of where I should be heading, and having 
"Deprecated. Use XyzClass instead" in the javadoc was very helpful. Now, with 
these classes un-deprecated, users have no clue what they should or should 
not be using. The issue is compounded when examples in the API docs use these 
once-deprecated classes. For example, see the use of JobConf in the example 
at:

http://hadoop.apache.org/common/docs/r0.23.0/api/org/apache/hadoop/util/Tool.html
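
For comparison, here is a rough sketch of what a new-API version of that 
example might look like, using org.apache.hadoop.mapreduce.Job in place of 
JobConf. This isn't the actual javadoc text, just the familiar word-count 
pattern with class and job names of my own choosing:

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class WordCountTool extends Configured implements Tool {

  // Tokenizes each input line and emits (word, 1) pairs.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Sums the counts for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values,
        Context context) throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) {
        sum += v.get();
      }
      context.write(key, new IntWritable(sum));
    }
  }

  @Override
  public int run(String[] args) throws Exception {
    // Job.getInstance() is the 0.23 factory method; earlier releases use the
    // Job(Configuration, String) constructor instead.
    Job job = Job.getInstance(getConf(), "word count");
    job.setJarByClass(WordCountTool.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    return job.waitForCompletion(true) ? 0 : 1;
  }

  public static void main(String[] args) throws Exception {
    System.exit(ToolRunner.run(new Configuration(), new WordCountTool(), args));
  }
}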

Without either deprecation warnings or good examples using the new API, new 
Hadoop developers will continue to use the old APIs. If these classes aren't 
going to be re-deprecated any time soon, shouldn't the javadoc examples at 
least be updated to use only the new APIs, so that developers have some 
indication of where their code should be going?

-Steven Willis
