[Hadoop Wiki] Update of "HowToSetupYourDevelopmentEnvironment" by SteveLoughran

2017-02-14 Thread Apache Wiki
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change 
notification.

The "HowToSetupYourDevelopmentEnvironment" page has been changed by 
SteveLoughran:
https://wiki.apache.org/hadoop/HowToSetupYourDevelopmentEnvironment?action=diff&rev1=34&rev2=35

Comment:
add the details on OSX install, especially protoc setup now that homebrew 1.x 
doesn't support protobuf 2.5

  This page describes how to get your environment setup and is IDE agnostic.
  
  = Requirements =
-  * Java 6 or 7
-  * Maven
+  * Java 7 or 8 (Branch 2) or Java 8 (trunk)
+  * Maven 3.3 or later
   * Your favorite IDE
+  * Protobuf 2.5.0
  
  = Setup Your Development Environment in Linux =
  
- The instructions below talk about how to get an environment setup using the 
command line to build, control source, and test.  These instructions are 
therefore IDE independent.  Take a look at EclipseEnvironment for instructions 
on how to configure Eclipse to build, control source, and test.  If you prefer 
ItelliJ IDEA, then take a look [[HadoopUnderIDEA| here]]
+ The instructions below talk about how to get an environment setup using the 
command line to build, control source, and test.  These instructions are 
therefore IDE independent.  Take a look at EclipseEnvironment for instructions 
on how to configure Eclipse to build, control source, and test.  If you prefer 
IntelliJ IDEA, then take a look [[HadoopUnderIDEA| here]]
  
-  * Choose a good place to put your code.  You will eventually use your source 
code to run Hadoop, so choose wisely. For example ~/code/hadoop.
+  * Choose a good place to put your code.  You will eventually use your source 
code to run Hadoop, so choose wisely. For example {{{~/code/hadoop}}}.
-  * Get the source.  This is documented in HowToContribute.  Put the source in 
~/code/hadoop (or whatever you chose) so that you have 
~/code/hadoop/hadoop-common
+  * Get the source.  This is documented in HowToContribute.  Put the source in 
{{{~/code/hadoop}}} (or whatever you chose) so that you have 
{{{~/code/hadoop/hadoop-common}}}
-  * cd into ''hadoop-common'', or whatever you named the directory
+  * cd into {{{hadoop-common}}}, or whatever you named the directory
-  * attempt to run ''mvn install''
+  * attempt to run {{{mvn install}}}. To build without tests: {{{mvn install 
-DskipTests}}}
   *  If you get any strange errors (other than JUnit test failures and 
errors), then consult the ''Build Errors'' section below.
   * follow GettingStartedWithHadoop to learn how to run Hadoop.
   *  If you run into any problems, refer to the ''Runtime Errors'' section 
below, along with the troubleshooting document here: TroubleShooting
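The steps above can be sketched as a shell session; the GitHub mirror URL is an assumption here (HowToContribute documents the official ways to get the source), and the directory names are just the examples used above.

```shell
# Choose a place for the code and fetch the source (GitHub mirror assumed)
mkdir -p ~/code/hadoop
cd ~/code/hadoop
git clone https://github.com/apache/hadoop.git hadoop-common
cd hadoop-common

# Build and install to the local Maven repository;
# drop -DskipTests to also run the (long) test suite
mvn install -DskipTests
```

This needs network access and a long first run while Maven downloads dependencies.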
+ 
+ = Setup Your Development Environment in OSX =
+ 
+ 
+ The Linux instructions apply, with two differences:
+ 
+ Xcode is needed for the command-line compiler and other tools.
+ 
+ 
+ protobuf 2.5.0 needs to be built by hand, as macports and homebrew no longer 
ship that version.
+ 
+ Follow the build-from-source instructions in 
[[http://sleepythread.blogspot.co.uk/2013/11/installing-protoc-25x-compiler-google.html|Installing
 protoc 2.5.x compiler on mac]], ''but change the URL for the protobuf archive 
to 
[[https://github.com/google/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.gz]]''.
+ 
+ To verify that protobuf is correctly installed, the command {{{protoc 
--version}}} must print the string {{{libprotoc 2.5.0}}}.
+ 
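A by-hand build of protobuf 2.5.0 along the lines of the linked article might look like the following; the install prefix and use of {{{sudo}}} are assumptions that depend on your setup.

```shell
# Fetch and unpack the 2.5.0 release tarball named above
curl -L -O https://github.com/google/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.gz
tar xzf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0

# Standard autotools build; installs under /usr/local by default
./configure
make
sudo make install

# Verify the install: this must print "libprotoc 2.5.0"
protoc --version
```

If you install to a non-default prefix, make sure that prefix's {{{bin}}} directory is on your PATH before running the Hadoop build.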
  
  = Run HDFS in pseudo-distributed mode from the dev tree =
  

-
To unsubscribe, e-mail: common-commits-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-commits-h...@hadoop.apache.org



[Hadoop Wiki] Update of HowToSetupYourDevelopmentEnvironment by SteveLoughran

2013-10-25 Thread Apache Wiki
Dear Wiki user,

You have subscribed to a wiki page or wiki category on Hadoop Wiki for change 
notification.

The HowToSetupYourDevelopmentEnvironment page has been changed by 
SteveLoughran:
https://wiki.apache.org/hadoop/HowToSetupYourDevelopmentEnvironment?action=diff&rev1=31&rev2=32

Comment:
make sure the docs are in sync w/ mvn build and current structure

  
  This page describes how to get your environment setup and is IDE agnostic.
  
- ''this article is out of date -it covers Hadoop 1.x, not the restructured and 
maven-based Hadoop 2.x build''
- 
  = Requirements =
-  * Java 6
+  * Java 6 or 7
-  * Ant
+  * Maven
   * Your favorite IDE
  
  = Setup Your Development Environment in Linux =
  
  The instructions below talk about how to get an environment setup using the 
command line to build, control source, and test.  These instructions are 
therefore IDE independent.  Take a look at EclipseEnvironment for instructions 
on how to configure Eclipse to build, control source, and test.  If you prefer 
ItelliJ IDEA, then take a look [[HadoopUnderIDEA| here]]
  
-  * Choose a good place to put your code.  You will eventually use your source 
code to run Hadoop, so choose wisely.  I chose /code/hadoop.
+  * Choose a good place to put your code.  You will eventually use your source 
code to run Hadoop, so choose wisely. For example ~/code/hadoop.
-  * Get the source.  This is documented here: HowToContribute.  Put the source 
in /code/hadoop (or whatever you chose) so that you have 
/code/hadoop/hadoop-core-trunk
+  * Get the source.  This is documented in HowToContribute.  Put the source in 
~/code/hadoop (or whatever you chose) so that you have 
~/code/hadoop/hadoop-common
-  * cd into ''hadoop-core-trunk'', or whatever you named the directory
+  * cd into ''hadoop-common'', or whatever you named the directory
-  * attempt to run ''ant test''
+  * attempt to run ''mvn install''
   *  If you get any strange errors (other than JUnit test failures and 
errors), then consult the ''Build Errors'' section below.
+  * follow GettingStartedWithHadoop to learn how to run Hadoop.
-  * run ''ant'' to compile (this may not be necessary if you've already run 
''ant test'')
-  * follow GettingStartedWithHadoop or the instructions below to learn how to 
run Hadoop (use this guide if you use Ubuntu: 
http://wiki.apache.org/hadoop/Running_Hadoop_On_Ubuntu_Linux_%28Single-Node_Cluster%29)
-   *  Use the hadoop-core-trunk folder just as you would a downloaded version 
of Hadoop (symlink hadoop-core-trunk to hadoop)
   *  If you run into any problems, refer to the ''Runtime Errors'' section 
below, along with the troubleshooting document here: TroubleShooting
  
  = Run HDFS in pseudo-distributed mode from the dev tree =
@@ -123, +119 @@

  
  {{{Exception in thread main java.lang.AssertionError: Missing tools.jar at: 
/Library/Java/JavaVirtualMachines/jdk1.7.0_45.jdk/Contents/Home/Classes/classes.jar.
 Expression: file.exists()}}}
  
- This happens because one of the modules used in the Hadoop build expects 
{{classes.jar}} to be in a location it no longer is on Oracle Java 7+ on OS/X. 
See [https://issues.apache.org/jira/browse/HADOOP-9350|HADOOP-9350]
+ This happens because one of the modules used in the Hadoop build expects 
{{classes.jar}} to be in a location it no longer is on Oracle Java 7+ on OS/X. 
See [[https://issues.apache.org/jira/browse/HADOOP-9350|HADOOP-9350]]
  
  = Runtime Errors =
  


[Hadoop Wiki] Update of HowToSetupYourDevelopmentEnvironment by SteveLoughran

2013-10-25 Thread Apache Wiki
Dear Wiki user,

You have subscribed to a wiki page or wiki category on Hadoop Wiki for change 
notification.

The HowToSetupYourDevelopmentEnvironment page has been changed by 
SteveLoughran:
https://wiki.apache.org/hadoop/HowToSetupYourDevelopmentEnvironment?action=diff&rev1=32&rev2=33

Comment:
fix some formatting problems

  
  {{{Exception in thread main java.lang.AssertionError: Missing tools.jar at: 
/Library/Java/JavaVirtualMachines/jdk1.7.0_45.jdk/Contents/Home/Classes/classes.jar.
 Expression: file.exists()}}}
  
- This happens because one of the modules used in the Hadoop build expects 
{{classes.jar}} to be in a location it no longer is on Oracle Java 7+ on OS/X. 
See [[https://issues.apache.org/jira/browse/HADOOP-9350|HADOOP-9350]]
+ This happens because one of the modules used in the Hadoop build expects 
{{{classes.jar}}} to be in a location it no longer is on Oracle Java 7+ on 
OS/X. See [[https://issues.apache.org/jira/browse/HADOOP-9350|HADOOP-9350]]
  
  = Runtime Errors =