http://git-wip-us.apache.org/repos/asf/hbase/blob/48d9d27d/src/main/docbkx/developer.xml ---------------------------------------------------------------------- diff --git a/src/main/docbkx/developer.xml b/src/main/docbkx/developer.xml index 89ed56f..a6b5dc2 100644 --- a/src/main/docbkx/developer.xml +++ b/src/main/docbkx/developer.xml @@ -29,197 +29,467 @@ */ --> <title>Building and Developing Apache HBase</title> - <para>This chapter will be of interest only to those building and developing Apache HBase (i.e., as opposed to - just downloading the latest distribution). - </para> + <para>This chapter contains information and guidelines for building and releasing HBase code and + documentation. Being familiar with these guidelines will help the HBase committers use + your contributions more easily.</para> + <section xml:id="getting.involved"> + <title>Getting Involved</title> + <para>Apache HBase gets better only when people contribute! If you are looking to contribute + to Apache HBase, look for <link + xlink:href="https://issues.apache.org/jira/issues/?jql=project%20%3D%20HBASE%20AND%20labels%20in%20(beginner)" + >issues in JIRA tagged with the label 'beginner'</link>. These are issues HBase + contributors have deemed worthy but not of immediate priority, and a good way to ramp up on + HBase internals. See <link xlink:href="http://search-hadoop.com/m/DHED43re96">What label + is used for issues that are good on ramps for new contributors?</link> from the dev + mailing list for background.</para> + <para>Before you get started submitting code to HBase, please refer to <xref + linkend="developing"/>.</para> + <para>As Apache HBase is an Apache Software Foundation project, see <xref linkend="asf"/> + for more information about how the ASF functions. </para> + <section xml:id="mailing.list"> + <title>Mailing Lists</title> + <para>Sign up for the dev-list and the user-list. See the + <link xlink:href="http://hbase.apache.org/mail-lists.html">mailing lists</link> page. 
+ Posting questions - and helping to answer other people's questions - is encouraged! + There are varying levels of experience on both lists, so patience and politeness are encouraged (and please + stay on topic). + </para> + </section> + <section xml:id="irc"> + <title>Internet Relay Chat (IRC)</title> + <para>For real-time questions and discussions, use the <literal>#hbase</literal> IRC + channel on the <link xlink:href="https://freenode.net/">FreeNode</link> IRC network. + FreeNode offers a web-based client, but most people prefer a native client, and + several clients are available for each operating system.</para> + </section> + <section xml:id="jira"> + <title>Jira</title> + <para>Check for existing issues in <link + xlink:href="https://issues.apache.org/jira/browse/HBASE">Jira</link>. Whether it is + a new feature request, an enhancement, or a bug, file a ticket. </para> + <para>To check for existing issues which you can tackle as a beginner, search for <link + xlink:href="https://issues.apache.org/jira/issues/?jql=project%20%3D%20HBASE%20AND%20labels%20in%20(beginner)" + >issues in JIRA tagged with the label 'beginner'</link>.</para> + <itemizedlist xml:id="jira.priorities"> + <title>JIRA Priorities</title> + <listitem> + <para>Blocker: Should only be used if the issue WILL cause data loss or cluster + instability reliably.</para> + </listitem> + <listitem> + <para>Critical: The issue described can cause data loss or cluster instability + in some cases.</para> + </listitem> + <listitem> + <para>Major: Important but not tragic issues, like updates to the client API + that will add a lot of much-needed functionality or significant bugs that + need to be fixed but that don't cause data loss.</para> + </listitem> + <listitem> + <para>Minor: Useful enhancements and annoying but not damaging bugs.</para> + </listitem> + <listitem> + <para>Trivial: Useful enhancements but generally cosmetic.</para> + </listitem> + </itemizedlist> + <example 
xml:id="submitting.patches.jira.code"> + <title>Code Blocks in Jira Comments</title> + <para>A commonly used macro in Jira is {code}. Everything inside the tags is + preformatted, as in this example.</para> + <programlisting> +{code} +code snippet +{code} + </programlisting> + </example> + </section> <!-- jira --> + + </section> <!-- getting involved --> + <section xml:id="repos"> - <title>Apache HBase Repositories</title> - <para>There are two different repositories for Apache HBase: Subversion (SVN) and Git. - GIT is our repository of record for all but the Apache HBase website. - We used to be on SVN. We migrated. See <link xlink:href="https://issues.apache.org/jira/browse/INFRA-7768">Migrade Apache HBase SVN Repos to Git</link>. - Updating hbase.apache.org still requires use of SVN (See <xref linkend="hbase.org" />). - See <link xlink:href="http://hbase.apache.org/source-repository.html">Source Code Management</link> - page for contributor and committer links or - seach for HBase on the <link xlink:href="http://git.apache.org/">Apache Git</link> page. - </para> + <title>Apache HBase Repositories</title> + <para>There are two different repositories for Apache HBase: Subversion (SVN) and Git. Git + is our repository of record for all but the Apache HBase website. We used to be on SVN. + We migrated. See <link xlink:href="https://issues.apache.org/jira/browse/INFRA-7768" + >Migrate Apache HBase SVN Repos to Git</link>. Updating hbase.apache.org still + requires use of SVN (See <xref linkend="hbase.org"/>). 
See the <link + xlink:href="http://hbase.apache.org/source-repository.html">Source Code + Management</link> page for contributor and committer links, or search for HBase on the + <link xlink:href="http://git.apache.org/">Apache Git</link> page.</para> </section> <section xml:id="ides"> <title>IDEs</title> <section xml:id="eclipse"> - <title>Eclipse</title> + <title>Eclipse</title> <section xml:id="eclipse.code.formatting"> - <title>Code Formatting</title> - <para>Under the <filename>dev-support</filename> folder, you will find <filename>hbase_eclipse_formatter.xml</filename>. - We encourage you to have this formatter in place in eclipse when editing HBase code. To load it into eclipse: -<orderedlist> -<listitem><para>Go to Eclipse->Preferences...</para></listitem> -<listitem><para>In Preferences, Go to Java->Code Style->Formatter</para></listitem> -<listitem><para>Import... <filename>hbase_eclipse_formatter.xml</filename></para></listitem> -<listitem><para>Click Apply</para></listitem> -<listitem><para>Still in Preferences, Go to Java->Editor->Save Actions</para></listitem> -<listitem><para>Check the following: -<orderedlist> -<listitem><para>Perform the selected actions on save</para></listitem> -<listitem><para>Format source code</para></listitem> -<listitem><para>Format edited lines</para></listitem> -</orderedlist> -</para></listitem> -<listitem><para>Click Apply</para></listitem> -</orderedlist> -</para> - <para>In addition to the automatic formatting, make sure you follow the style guidelines explained in <xref linkend="common.patch.feedback"/></para> - <para>Also, no @author tags - that's a rule. Quality Javadoc comments are appreciated. And include the Apache license.</para> + <title>Code Formatting</title> + <para>Under the <filename>dev-support/</filename> folder, you will find + <filename>hbase_eclipse_formatter.xml</filename>. 
We encourage you to have + this formatter in place in eclipse when editing HBase code.</para> + <procedure> + <title>Load the HBase Formatter Into Eclipse</title> + <step> + <para>Open the <menuchoice> + <guimenu>Eclipse</guimenu> + <guimenuitem>Preferences</guimenuitem> + </menuchoice> menu item.</para> + </step> + <step> + <para>In Preferences, click the <menuchoice> + <guimenu>Java</guimenu> + <guisubmenu>Code Style</guisubmenu> + <guimenuitem>Formatter</guimenuitem> + </menuchoice> menu item.</para> + </step> + <step> + <para>Click <guibutton>Import</guibutton> and browse to the location of the + <filename>hbase_eclipse_formatter.xml</filename> file, which is in + the <filename>dev-support/</filename> directory. Click + <guibutton>Apply</guibutton>.</para> + </step> + <step> + <para>Still in Preferences, click <menuchoice> + <guimenu>Java</guimenu> + <guisubmenu>Editor</guisubmenu> + <guimenuitem>Save Actions</guimenuitem> + </menuchoice>. Be sure the following options are selected:</para> + <itemizedlist> + <listitem><para>Perform the selected actions on save</para></listitem> + <listitem><para>Format source code</para></listitem> + <listitem><para>Format edited lines</para></listitem> + </itemizedlist> + <para>Click <guibutton>Apply</guibutton>. Close all dialog boxes and return + to the main window.</para> + </step> + </procedure> + + <para>In addition to the automatic formatting, make sure you follow the style + guidelines explained in <xref linkend="common.patch.feedback"/>.</para> + <para>Also, no <code>@author</code> tags - that's a rule. Quality Javadoc comments + are appreciated. And include the Apache license.</para> </section> <section xml:id="eclipse.git.plugin"> - <title>Git Plugin</title> - <para>If you cloned the project via git, download and install the Git plugin (EGit). 
Attach to your local git repo (via the Git Repositories window) and you'll be able to see file revision history, generate patches, etc.</para> + <title>Eclipse Git Plugin</title> + <para>If you cloned the project via git, download and install the Git plugin (EGit). + Attach to your local git repo (via the <guilabel>Git Repositories</guilabel> + window) and you'll be able to see file revision history, generate patches, + etc.</para> </section> <section xml:id="eclipse.maven.setup"> - <title>HBase Project Setup in Eclipse</title> - <para>The easiest way is to use the m2eclipse plugin for Eclipse. Eclipse Indigo or newer has m2eclipse built-in, or it can be found here:http://www.eclipse.org/m2e/. M2Eclipse provides Maven integration for Eclipse - it even lets you use the direct Maven commands from within Eclipse to compile and test your project.</para> - <para>To import the project, you merely need to go to File->Import...Maven->Existing Maven Projects and then point Eclipse at the HBase root directory; m2eclipse will automatically find all the hbase modules for you.</para> - <para>If you install m2eclipse and import HBase in your workspace, you will have to fix your eclipse Build Path. - Remove <filename>target</filename> folder, add <filename>target/generated-jamon</filename> - and <filename>target/generated-sources/java</filename> folders. You may also remove from your Build Path - the exclusions on the <filename>src/main/resources</filename> and <filename>src/test/resources</filename> - to avoid error message in the console 'Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (default) on project hbase: - 'An Ant BuildException has occured: Replace: source file .../target/classes/hbase-default.xml doesn't exist'. 
This will also - reduce the eclipse build cycles and make your life easier when developing.</para> + <title>HBase Project Setup in Eclipse using <code>m2eclipse</code></title> + <para>The easiest way is to use the <command>m2eclipse</command> plugin for Eclipse. + Eclipse Indigo or newer includes <command>m2eclipse</command>, or you can + download it from <link xlink:href="http://www.eclipse.org/m2e/" + >http://www.eclipse.org/m2e/</link>. It provides Maven integration for + Eclipse, and even lets you use the direct Maven commands from within Eclipse to + compile and test your project.</para> + <para>To import the project, click <menuchoice> + <guimenu>File</guimenu> + <guisubmenu>Import</guisubmenu> + <guisubmenu>Maven</guisubmenu> + <guimenuitem>Existing Maven Projects</guimenuitem> + </menuchoice> and select the HBase root directory. <code>m2eclipse</code> + locates all the hbase modules for you.</para> + <para>If you install <command>m2eclipse</command> and import HBase in your + workspace, do the following to fix your eclipse Build Path. </para> + <orderedlist> + <listitem> + <para>Remove the <filename>target</filename> folder.</para> + </listitem> + <listitem> + <para>Add the <filename>target/generated-jamon</filename> and + <filename>target/generated-sources/java</filename> folders.</para> + </listitem> + <listitem> + <para>Remove from your Build Path the exclusions on the + <filename>src/main/resources</filename> and + <filename>src/test/resources</filename> directories, to avoid error messages in + the console, such as the following:</para> + <screen>Failed to execute goal +org.apache.maven.plugins:maven-antrun-plugin:1.6:run (default) on project hbase: +'An Ant BuildException has occured: Replace: source file .../target/classes/hbase-default.xml +doesn't exist</screen> + <para>This will also reduce the eclipse build cycles and make your life + easier when developing. 
</para> + </listitem> + </orderedlist> </section> <section xml:id="eclipse.commandline"> - <title>Import into eclipse with the command line</title> - <para>For those not inclined to use m2eclipse, you can generate the Eclipse files from the command line. First, run (you should only have to do this once): - <programlisting>mvn clean install -DskipTests</programlisting> - and then close Eclipse and execute... - <programlisting>mvn eclipse:eclipse</programlisting> - ... from your local HBase project directory in your workspace to generate some new <filename>.project</filename> - and <filename>.classpath</filename>files. Then reopen Eclipse, or refresh your eclipse project (F5), and import - the .project file in the HBase directory to a workspace. - </para> + <title>HBase Project Setup in Eclipse Using the Command Line</title> + <para>Instead of using <code>m2eclipse</code>, you can generate the Eclipse files + from the command line. </para> + <orderedlist> + <listitem> + <para>First, run the following command, which builds HBase. You only need to + do this once.</para> + <programlisting language="bourne">mvn clean install -DskipTests</programlisting> + </listitem> + <listitem> + <para>Close Eclipse, and execute the following command from the terminal, in + your local HBase project directory, to generate new + <filename>.project</filename> and <filename>.classpath</filename> + files.</para> + <programlisting language="bourne">mvn eclipse:eclipse</programlisting> + </listitem> + <listitem> + <para>Reopen Eclipse and import the <filename>.project</filename> file in + the HBase directory to a workspace.</para> + </listitem> + </orderedlist> </section> <section xml:id="eclipse.maven.class"> - <title>Maven Classpath Variable</title> - <para>The <varname>M2_REPO</varname> classpath variable needs to be set up for the project. 
This needs to be set to - your local Maven repository, which is usually <filename>~/.m2/repository</filename></para> -<para>If this classpath variable is not configured, you will see compile errors in Eclipse like this: -</para> <programlisting> + <title>Maven Classpath Variable</title> + <para>The <varname>$M2_REPO</varname> classpath variable needs to be set up for the + project. This needs to be set to your local Maven repository, which is usually + <filename>~/.m2/repository</filename></para> + <para>If this classpath variable is not configured, you will see compile errors in + Eclipse like this: </para> + <screen> Description Resource Path Location Type The project cannot be built until build path errors are resolved hbase Unknown Java Problem Unbound classpath variable: 'M2_REPO/asm/asm/3.1/asm-3.1.jar' in project 'hbase' hbase Build path Build Path Problem Unbound classpath variable: 'M2_REPO/com/google/guava/guava/r09/guava-r09.jar' in project 'hbase' hbase Build path Build Path Problem Unbound classpath variable: 'M2_REPO/com/google/protobuf/protobuf-java/2.3.0/protobuf-java-2.3.0.jar' in project 'hbase' hbase Build path Build Path Problem Unbound classpath variable: - </programlisting> + </screen> </section> <section xml:id="eclipse.issues"> - <title>Eclipse Known Issues</title> - <para>Eclipse will currently complain about <filename>Bytes.java</filename>. It is not possible to turn these errors off.</para> - <programlisting> + <title>Eclipse Known Issues</title> + <para>Eclipse will currently complain about <filename>Bytes.java</filename>. 
It is + not possible to turn these errors off.</para> + <screen> Description Resource Path Location Type Access restriction: The method arrayBaseOffset(Class) from the type Unsafe is not accessible due to restriction on required library /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Classes/classes.jar Bytes.java /hbase/src/main/java/org/apache/hadoop/hbase/util line 1061 Java Problem Access restriction: The method arrayIndexScale(Class) from the type Unsafe is not accessible due to restriction on required library /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Classes/classes.jar Bytes.java /hbase/src/main/java/org/apache/hadoop/hbase/util line 1064 Java Problem Access restriction: The method getLong(Object, long) from the type Unsafe is not accessible due to restriction on required library /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Classes/classes.jar Bytes.java /hbase/src/main/java/org/apache/hadoop/hbase/util line 1111 Java Problem - </programlisting> - </section> - <section xml:id="eclipse.more"> - <title>Eclipse - More Information</title> - <para>For additional information on setting up Eclipse for HBase development on Windows, see - <link xlink:href="http://michaelmorello.blogspot.com/2011/09/hbase-subversion-eclipse-windows.html">Michael Morello's blog</link> on the topic. - </para> - </section> + </screen> + </section> + <section xml:id="eclipse.more"> + <title>Eclipse - More Information</title> + <para>For additional information on setting up Eclipse for HBase development on + Windows, see <link + xlink:href="http://michaelmorello.blogspot.com/2011/09/hbase-subversion-eclipse-windows.html" + >Michael Morello's blog</link> on the topic. </para> + </section> + </section> + <section> + <title>IntelliJ IDEA</title> + <para>You can set up IntelliJ IDEA for functionality similar to Eclipse. 
Follow these steps.</para> + <orderedlist> + <title>Project Setup in IntelliJ IDEA</title> + <listitem> + <para>Select <menuchoice> + <guimenu>Import Project</guimenu> + <guisubmenu>Import Project From External Model</guisubmenu> + <guimenuitem>Maven</guimenuitem> + </menuchoice></para> + </listitem> + <listitem> + <para>You do not need to select a profile. Be sure <guilabel>Maven project + required</guilabel> is selected, and click + <guibutton>Next</guibutton>.</para> + </listitem> + <listitem> + <para>Select the location for the JDK.</para> + </listitem> + </orderedlist> + <formalpara> + <title>Using the HBase Formatter in IntelliJ IDEA</title> + <para>Using the Eclipse Code Formatter plugin for IntelliJ IDEA, you can import the + HBase code formatter described in <xref linkend="eclipse.code.formatting" />.</para> + </formalpara> + </section> + <section> + <title>Other IDEs</title> + <para>It would be useful to mirror the <xref linkend="eclipse"/> set-up instructions + for other IDEs. If you would like to assist, please have a look at <link + xlink:href="https://issues.apache.org/jira/browse/HBASE-11704" + >HBASE-11704</link>.</para> </section> </section> <section xml:id="build"> - <title>Building Apache HBase</title> - <section xml:id="build.basic"> - <title>Basic Compile</title> - <para>Thanks to maven, building HBase is pretty easy. You can read about the various maven commands in <xref linkend="maven.build.commands"/>, - but the simplest command to compile HBase from its java source code is: - <programlisting> -mvn package -DskipTests - </programlisting> - Or, to clean up before compiling: - <programlisting> -mvn clean package -DskipTests - </programlisting> - With Eclipse set up as explained above in <xref linkend="eclipse"/>, you can also simply use the build command in Eclipse. - To create the full installable HBase package takes a little bit more work, so read on. 
- </para> - <note> - <title>JDK Version Requirements</title> - <para> - Starting with HBase 1.0 you must use Java 7 or later to build from source code. See <xref linkend="java" /> for more complete - information about supported JDK versions. - </para> - </note> - </section> - <section xml:id="build.protobuf"><title>Build Protobuf</title> - <para>You may need to change the protobuf definitions that reside in the hbase-protocol module or other modules.</para> - <para> - The protobuf files are located <filename>hbase-protocol/src/main/protobuf</filename>. - For the change to be effective, you will need to regenerate the classes. You can use maven profile compile-protobuf to do this. - <programlisting> -mvn compile -Dcompile-protobuf -or -mvn compile -Pcompile-protobuf - </programlisting> + <title>Building Apache HBase</title> + <section xml:id="build.basic"> + <title>Basic Compile</title> + <para>HBase is compiled using Maven. You must use Maven 3.x. To check your Maven + version, run the command <command>mvn -version</command>.</para> + <note> + <title>JDK Version Requirements</title> + <para> Starting with HBase 1.0 you must use Java 7 or later to build from source + code. See <xref linkend="java"/> for more complete information about supported + JDK versions. </para> + </note> + <section xml:id="maven.build.commands"> + <title>Maven Build Commands</title> + <para>All commands are executed from the local HBase project directory. 
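The Maven version check mentioned above can be scripted. The following is a minimal sketch, not part of the HBase build: the version line is hard-coded as a stand-in for the output of <command>mvn -version</command>, so the numbers shown are illustrative only.

```shell
# Sketch: check that the local Maven is 3.x before building HBase.
# The version line format is assumed to be "Apache Maven 3.0.5 ...".
MVN_VERSION_LINE="Apache Maven 3.0.5"   # stand-in for: mvn -version | head -n 1

# Extract the major version number from the line.
MAJOR=$(echo "$MVN_VERSION_LINE" | sed 's/Apache Maven \([0-9][0-9]*\)\..*/\1/')

if [ "$MAJOR" -ge 3 ]; then
  echo "Maven ${MAJOR}.x detected: OK to build"
else
  echo "Maven 3.x is required to build HBase" >&2
fi
```

In a real environment you would replace the hard-coded line with the actual `mvn -version | head -n 1` output.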
</para> + <section> + <title>Package</title> + <para>The simplest command to compile HBase from its java source code is to use + the <code>package</code> target, which builds JARs with the compiled + files.</para> + <programlisting language="bourne">mvn package -DskipTests</programlisting> + <para>Or, to clean up before compiling:</para> + <programlisting language="bourne">mvn clean package -DskipTests</programlisting> + <para>With Eclipse set up as explained above in <xref linkend="eclipse"/>, you + can also use the <guimenu>Build</guimenu> command in Eclipse. To create the + full installable HBase package takes a little bit more work, so read on. + </para> + </section> + <section xml:id="maven.build.commands.compile"> + <title>Compile</title> + <para>The <code>compile</code> target does not create the JARs with the compiled + files.</para> + <programlisting language="bourne">mvn compile</programlisting> + <programlisting language="bourne">mvn clean compile</programlisting> + </section> + <section> + <title>Install</title> + <para>To install the JARs in your <filename>~/.m2/</filename> directory, use the + <code>install</code> target.</para> + <programlisting language="bourne">mvn install</programlisting> + <programlisting language="bourne">mvn clean install</programlisting> + <programlisting language="bourne">mvn clean install -DskipTests</programlisting> + </section> + </section> + <section xml:id="maven.build.commands.unitall"> + <title>Running all or individual Unit Tests</title> + <para>See the <xref linkend="hbase.unittests.cmds"/> section in <xref + linkend="hbase.unittests"/></para> + </section> -You may also want to define protoc.path for the protoc binary - <programlisting> -mvn compile -Dcompile-protobuf -Dprotoc.path=/opt/local/bin/protoc - </programlisting> Read the <filename>hbase-protocol/README.txt</filename> for more details. 
- </para> - </section> + <section xml:id="maven.build.hadoop"> + <title>Building against various hadoop versions.</title> + <para>As of 0.96, Apache HBase supports building against Apache Hadoop versions: + 1.0.3, 2.0.0-alpha and 3.0.0-SNAPSHOT. By default, in 0.96 and earlier, we will + build with Hadoop-1.0.x. As of 0.98, Hadoop 1.x is deprecated and Hadoop 2.x is + the default. To change the version to build against, add a hadoop.profile + property when you invoke <command>mvn</command>:</para> + <programlisting language="bourne">mvn -Dhadoop.profile=1.0 ...</programlisting> + <para> The above will build against whatever explicit hadoop 1.x version we have in + our <filename>pom.xml</filename> as our '1.0' version. Tests may not all pass, so + you may need to pass <code>-DskipTests</code> unless you are inclined to fix the + failing tests.</para> + <note xml:id="maven.build.passing.default.profile"> + <title>'dependencyManagement.dependencies.dependency.artifactId' for + org.apache.hbase:${compat.module}:test-jar with value '${compat.module}' + does not match a valid id pattern</title> + <para>You will see ERRORs like the above title if you pass the + <emphasis>default</emphasis> profile; e.g. if you pass + <property>hadoop.profile=1.1</property> when building HBase 0.96 or + <property>hadoop.profile=2.0</property> when building HBase 0.98; just + drop the hadoop.profile stipulation in this case to get your build to run + again. This seems to be a maven peculiarity that is probably fixable, but + we've not spent the time trying to figure it out.</para> + </note> - <section xml:id="build.gotchas"><title>Build Gotchas</title> - <para>If you see <code>Unable to find resource 'VM_global_library.vm'</code>, ignore it. - Its not an error. It is <link xlink:href="http://jira.codehaus.org/browse/MSITE-286">officially ugly</link> though. 
- </para> - </section> - <section xml:id="build.snappy"> - <title>Building in snappy compression support</title> - <para>Pass <code>-Dsnappy</code> to trigger the snappy maven profile for building - snappy native libs into hbase. See also <xref linkend="snappy.compression" /></para> - </section> - </section> + <para> Similarly, for 3.0, you would just replace the profile value. Note that + Hadoop-3.0.0-SNAPSHOT does not currently have a deployed maven artifact - you + will need to build and install your own in your local maven repository if you + want to run against this profile. </para> + <para> In earlier versions of Apache HBase, you can build against older versions of + Apache Hadoop, notably, Hadoop 0.22.x and 0.23.x. If you are running, for + example, HBase 0.94 and want to build against Hadoop 0.23.x, run + with:</para> + <programlisting language="bourne">mvn -Dhadoop.profile=22 ...</programlisting> + </section> + <section xml:id="build.protobuf"> + <title>Build Protobuf</title> + <para>You may need to change the protobuf definitions that reside in the + <filename>hbase-protocol</filename> module or other modules.</para> + <para> The protobuf files are located in + <filename>hbase-protocol/src/main/protobuf</filename>. For the change to be + effective, you will need to regenerate the classes. You can use maven profile + <code>compile-protobuf</code> to do this.</para> + <programlisting language="bourne">mvn compile -Pcompile-protobuf</programlisting> + <para>You may also want to define <varname>protoc.path</varname> for the protoc + binary, using the following command:</para> + <programlisting language="bourne"> +mvn compile -Pcompile-protobuf -Dprotoc.path=/opt/local/bin/protoc + </programlisting> + <para>Read the <filename>hbase-protocol/README.txt</filename> for more details. + </para> + </section> - <section xml:id="releasing"> - <title>Releasing Apache HBase</title> - <para>HBase 0.96.x will run on hadoop 1.x or hadoop 2.x. 
HBase 0.98 will run on - both also (but HBase 0.98 deprecates use of hadoop 1). HBase 1.x will NOT - run on hadoop 1. In what follows, we make a distinction between HBase 1.x - builds and the awkward process involved building HBase 0.96/0.98 for either - hadoop 1 or hadoop 2 targets. - </para> - <section><title>Building against HBase 0.96-0.98</title> - <para>Building 0.98 and 0.96, you must choose which hadoop to build against; - we cannot make a single HBase binary that can run against both hadoop1 and - hadoop2. Since we include the Hadoop we were built - against -- so we can do standalone mode -- the set of modules included - in the tarball changes dependent on whether the hadoop1 or hadoop2 target - is chosen. You can tell which HBase you have -- whether it is for hadoop1 - or hadoop2 by looking at the version; the HBase for hadoop1 bundle will - include 'hadoop1' in its version. Ditto for hadoop2. - </para> - <para>Maven, our build system, natively will not let you have a single product - built against different dependencies. It is understandable. But neither could - we convince maven to change the set of included modules and write out - the correct poms w/ appropriate dependencies even though we have two - build targets; one for hadoop1 and another for hadoop2. So, there is a prestep - required. This prestep takes as input the current pom.xmls and it generates hadoop1 or - hadoop2 versions using a script in <filename>dev-tools</filename> called - <filename>generate-hadoopX-poms.sh</filename>. You then reference these generated - poms when you build. For now, just be aware of the difference between HBase 1.x - builds and those of HBase 0.96-0.98. 
Below we will come back to this difference - when we list out build instructions.</para> + <section xml:id="build.thrift"> + <title>Build Thrift</title> + <para>You may need to change the thrift definitions that reside in the + <filename>hbase-thrift</filename> module or other modules.</para> + <para>The thrift files are located in + <filename>hbase-thrift/src/main/resources</filename>. + For the change to be effective, you will need to regenerate the classes. + You can use maven profile <code>compile-thrift</code> to do this.</para> + <programlisting language="bourne">mvn compile -Pcompile-thrift</programlisting> + <para>You may also want to define <varname>thrift.path</varname> for the thrift + binary, using the following command:</para> + <programlisting language="bourne"> + mvn compile -Pcompile-thrift -Dthrift.path=/opt/local/bin/thrift + </programlisting> + </section> + <section> + <title>Build a Tarball</title> + <para>You can build a tarball without going through the release process described in + <xref linkend="releasing"/>, by running the following command:</para> + <screen>mvn -DskipTests clean install && mvn -DskipTests package assembly:single</screen> + <para>The distribution tarball is built in + <filename>hbase-assembly/target/hbase-<replaceable><version></replaceable>-bin.tar.gz</filename>.</para> + </section> + <section xml:id="build.gotchas"> + <title>Build Gotchas</title> + <para>If you see <code>Unable to find resource 'VM_global_library.vm'</code>, ignore + it. It's not an error. 
It is <link + xlink:href="http://jira.codehaus.org/browse/MSITE-286">officially + ugly</link> though. </para> + </section> + <section xml:id="build.snappy"> + <title>Building in snappy compression support</title> + <para>Pass <code>-Psnappy</code> to trigger the <code>hadoop-snappy</code> maven profile + for building Google Snappy native libraries into HBase. See also <xref + linkend="snappy.compression.installation"/></para> + </section> + </section> + </section> + <section xml:id="releasing"> + <title>Releasing Apache HBase</title> + <note> + <title>Building against HBase 1.x</title> + <para>HBase 1.x requires Java 7 to build. See <xref linkend="java"/> for Java + requirements per HBase release.</para> + </note> + <section> + <title>Building against HBase 0.96-0.98</title> + <para>HBase 0.96.x will run on Hadoop 1.x or Hadoop 2.x. HBase 0.98 still runs on both, + but HBase 0.98 deprecates use of Hadoop 1. HBase 1.x will <emphasis>not</emphasis> + run on Hadoop 1. In the following procedures, we make a distinction between HBase + 1.x builds and the awkward process involved building HBase 0.96/0.98 for either + Hadoop 1 or Hadoop 2 targets. </para> + <para>You must choose which Hadoop to build against. It is not possible to build a + single HBase binary that runs against both Hadoop 1 and Hadoop 2. Hadoop is included + in the build, because it is needed to run HBase in standalone mode. Therefore, the + set of modules included in the tarball changes, depending on the build target. To + determine which HBase you have, look at the HBase version. The Hadoop version is + embedded within it.</para> + <para>Maven, our build system, natively does not allow a single product to be built + against different dependencies. Also, Maven cannot change the set of included + modules and write out the correct <filename>pom.xml</filename> files with + appropriate dependencies, even using two build targets, one for Hadoop 1 and another + for Hadoop 2. 
A prerequisite step is required, which takes as input the current + <filename>pom.xml</filename>s and generates Hadoop 1 or Hadoop 2 versions using + a script in the <filename>dev-tools/</filename> directory, called + <filename>generate-hadoop<replaceable>X</replaceable>-poms.sh</filename> + where <replaceable>X</replaceable> is either <literal>1</literal> or + <literal>2</literal>. You then reference these generated poms when you build. + For now, just be aware of the difference between HBase 1.x builds and those of HBase + 0.96-0.98. This difference is important to the build instructions.</para> + + + <example xml:id="mvn.settings.file"> + <title>Example <filename>~/.m2/settings.xml</filename> File</title> + <para>Publishing to maven requires you sign the artifacts you want to upload. For + the build to sign them for you, you need a properly configured + <filename>settings.xml</filename> in your local repository under + <filename>.m2</filename>, such as the following.</para> + <programlisting language="xml"><![CDATA[<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0" + xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" + xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 + http://maven.apache.org/xsd/settings-1.0.0.xsd"> @@ -254,179 +524,293 @@ mvn compile -Dcompile-protobuf -Dprotoc.path=/opt/local/bin/protoc </profile> </profiles> </settings>]]> - </programlisting> - </para> - <para>You must use maven 3.0.x (Check by running <command>mvn -version</command>). - </para> - </section> - <section xml:id="maven.release"> - <title>Making a Release Candidate</title> - <para>I'll explain by running through the process. See later in this section for more detail on particular steps. -These instructions are for building HBase 1.0.x. For building earlier versions, the process is different. See this section -under the respective release documentation folders.
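As a small aside on the point above that the Hadoop target is embedded in the HBase version string, a shell one-liner can pull the flavor back out. The version value below is illustrative only, not taken from a real build:

```shell
# Illustrative only: 0.96/0.98 HBase version strings embed the Hadoop
# target, e.g. "0.98.0-hadoop2" (a made-up value for demonstration).
version="0.98.0-hadoop2"
flavor="${version##*-}"   # strip everything through the last '-'
echo "$flavor"
```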
- </para> - <para>If you are making a point release (for example to quickly address a critical incompatability or security - problem) off of a release branch instead of a development branch the tagging instructions are slightly different. - I'll prefix those special steps with <emphasis>Point Release Only</emphasis>. - </para> - - <para>I would advise before you go about making a release candidate, do a practise run by deploying a SNAPSHOT. - Also, make sure builds have been passing recently for the branch from where you are going to take your - release. You should also have tried recent branch tips out on a cluster under load running for instance - our hbase-it integration test suite for a few hours to 'burn in' the near-candidate bits. - </para> - <note> - <title>Point Release Only</title> - <para>At this point you should tag the previous release branch (ex: 0.96.1) with - the new point release tag (e.g. 0.96.1.1 tag). Any commits with changes or mentioned below for the point release - should be appled to the new tag. - </para> - </note> - - - <para>The <link xlink:href="http://wiki.apache.org/hadoop/HowToRelease">Hadoop How To Release</link> wiki - page informs much of the below and may have more detail on particular sections so it is worth review.</para> - - <para>Update CHANGES.txt with the changes since the last release. - Make sure the URL to the JIRA points to the properly location listing fixes for this release. - Adjust the version in all the poms appropriately. If you are making a release candidate, you must - remove the <emphasis>-SNAPSHOT</emphasis> from all versions. If you are running this receipe to - publish a SNAPSHOT, you must keep the <emphasis>-SNAPSHOT</emphasis> suffix on the hbase version. - The <link xlink:href="http://mojo.codehaus.org/versions-maven-plugin/">Versions Maven Plugin</link> can be of use here. 
To - set a version in all the many poms of the hbase multi-module project, do something like this: - <programlisting>$ mvn clean org.codehaus.mojo:versions-maven-plugin:1.3.1:set -DnewVersion=0.96.0</programlisting> - Checkin the <filename>CHANGES.txt</filename> and any version changes. - </para> - <para> - Update the documentation under <filename>src/main/docbkx</filename>. This usually involves copying the - latest from trunk making version-particular adjustments to suit this release candidate version. - </para> - <para>Now, build the src tarball. This tarball is hadoop version independent. It is just the pure src code and documentation without a particular hadoop taint, etc. - Add the <varname>-Prelease</varname> profile when building; it checks files for licenses and will fail the build if unlicensed files present. - <programlisting>$ MAVEN_OPTS="-Xmx2g" mvn clean install -DskipTests assembly:single -Dassembly.file=hbase-assembly/src/main/assembly/src.xml -Prelease</programlisting> - Undo the tarball and make sure it looks good. A good test for the src tarball being 'complete' is to see if - you can build new tarballs from this source bundle. - If the source tarball is good, save it off to a <emphasis>version directory</emphasis>, i.e a directory somewhere where you are collecting - all of the tarballs you will publish as part of the release candidate. For example if we were building a - hbase-0.96.0 release candidate, we might call the directory <filename>hbase-0.96.0RC0</filename>. Later - we will publish this directory as our release candidate up on people.apache.org/~YOU. - </para> - <para>Now lets build the binary tarball. - Add the <varname>-Prelease</varname> profile when building; it checks files for licenses and will fail the build if unlicensed files present. - Do it in two steps. 
First install into the local repository and then generate documentation and assemble the tarball - (Otherwise build complains that hbase modules are not in maven repo when we try to do it all in the one go especially on fresh repo). - It seems that you need the install goal in both steps. - <programlisting>$ MAVEN_OPTS="-Xmx3g" mvn clean install -DskipTests -Prelease -$ MAVEN_OPTS="-Xmx3g" mvn install -DskipTests site assembly:single -Prelease</programlisting> -Undo the generated tarball and check it out. Look at doc. and see if it runs, etc. -If good, copy the tarball to the above mentioned <emphasis>version directory</emphasis>. -</para> -<note><title>Point Release Only</title><para>The following step that creates a new tag can be skipped since you've already created the point release tag</para></note> -<para>I'll tag the release at this point since its looking good. If we find an issue later, we can delete the tag and start over. Release needs to be tagged when we do next step.</para> -<para>Now deploy hbase to the apache maven repository. -This time we use the <varname>apache-release</varname> profile instead of just <varname>release</varname> profile when doing mvn deploy; -it will invoke the apache pom referenced by our poms. It will also sign your artifacts published to mvn as long as your settings.xml in your local <filename>.m2</filename> -repository is configured correctly (your <filename>settings.xml</filename> adds your gpg password property to the apache profile). -<programlisting>$ MAVEN_OPTS="-Xmx3g" mvn deploy -DskipTests -Papache-release</programlisting> -The last command above copies all artifacts up to a temporary staging apache mvn repo in an 'open' state. -We'll need to do more work on these maven artifacts to make them generally available. -</para> - - <para>The script <filename>dev-support/make_rc.sh</filename> automates alot of the above listed release steps. 
- It does not do the modification of the CHANGES.txt for the release, the close of the - staging repository up in apache maven (human intervention is needed here), the checking of - the produced artifacts to ensure they are 'good' -- e.g. undoing the produced tarballs, eyeballing them to make - sure they look right then starting and checking all is running properly -- and then the signing and pushing of - the tarballs to people.apache.org but it does the other stuff; it can come in handy. - </para> + </programlisting> + </example> - <para>Now lets get back to what is up in maven. Our artifacts should be up in maven repository in the staging area -in the 'open' state. While in this 'open' state you can check out what you've published to make sure all is good. -To do this, login at repository.apache.org -using your apache id. Find your artifacts in the staging repository. Browse the content. Make sure all artifacts made it up -and that the poms look generally good. If it checks out, 'close' the repo. This will make the artifacts publically available. -You will receive an email with the URL to give out for the temporary staging repository for others to use trying out this new -release candidate. Include it in the email that announces the release candidate. Folks will need to add this repo URL to their -local poms or to their local settings.xml file to pull the published release candidate artifacts. If the published artifacts are incomplete -or borked, just delete the 'open' staged artifacts. -<note> - <title>hbase-downstreamer</title> - <para> - See the <link xlink:href="https://github.com/saintstack/hbase-downstreamer">hbase-downstreamer</link> test for a simple - example of a project that is downstream of hbase an depends on it. - Check it out and run its simple test to make sure maven artifacts are properly deployed to the maven repository. - Be sure to edit the pom to point at the proper staging repo. 
Make sure you are pulling from the repo when tests run and that you are not - getting from your local repo (pass -U or delete your local repo content and check maven is pulling from remote out of the staging repo). - </para> -</note> - See <link xlink:href="http://www.apache.org/dev/publishing-maven-artifacts.html">Publishing Maven Artifacts</link> for - some pointers on this maven staging process. + </section> + <section xml:id="maven.release"> + <title>Making a Release Candidate</title> <note> - <para>We no longer publish using the maven release plugin. Instead we do mvn deploy. It seems to give - us a backdoor to maven release publishing. If no <emphasis>-SNAPSHOT</emphasis> on the version - string, then we are 'deployed' to the apache maven repository staging directory from which we - can publish URLs for candidates and later, if they pass, publish as release (if a - <emphasis>-SNAPSHOT</emphasis> on the version string, deploy will put the artifacts up into - apache snapshot repos). - </para> + <para>These instructions are for building HBase 1.0.x. For building earlier + versions, the process is different. See this section under the respective + release documentation folders. </para></note> + <formalpara> + <title>Point Releases</title> + <para>If you are making a point release (for example to quickly address a critical + incompatibility or security problem) off of a release branch instead of a + development branch, the tagging instructions are slightly different. I'll prefix + those special steps with <emphasis>Point Release Only</emphasis>. </para> + </formalpara> + + <formalpara> + <title>Before You Begin</title> + <para>Before you make a release candidate, do a practice run by deploying a + snapshot. Before you start, check to be sure recent builds have been passing for + the branch from where you are going to take your release.
You should also have + tried recent branch tips out on a cluster under load, perhaps by running the + <code>hbase-it</code> integration test suite for a few hours to 'burn in' + the near-candidate bits. </para> + </formalpara> + <note> + <title>Point Release Only</title> + <para>At this point you should tag the previous release branch (ex: 0.96.1) with the + new point release tag (e.g. 0.96.1.1 tag). Any commits with changes for the + point release should be applied to the new tag. </para>
In the <emphasis>version directory</emphasis> do this: - <programlisting>$ for i in *.tar.gz; do echo $i; gpg --print-mds $i > $i.mds ; done + + + <para>The Hadoop <link xlink:href="http://wiki.apache.org/hadoop/HowToRelease">How To + Release</link> wiki page is used as a model for most of the instructions below, + and may have more detail on particular sections, so it is worth review.</para> + + <note> + <title>Specifying the Heap Space for Maven on OSX</title> + <para>On OSX, you may need to specify the heap space for Maven commands, by setting + the <varname>MAVEN_OPTS</varname> variable to <literal>-Xmx3g</literal>. You can + prefix the variable to the Maven command, as in the following example:</para> + <screen>MAVEN_OPTS="-Xmx2g" mvn package</screen> + <para>You could also set this in an environment variable or alias in your + shell.</para> + </note> + <procedure> + <title>Release Procedure</title> + <para>The script <filename>dev-support/make_rc.sh</filename> automates many of these + steps. It does not do the modification of the <filename>CHANGES.txt</filename> + for the release, the close of the staging repository in Apache Maven (human + intervention is needed here), the checking of the produced artifacts to ensure + they are 'good' -- e.g. extracting the produced tarballs, verifying that they + look right, then starting HBase and checking that everything is running + correctly, then the signing and pushing of the tarballs to <link + xlink:href="http://people.apache.org">people.apache.org</link>. The script + handles everything else, and comes in handy.</para> + <step> + <title>Update the <filename>CHANGES.txt</filename> file and the POM files.</title> + <para>Update <filename>CHANGES.txt</filename> with the changes since the last + release. Make sure the URL to the JIRA points to the proper location which + lists fixes for this release. Adjust the version in all the POM files + appropriately. 
If you are making a release candidate, you must remove the + <literal>-SNAPSHOT</literal> label from all versions. If you are running + this recipe to publish a snapshot, you must keep the + <literal>-SNAPSHOT</literal> suffix on the hbase version. The <link + xlink:href="http://mojo.codehaus.org/versions-maven-plugin/">Versions + Maven Plugin</link> can be of use here. To set a version in all the many + poms of the hbase multi-module project, use a command like the + following:</para> + <programlisting language="bourne"> +$ mvn clean org.codehaus.mojo:versions-maven-plugin:1.3.1:set -DnewVersion=0.96.0 + </programlisting> + <para>Check in the <filename>CHANGES.txt</filename> and any version + changes.</para> + </step> + <step> + <title>Update the documentation.</title> + <para> Update the documentation under <filename>src/main/docbkx</filename>. This + usually involves copying the latest from trunk and making version-particular + adjustments to suit this release candidate version. </para> + </step> + <step> + <title>Build the source tarball.</title> + <para>Now, build the source tarball. This tarball is Hadoop-version-independent. + It is just the pure source code and documentation without a particular + hadoop taint, etc. Add the <varname>-Prelease</varname> profile when + building. It checks files for licenses and will fail the build if unlicensed + files are present.</para> + <programlisting language="bourne"> +$ mvn clean install -DskipTests assembly:single -Dassembly.file=hbase-assembly/src/main/assembly/src.xml -Prelease + </programlisting> + <para>Extract the tarball and make sure it looks good. A good test for the src + tarball being 'complete' is to see if you can build new tarballs from this + source bundle. If the source tarball is good, save it off to a + <emphasis>version directory</emphasis>, a directory somewhere where you + are collecting all of the tarballs you will publish as part of the release + candidate.
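One way to sanity-check the version bump described above is to scan the tree for any pom that still carries the -SNAPSHOT suffix. This is a hedged sketch, not an official release step; it is demonstrated here against a scratch directory containing one already-bumped pom:

```shell
# Sketch: after versions:set for a release candidate, no pom.xml should
# still mention -SNAPSHOT. Demo uses a scratch directory, not a real tree.
dir=$(mktemp -d)
printf '<project><version>0.96.0</version></project>\n' > "$dir/pom.xml"
leftovers=$(grep -rl -- "-SNAPSHOT" --include=pom.xml "$dir" || true)
if [ -z "$leftovers" ]; then
  echo "no -SNAPSHOT versions remain"
else
  echo "still SNAPSHOT: $leftovers"
fi
```

Run from the project root (substituting `.` for the scratch directory), any file it prints still needs its version adjusted.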
For example if you were building a hbase-0.96.0 release + candidate, you might call the directory + <filename>hbase-0.96.0RC0</filename>. Later you will publish this directory + as our release candidate up on <link xlink:href="people.apache.org/~YOU" + >people.apache.org/<replaceable>~YOU</replaceable>/</link>. </para> + </step> + <step> + <title>Build the binary tarball.</title> + <para>Next, build the binary tarball. Add the <varname>-Prelease</varname> + profile when building. It checks files for licenses and will fail the build + if unlicensed files are present. Do it in two steps.</para> + <substeps> + <step> + <para>First install into the local repository</para> + <programlisting language="bourne"> +$ mvn clean install -DskipTests -Prelease</programlisting> + </step> + <step> + <para>Next, generate documentation and assemble the tarball.</para> + <programlisting language="bourne"> +$ mvn install -DskipTests site assembly:single -Prelease</programlisting> + </step> + </substeps> + <para> Otherwise, the build complains that hbase modules are not in the maven + repository when you try to do it at once, especially on fresh repository. It + seems that you need the install goal in both steps.</para> + <para>Extract the generated tarball and check it out. Look at the documentation, + see if it runs, etc. If good, copy the tarball to the above mentioned + <emphasis>version directory</emphasis>. </para> + </step> + <step> + <title>Create a new tag.</title> + <note> + <title>Point Release Only</title> + <para>The following step that creates a new tag can be skipped since you've + already created the point release tag</para> + </note> + <para>Tag the release at this point since it looks good. If you find an issue + later, you can delete the tag and start over. 
Release needs to be tagged for + the next step.</para> + </step> + <step> + <title>Deploy to the Maven Repository.</title> + <para>Next, deploy HBase to the Apache Maven repository, using the + <varname>apache-release</varname> profile instead of the + <varname>release</varname> profile when running the <command>mvn + deploy</command> command. This profile invokes the Apache pom referenced + by our pom files, and also signs your artifacts published to Maven, as long + as the <filename>settings.xml</filename> is configured correctly, as + described in <xref linkend="mvn.settings.file"/>.</para> + <programlisting language="bourne"> +$ mvn deploy -DskipTests -Papache-release</programlisting> + <para>This command copies all artifacts up to a temporary staging Apache mvn + repository in an 'open' state. More work needs to be done on these maven + artifacts to make them generally available. </para> + </step> + <step> + <title>Make the Release Candidate available.</title> + <para>The artifacts are in the maven repository in the staging area in the + 'open' state. While in this 'open' state you can check out what you've + published to make sure all is good. To do this, login at <link + xlink:href="http://repository.apache.org">repository.apache.org</link> + using your Apache ID. Find your artifacts in the staging repository. Browse + the content. Make sure all artifacts made it up and that the poms look + generally good. If it checks out, 'close' the repo. This will make the + artifacts publically available. You will receive an email with the URL to + give out for the temporary staging repository for others to use trying out + this new release candidate. Include it in the email that announces the + release candidate. Folks will need to add this repo URL to their local poms + or to their local <filename>settings.xml</filename> file to pull the + published release candidate artifacts. 
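As a hedged illustration of how folks pull release candidate artifacts, a consumer's pom.xml could declare the temporary staging repository like the fragment below. The `id` and URL are made-up placeholders; use the actual staging URL from the notification email you receive when the repository is closed:

```xml
<!-- Hypothetical staging repository entry; the id and url are placeholders. -->
<repositories>
  <repository>
    <id>hbase-rc-staging</id>
    <url>https://repository.apache.org/content/repositories/STAGING-REPO-FROM-EMAIL/</url>
  </repository>
</repositories>
```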
If the published artifacts are + incomplete or have problems, just delete the 'open' staged artifacts.</para> + <note> + <title>hbase-downstreamer</title> + <para> See the <link + xlink:href="https://github.com/saintstack/hbase-downstreamer" + >hbase-downstreamer</link> test for a simple example of a project + that is downstream of HBase and depends on it. Check it out and run its + simple test to make sure maven artifacts are properly deployed to the + maven repository. Be sure to edit the pom to point to the proper staging + repository. Make sure you are pulling from the repository when tests run + and that you are not getting from your local repository, by either + passing the <code>-U</code> flag or deleting your local repo content and + checking that maven pulls from the remote staging repository. + </para> + </note> + <para>See <link + xlink:href="http://www.apache.org/dev/publishing-maven-artifacts.html" + >Publishing Maven Artifacts</link> for some pointers on this maven + staging process.</para> + <note> + <para>We no longer publish using the maven release plugin. Instead we do + <command>mvn deploy</command>. It seems to give us a backdoor to + maven release publishing. If there is no <emphasis>-SNAPSHOT</emphasis> + on the version string, then we are 'deployed' to the apache maven + repository staging directory from which we can publish URLs for + candidates and later, if they pass, publish as release (if a + <emphasis>-SNAPSHOT</emphasis> on the version string, deploy will + put the artifacts up into apache snapshot repos). </para> + </note> + <para>If the HBase version ends in <varname>-SNAPSHOT</varname>, the artifacts + go elsewhere. They are put into the Apache snapshots repository directly and + are immediately available.
Making a SNAPSHOT release, this is what you want + to happen.</para> + </step> + <step> + <title>If you used the <filename>make_rc.sh</filename> script instead of doing + the above manually, do your sanity checks now.</title> + <para> At this stage, you have two tarballs in your 'version directory' and a + set of artifacts in a staging area of the maven repository, in the 'closed' + state. These are publicly accessible in a temporary staging repository whose + URL you should have gotten in an email. The above mentioned script, + <filename>make_rc.sh</filename> does all of the above for you minus the + check of the artifacts built, the closing of the staging repository up in + maven, and the tagging of the release. If you run the script, do your checks + at this stage verifying the src and bin tarballs and checking what is up in + staging using hbase-downstreamer project. Tag before you start the build. + You can always delete it if the build goes haywire. </para> + </step> + <step> + <title>Sign and upload your version directory to <link + xlink:href="http://people.apache.org">people.apache.org</link>.</title> + <para> If all checks out, next put the <emphasis>version directory</emphasis> up + on <link xlink:href="http://people.apache.org">people.apache.org</link>. You + will need to sign and fingerprint them before you push them up. In the + <emphasis>version directory</emphasis> run the following commands: + </para> + <programlisting language="bourne"> +$ for i in *.tar.gz; do echo $i; gpg --print-mds $i > $i.mds ; done $ for i in *.tar.gz; do echo $i; gpg --armor --output $i.asc --detach-sig $i ; done $ cd .. # Presuming our 'version directory' is named 0.96.0RC0, now copy it up to people.apache.org. $ rsync -av 0.96.0RC0 people.apache.org:public_html - </programlisting> - </para> - <para>Make sure the people.apache.org directory is showing and that the - mvn repo urls are good. - Announce the release candidate on the mailing list and call a vote. 
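Before pushing the version directory up, it can be worth confirming that the signing loop produced the expected `.asc` and `.mds` companion for every tarball. This is an informal sketch, shown here against scratch placeholder files rather than real artifacts:

```shell
# Sketch: every tarball in the version directory should have .asc and .mds
# companions. Scratch files stand in for real release artifacts.
dir=$(mktemp -d); cd "$dir"
touch hbase-0.96.0-bin.tar.gz hbase-0.96.0-bin.tar.gz.asc hbase-0.96.0-bin.tar.gz.mds
missing=0
for i in *.tar.gz; do
  [ -f "$i.asc" ] && [ -f "$i.mds" ] || { echo "missing companion for $i"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "all artifacts signed"
```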
- </para> - </section> - <section xml:id="maven.snapshot"> - <title>Publishing a SNAPSHOT to maven</title> - <para>Make sure your <filename>settings.xml</filename> is set up properly (see above for how). - Make sure the hbase version includes <varname>-SNAPSHOT</varname> as a suffix. Here is how I published SNAPSHOTS of - a release that had an hbase version of 0.96.0 in its poms. - <programlisting>$ MAVEN_OPTS="-Xmx3g" mvn clean install -DskipTests javadoc:aggregate site assembly:single -Prelease - $ MAVEN_OPTS="-Xmx3g" mvn -DskipTests deploy -Papache-release</programlisting> -</para> -<para>The <filename>make_rc.sh</filename> script mentioned above in the - (see <xref linkend="maven.release"/>) can help you publish <varname>SNAPSHOTS</varname>. - Make sure your hbase.version has a <varname>-SNAPSHOT</varname> suffix and then run - the script. It will put a snapshot up into the apache snapshot repository for you. -</para> - </section> + </programlisting> + <para>Make sure the <link xlink:href="http://people.apache.org" + >people.apache.org</link> directory is showing and that the mvn repo + URLs are good. Announce the release candidate on the mailing list and call a + vote. </para> + </step> + </procedure> + </section> + <section xml:id="maven.snapshot"> + <title>Publishing a SNAPSHOT to maven</title> + <para>Make sure your <filename>settings.xml</filename> is set up properly, as in <xref + linkend="mvn.settings.file"/>. Make sure the hbase version includes + <varname>-SNAPSHOT</varname> as a suffix. Following is an example of publishing + SNAPSHOTS of a release that had an hbase version of 0.96.0 in its poms.</para> + <programlisting language="bourne"> + $ mvn clean install -DskipTests javadoc:aggregate site assembly:single -Prelease + $ mvn -DskipTests deploy -Papache-release</programlisting> + <para>The <filename>make_rc.sh</filename> script mentioned above (see <xref + linkend="maven.release"/>) can help you publish <varname>SNAPSHOTS</varname>. 
+ Make sure your <varname>hbase.version</varname> has a <varname>-SNAPSHOT</varname> + suffix before running the script. It will put a snapshot up into the apache snapshot + repository for you. </para> + </section> - </section> + </section> + <section xml:id="hbase.rc.voting"> + <title>Voting on Release Candidates</title> + <para> Everyone is encouraged to try and vote on HBase release candidates. Only the votes of + PMC members are binding. PMC members, please read this WIP doc on policy voting for a + release candidate, <link + xlink:href="https://github.com/rectang/asfrelease/blob/master/release.md">Release + Policy</link>. <quote>Before casting +1 binding votes, individuals are required to + download the signed source code package onto their own hardware, compile it as + provided, and test the resulting executable on their own platform, along with also + validating cryptographic signatures and verifying that the package meets the + requirements of the ASF policy on releases.</quote> Regarding the latter, run + <command>mvn apache-rat:check</command> to verify all files are suitably licensed. + See <link xlink:href="http://search-hadoop.com/m/DHED4dhFaU">HBase, mail # dev - On + recent discussion clarifying ASF release policy</link> for how we arrived at this + process. </para> + </section> <section xml:id="documentation"> <title>Generating the HBase Reference Guide</title> - <para>The manual is marked up using <link xlink:href="http://www.docbook.org/">docbook</link>. - We then use the <link xlink:href="http://code.google.com/p/docbkx-tools/">docbkx maven plugin</link> - to transform the markup to html.
This plugin is run when you specify the <command>site</command> - goal as in when you run <command>mvn site</command> or you can call the plugin explicitly to - just generate the manual by doing <command>mvn docbkx:generate-html</command> - (TODO: It looks like you have to run <command>mvn site</command> first because docbkx wants to - include a transformed <filename>hbase-default.xml</filename>. Fix). - When you run mvn site, we do the document generation twice, once to generate the multipage - manual and then again for the single page manual (the single page version is easier to search). - </para> + <para>The manual is marked up using <link xlink:href="http://www.docbook.org/" + >docbook</link>. We then use the <link + xlink:href="http://code.google.com/p/docbkx-tools/">docbkx maven plugin</link> to + transform the markup to html. This plugin is run when you specify the + <command>site</command> goal as in when you run <command>mvn site</command> or you + can call the plugin explicitly to just generate the manual by doing <command>mvn + docbkx:generate-html</command>. When you run <command>mvn site</command>, the + documentation is generated twice, once to generate the multipage manual and then again + for the single page manual, which is easier to search. See <xref + linkend="appendix_contributing_to_documentation"/> for more information on building + the documentation. 
</para> </section> <section xml:id="hbase.org"> <title>Updating <link xlink:href="http://hbase.apache.org">hbase.apache.org</link></title> @@ -461,35 +845,74 @@ $ rsync -av 0.96.0RC0 people.apache.org:public_html </section> </section> <section xml:id="hbase.tests"> - <title>Tests</title> - -<para> Developers, at a minimum, should familiarize themselves with the unit test detail; unit tests in -HBase have a character not usually seen in other projects.</para> - -<section xml:id="hbase.moduletests"> -<title>Apache HBase Modules</title> -<para>As of 0.96, Apache HBase is split into multiple modules which creates "interesting" rules for - how and where tests are written. If you are writing code for + <title>Tests</title> + + <para> Developers, at a minimum, should familiarize themselves with the unit test detail; + unit tests in HBase have a character not usually seen in other projects.</para> + <para>This information is about unit tests for HBase itself. For developing unit tests for + your HBase applications, see <xref linkend="unit.tests"/>.</para> + <section xml:id="hbase.moduletests"> + <title>Apache HBase Modules</title> + <para>As of 0.96, Apache HBase is split into multiple modules. This creates + "interesting" rules for how and where tests are written. If you are writing code for <classname>hbase-server</classname>, see <xref linkend="hbase.unittests"/> for - how to write your tests; these tests can spin up a minicluster and will need to be + how to write your tests. These tests can spin up a minicluster and will need to be categorized. For any other module, for example <classname>hbase-common</classname>, the tests must be strict unit tests and just test the class under test - no use of the HBaseTestingUtility or minicluster is allowed (or even possible given the dependency tree).</para> + <section xml:id="hbase.moduletest.shell"> + <title>Testing the HBase Shell</title> + <para> + The HBase shell and its tests are predominantly written in jruby. 
In order to make these + tests run as a part of the standard build, there is a single JUnit test, + <classname>TestShell</classname>, that takes care of loading the jruby implemented tests and + running them. You can run all of these tests from the top level with: + </para> + <programlisting language="bourne"> + mvn clean test -Dtest=TestShell + </programlisting> + <para> + Alternatively, you may limit the shell tests that run using the system variable + <classname>shell.test</classname>. This value may specify a particular test case by name. For + example, the tests that cover the shell commands for altering tables are contained in the test + case <classname>AdminAlterTableTest</classname> and you can run them with: + </para> + <programlisting language="bourne"> + mvn clean test -Dtest=TestShell -Dshell.test=AdminAlterTableTest + </programlisting> + <para> + You may also use a <link xlink:href= + "http://docs.ruby-doc.com/docs/ProgrammingRuby/html/language.html#UJ">Ruby Regular Expression + literal</link> (in the <classname>/pattern/</classname> style) to select a set of test cases. 
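To illustrate how a `/pattern/` value for `shell.test` selects test cases by name, the equivalent matching can be sketched with `grep -E`. Only `AdminAlterTableTest` appears in the text; the other names below are made up for the demonstration:

```shell
# Illustrative only: -Dshell.test=/.*Admin.*Test/ keeps test cases whose
# names match the pattern. AdminAlterTableTest is from the text; the
# other two names are invented examples.
selected=$(printf '%s\n' AdminAlterTableTest SecurityAdminTest TableShellTest \
  | grep -E '.*Admin.*Test')
printf '%s\n' "$selected"
```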
+ You can run all of the HBase admin-related tests, including both the normal administration and + the security administration, with the command: + </para> + <programlisting language="bourne"> + mvn clean test -Dtest=TestShell -Dshell.test=/.*Admin.*Test/ + </programlisting> + <para> + In the event of a test failure, you can see details by examining the XML version of the + surefire report results: + </para> + <programlisting language="bourne"> + vim hbase-shell/target/surefire-reports/TEST-org.apache.hadoop.hbase.client.TestShell.xml + </programlisting> + </section> <section xml:id="hbase.moduletest.run"> <title>Running Tests in other Modules</title> <para>If the module you are developing in has no other dependencies on other HBase modules, then you can cd into that module and just run:</para> - <programlisting>mvn test</programlisting> + <programlisting language="bourne">mvn test</programlisting> <para>which will just run the tests IN THAT MODULE. If there are other dependencies on other modules, then you will have to run the command from the ROOT HBASE DIRECTORY. This will run the tests in the other modules, unless you specify to skip the tests in that module. For instance, to skip the tests in the hbase-server module, you would run:</para> - <programlisting>mvn clean test -PskipServerTests</programlisting> + <programlisting language="bourne">mvn clean test -PskipServerTests</programlisting> <para>from the top level directory to run all the tests in modules other than hbase-server. Note that you can specify to skip tests in multiple modules as well as just for a single module. 
For example, to skip the tests in <classname>hbase-server</classname> and <classname>hbase-common</classname>, you would run:</para> - <programlisting>mvn clean test -PskipServerTests -PskipCommonTests</programlisting> + <programlisting language="bourne">mvn clean test -PskipServerTests -PskipCommonTests</programlisting> <para>Also, keep in mind that if you are running tests in the <classname>hbase-server</classname> module you will need to apply the maven profiles discussed in <xref linkend="hbase.unittests.cmds"/> to get the tests to run properly.</para> </section> @@ -502,7 +925,7 @@ integration with corresponding JUnit <link xlink:href="http://www.junit.org/node <classname>SmallTests</classname>, <classname>MediumTests</classname>, <classname>LargeTests</classname>, <classname>IntegrationTests</classname>. JUnit categories are denoted using java annotations and look like this in your unit test code.</para> -<programlisting>... +<programlisting language="java">... @Category(SmallTests.class) public class TestHRegionInfo { @Test @@ -510,108 +933,112 @@ public class TestHRegionInfo { // ... } }</programlisting> - <para>The above example shows how to mark a unit test as belonging to the small category. - All unit tests in HBase have a categorization. </para> - <para> The first three categories, small, medium, and large are for tests run when you - type <code>$ mvn test</code>; i.e. these three categorizations are for HBase unit - tests. The integration category is for not for unit tests but for integration tests. - These are run when you invoke <code>$ mvn verify</code>. Integration tests are - described in <xref - linkend="integration.tests" /> and will not be discussed further in this section - on HBase unit tests.</para> - <para> Apache HBase uses a patched maven surefire plugin and maven profiles to implement + <para>The above example shows how to mark a unit test as belonging to the + <literal>small</literal> category. 
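</para>
<para>The <code>@Category</code> annotation itself is ordinary Java: a class-level annotation naming one or more marker types, which the build tooling reads back at run time. The following self-contained sketch uses stand-in <classname>Category</classname>, <classname>SmallTests</classname>, and <classname>MediumTests</classname> types (not HBase's or JUnit's real classes) purely to show the mechanism:</para>

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.Arrays;

// Stand-in types that mimic JUnit's category mechanism. Illustration only.
public class CategoryDemo {
    interface SmallTests {}
    interface MediumTests {}

    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.TYPE)
    @interface Category {
        Class<?>[] value();
    }

    // A test class marked small, in the style of TestHRegionInfo above.
    @Category(SmallTests.class)
    static class TestHRegionInfoLike {}

    // How a runner can decide whether a class belongs to a category.
    static boolean inCategory(Class<?> testClass, Class<?> category) {
        Category c = testClass.getAnnotation(Category.class);
        return c != null && Arrays.asList(c.value()).contains(category);
    }

    public static void main(String[] args) {
        System.out.println(inCategory(TestHRegionInfoLike.class, SmallTests.class));  // true
        System.out.println(inCategory(TestHRegionInfoLike.class, MediumTests.class)); // false
    }
}
```

<para>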
All unit tests in HBase have a + categorization. </para> + <para> The first three categories, <literal>small</literal>, <literal>medium</literal>, + and <literal>large</literal>, are for tests run when you type <code>$ mvn + test</code>. In other words, these three categorizations are for HBase unit + tests. The <literal>integration</literal> category is not for unit tests, but for + integration tests. These are run when you invoke <code>$ mvn verify</code>. + Integration tests are described in <xref linkend="integration.tests"/>.</para> + <para>HBase uses a patched maven surefire plugin and maven profiles to implement its unit test characterizations. </para> - <para>Read the below to figure which annotation of the set small, medium, and large to + <para>Keep reading to figure out which annotation of the set small, medium, and large to put on your new HBase unit test. </para> - <section - xml:id="hbase.unittests.small"> - <title>Small Tests<indexterm><primary>SmallTests</primary></indexterm></title> - <para> - <emphasis>Small</emphasis> tests are executed in a shared JVM. We put in this - category all the tests that can be executed quickly in a shared JVM. The maximum - execution time for a small test is 15 seconds, and small tests should not use a - (mini)cluster.</para> - </section> + <variablelist> + <title>Categorizing Tests</title> + <varlistentry xml:id="hbase.unittests.small"> + <term>Small Tests<indexterm><primary>SmallTests</primary></indexterm></term> + <listitem> + <para> + <emphasis>Small</emphasis> tests are executed in a shared JVM. We put in + this category all the tests that can be executed quickly in a shared + JVM. 
The maximum execution time for a small test is 15 seconds, and + small tests should not use a (mini)cluster.</para> + </listitem> + </varlistentry> - <section - xml:id="hbase.unittests.medium"> - <title>Medium Tests<indexterm><primary>MediumTests</primary></indexterm></title> - <para><emphasis>Medium</emphasis> tests represent tests that must be executed before - proposing a patch. They are designed to run in less than 30 minutes altogether, - and are quite stable in their results. They are designed to last less than 50 - seconds individually. They can use a cluster, and each of them is executed in a - separate JVM. </para> - </section> + <varlistentry xml:id="hbase.unittests.medium"> + <term>Medium Tests<indexterm><primary>MediumTests</primary></indexterm></term> + <listitem> + <para><emphasis>Medium</emphasis> tests represent tests that must be + executed before proposing a patch. They are designed to run in less than + 30 minutes altogether, and are quite stable in their results. They are + designed to last less than 50 seconds individually. They can use a + cluster, and each of them is executed in a separate JVM. </para> + </listitem> + </varlistentry> - <section - xml:id="hbase.unittests.large"> - <title>Large Tests<indexterm><primary>LargeTests</primary></indexterm></title> - <para><emphasis>Large</emphasis> tests are everything else. They are typically - large-scale tests, regression tests for specific bugs, timeout tests, - performance tests. They are executed before a commit on the pre-integration - machines. They can be run on the developer machine as well. </para> - </section> - <section - xml:id="hbase.unittests.integration"> - <title>Integration - Tests<indexterm><primary>IntegrationTests</primary></indexterm></title> - <para><emphasis>Integration</emphasis> tests are system level tests. See <xref - linkend="integration.tests" /> for more info. 
</para> - </section> + <varlistentry xml:id="hbase.unittests.large"> + <term>Large Tests<indexterm><primary>LargeTests</primary></indexterm></term> + <listitem> + <para><emphasis>Large</emphasis> tests are everything else. They are + typically large-scale tests, regression tests for specific bugs, timeout + tests, or performance tests. They are execu
<TRUNCATED>
