This is an automated email from the ASF dual-hosted git repository.
github-bot pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/hive-site.git
The following commit(s) were added to refs/heads/asf-site by this push:
new ce17fb9 deploy: cc176b9981c2be369fe47910559e61b87c34b566
ce17fb9 is described below
commit ce17fb916c2c2a52a09853f8971bef3f9f260577
Author: okumin <[email protected]>
AuthorDate: Thu Apr 3 08:14:30 2025 +0000
deploy: cc176b9981c2be369fe47910559e61b87c34b566
---
development/index.html | 3 +
development/index.xml | 3 +-
development/qtest/index.html | 351 +++++++++++++++++++++++
docs/latest/developerguide_27362074/index.html | 107 +------
docs/latest/hivedeveloperfaq_27823747/index.html | 145 +---------
docs/latest/howtocontribute_27362107/index.html | 65 +----
docs/latest/index.xml | 6 +-
index.json | 2 +-
index.xml | 9 +-
qtest.html | 1 +
sitemap.xml | 2 +-
11 files changed, 396 insertions(+), 298 deletions(-)
diff --git a/development/index.html b/development/index.html
index 308075d..76df39a 100644
--- a/development/index.html
+++ b/development/index.html
@@ -124,6 +124,9 @@ ASF
<div class=content>
<h1 class=title>Developments</h1>
<ul>
+<li><a href=https://hive.apache.org/development/qtest/>Query File
Test (qtest)</a></li>
+</ul>
+<ul>
<li><a
href=https://hive.apache.org/development/quickstart/>QuickStarted</a></li>
</ul>
<ul>
diff --git a/development/index.xml b/development/index.xml
index 5e910b7..552f0e6 100644
--- a/development/index.xml
+++ b/development/index.xml
@@ -1,4 +1,5 @@
-<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0"
xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Developments on Hive
Site</title><link>https://hive.apache.org/development/</link><description>Recent
content in Developments on Hive Site</description><generator>Hugo --
gohugo.io</generator><language>en-us</language><lastBuildDate>Fri, 12 May 2023
17:51:06 +0530</lastBuildDate><atom:link
href="https://hive.apache.org/development/index.xml" rel="self" type=" [...]
+<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0"
xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Developments on Hive
Site</title><link>https://hive.apache.org/development/</link><description>Recent
content in Developments on Hive Site</description><generator>Hugo --
gohugo.io</generator><language>en-us</language><lastBuildDate>Fri, 28 Mar 2025
00:00:00 +0000</lastBuildDate><atom:link
href="https://hive.apache.org/development/index.xml" rel="self" type=" [...]
+Tutorial: How to run a specific test case Preparation Run a test case
Tutorial: How to add a new test case Add a QFile Generate a result file Verify
the new result file Commandline options Test options Test Iceberg, Accumulo, or
Kudu QTestOptionHandler: pre/post-processor Using test data Mask
non-deterministic outputs Advanced Locations of log files Negative tests How to
specify drivers How to use PostgreSQL/MySQL/Oracle as a backend database for
Hive Metastore Remote debug Tutorial: How [...]
Quickstart STEP 1: Pull the image Pull the image from DockerHub:
https://hub.docker.com/r/apache/hive/tags. Here are the latest images: 4.0.0
3.1.3 docker pull apache/hive:4.0.0
STEP 2: Export the Hive version export HIVE_VERSION=4.0.0
STEP 3: Launch the HiveServer2 with an embedded
Metastore.</description></item><item><title>Getting
Started</title><link>https://hive.apache.org/development/gettingstarted/</link><pubDate>Tue,
10 Jan 2023 12:35:11
+0530</pubDate><guid>https://hive.apache.org/development/gettingstarted/</guid><description>The
Apache Hive ™ data warehouse software facilitates reading, writing, and
managing large datasets residing in distributed storage using SQL. Structure
can be projected onto data alread [...]
diff --git a/development/qtest/index.html b/development/qtest/index.html
new file mode 100644
index 0000000..ea048b9
--- /dev/null
+++ b/development/qtest/index.html
@@ -0,0 +1,351 @@
+<!doctype html>
+<html>
+<head>
+<meta charset=utf-8>
+<meta http-equiv=x-ua-compatible content="IE=edge">
+<meta name=viewport content="width=device-width,initial-scale=1">
+<meta name=description content>
+<meta name=author content>
+<title>Query File Test (qtest)</title>
+<link rel=icon href=/images/hive.svg sizes=any type=image/svg+xml>
+<link rel=stylesheet href=https://hive.apache.org/css/hive-theme.css>
+<link rel=stylesheet href=https://hive.apache.org/css/font-awesome.all.min.css>
+<link rel=stylesheet href=https://hive.apache.org/css/bootstrap.min.css>
+<link rel=stylesheet href=https://hive.apache.org/css/termynal.css>
+<link rel=apple-touch-icon sizes=180x180
href=https://hive.apache.org/images/apple-touch-icon.png>
+<link rel=icon type=image/png sizes=32x32
href=https://hive.apache.org/images/favicon-32x32.png>
+<link rel=icon type=image/png sizes=16x16
href=https://hive.apache.org/images/favicon-16x16.png>
+<link rel=manifest href=https://hive.apache.org/images/site.webmanifest>
+<link rel=mask-icon href=https://hive.apache.org/images/safari-pinned-tab.svg
color=#5bbad5>
+<meta name=msapplication-TileColor content="#da532c">
+<meta name=theme-color content="#ffffff">
+<script>var
_paq=window._paq=window._paq||[];_paq.push(['disableCookies']),_paq.push(['trackPageView']),_paq.push(['enableLinkTracking']),function(){var
b="https://analytics.apache.org/",c,a,d;_paq.push(['setTrackerUrl',b+'matomo.php']),_paq.push(['setSiteId','30']),c=document,a=c.createElement('script'),d=c.getElementsByTagName('script')[0],a.async=!0,a.src=b+'matomo.js',d.parentNode.insertBefore(a,d)}()</script>
+</head>
+<body>
+<header>
+<menu style=background:#000;margin:0>
+<nav class="navbar navbar-expand-lg navbar-dark bg-black">
+<div class=container-fluid>
+<a href=https://hive.apache.org> <img
src=https://hive.apache.org/images/hive.svg width=60 height=35 alt="Apache
Software Foundation"></a>
+<a class="header-text navbar-brand" href=https://hive.apache.org>Apache
Hive</a>
+<button class=navbar-toggler type=button data-bs-toggle=collapse
data-bs-target=#navbarSupportedContent aria-controls=navbarSupportedContent
aria-expanded=false aria-label="Toggle navigation">
+<span class=navbar-toggler-icon></span>
+</button>
+<div class="collapse navbar-collapse" id=navbarSupportedContent>
+<ul class="navbar-nav me-auto mb-2 mb-lg-0">
+<li class="nav-item dropdown">
+<a class=nav-link href=/general/downloads id=navbarDropdown role=button
aria-expanded=false>
+Releases
+</a>
+</li>
+<li class="nav-item dropdown">
+<a class="nav-link dropdown-toggle" href=/Document id=navbarDropdown
role=button data-bs-toggle=dropdown aria-expanded=false>
+Documentation
+</a>
+<ul class=dropdown-menu aria-labelledby=navbarDropdown>
+<li><a class=dropdown-item href=/docs/latest/>Latest</a></li>
+<li><a class=dropdown-item
href=https://hive.apache.org/docs/javadocs/>Javadocs</a></li>
+<li><a class=dropdown-item
href=https://cwiki.apache.org/confluence/display/Hive/LanguageManual>Language
Manual</a></li>
+</ul>
+</li>
+<li class="nav-item dropdown">
+<a class="nav-link dropdown-toggle" href=/general id=navbarDropdown
role=button data-bs-toggle=dropdown aria-expanded=false>
+General
+</a>
+<ul class=dropdown-menu aria-labelledby=navbarDropdown>
+<li><a class=dropdown-item
href=https://www.apache.org/licenses/LICENSE-2.0.html>License</a></li>
+<li><a class=dropdown-item
href=https://hive.apache.org/general/privacypolicy/>Privacy Policy</a></li>
+</ul>
+</li>
+<li class="nav-item dropdown">
+<a class="nav-link dropdown-toggle" href=# id=navbarDropdown role=button
data-bs-toggle=dropdown aria-expanded=false>
+Development
+</a>
+<ul class=dropdown-menu aria-labelledby=navbarDropdown>
+<li><a class=dropdown-item
href=https://hive.apache.org/development/gettingstarted/>Getting
Started</a></li>
+<li><a class=dropdown-item
href=https://hive.apache.org/development/quickstart/>Quickstart with
Docker</a></li>
+<li><a class=dropdown-item
href=https://cwiki.apache.org/confluence/display/Hive/DesignDocs>Design
Docs</a></li>
+<li><a class=dropdown-item
href=https://issues.apache.org/jira/projects/HIVE/issues>Hive JIRA</a></li>
+<li><a class=dropdown-item
href=https://cwiki.apache.org/confluence/display/Hive/HiveDeveloperFAQ>Hive
Developer FAQ</a></li>
+<li><a class=dropdown-item
href=https://cwiki.apache.org/confluence/display/Hive/Hive+PreCommit+Patch+Testing>Precommit
Patch Testing</a></li>
+<li><a class=dropdown-item
href=https://hive.apache.org/development/versioncontrol/>Version
Control</a></li>
+</ul>
+</li>
+<li class="nav-item dropdown">
+<a class="nav-link dropdown-toggle" href=# id=navbarDropdown role=button
data-bs-toggle=dropdown aria-expanded=false>
+Community
+</a>
+<ul class=dropdown-menu aria-labelledby=navbarDropdown>
+<li><a class=dropdown-item href=/community/becomingcommitter/>Becoming A
Committer</a></li>
+<li><a class=dropdown-item
href=https://cwiki.apache.org/confluence/display/Hive/HowToContribute>How To
Contribute</a></li>
+<li><a class=dropdown-item
href=https://cwiki.apache.org/confluence/display/Hive/Home#Home-ResourcesforContributors>Resources
for Contributors</a></li>
+<li><a class=dropdown-item
href=https://hive.apache.org/community/mailinglists/>Mailing Lists</a></li>
+<li><a class=dropdown-item
href=https://hive.apache.org/community/issuetracking/>Issue Tracking</a></li>
+<li><a class=dropdown-item
href=https://hive.apache.org/community/people/>People</a></li>
+<li>
+<hr class=dropdown-divider>
+</li>
+<li><a class=dropdown-item href=/community/bylaws/>By Laws</a></li>
+<li><a class=dropdown-item
href=https://cwiki.apache.org/confluence/display/Hive/HowToRelease>How To
Release</a></li>
+</ul>
+</li>
+<li class="nav-item dropdown">
+<a class=nav-link href=https://hive.blog.apache.org/ id=navbarDropdown
role=button aria-expanded=false>
+Blogs
+</a>
+</li>
+<li class="nav-item dropdown">
+<a class="nav-link dropdown-toggle" href=# id=navbarDropdown role=button
data-bs-toggle=dropdown aria-expanded=false>
+ASF
+</a>
+<ul class=dropdown-menu aria-labelledby=navbarDropdown>
+<li><a class=dropdown-item
href=https://www.apache.org/foundation/contributing.html>Donations</a></li>
+<li><a class=dropdown-item
href=https://www.apache.org/foundation/sponsorship.html>Sponsorship</a></li>
+<li><a class=dropdown-item
href=https://www.apache.org/foundation/thanks.html>Thanks</a></li>
+<li><a class=dropdown-item href=https://www.apache.org/>Website</a></li>
+</ul>
+</li>
+<li>
+<form action=/search method=get class=search-bar>
+<input type=search name=q id=search-query placeholder=Search...
class=search-input>
+<button type=submit class=search-button>Search</button>
+</form>
+</li>
+</ul>
+</div>
+</div>
+</nav>
+</menu>
+</header>
+<div class=content>
+<div class=docs>
+<h1 id=query-file-testqtest>Query File Test (qtest)</h1>
+<p>Query File Test is a JUnit-based integration test suite for Apache Hive. Developers write arbitrary SQL statements in a query file; the testing framework runs them and verifies the result against an expected output file.</p>
+<aside class=table-of-contents>
+<nav id=TableOfContents>
+<ul>
+<li><a href=#tutorial-how-to-run-a-specific-test-case>Tutorial: How to run a
specific test case</a>
+<ul>
+<li><a href=#preparation>Preparation</a></li>
+<li><a href=#run-a-test-case>Run a test case</a></li>
+</ul>
+</li>
+<li><a href=#tutorial-how-to-add-a-new-test-case>Tutorial: How to add a new
test case</a>
+<ul>
+<li><a href=#add-a-qfile>Add a QFile</a></li>
+<li><a href=#generate-a-result-file>Generate a result file</a></li>
+<li><a href=#verify-the-new-result-file>Verify the new result file</a></li>
+</ul>
+</li>
+<li><a href=#commandline-options>Command-line options</a>
+<ul>
+<li><a href=#test-options>Test options</a></li>
+<li><a href=#test-iceberg-accumulo-or-kudu>Test Iceberg, Accumulo, or
Kudu</a></li>
+</ul>
+</li>
+<li><a href=#qtestoptionhandler-prepost-processor>QTestOptionHandler:
pre/post-processor</a>
+<ul>
+<li><a href=#using-test-data>Using test data</a></li>
+<li><a href=#mask-non-deterministic-outputs>Mask non-deterministic
outputs</a></li>
+</ul>
+</li>
+<li><a href=#advanced>Advanced</a>
+<ul>
+<li><a href=#locations-of-log-files>Locations of log files</a></li>
+<li><a href=#negative-tests>Negative tests</a></li>
+<li><a href=#how-to-specify-drivers>How to specify drivers</a></li>
+<li><a
href=#how-to-use-postgresqlmysqloracle-as-a-backend-database-for-hive-metastore>How
to use PostgreSQL/MySQL/Oracle as a backend database for Hive
Metastore</a></li>
+<li><a href=#remote-debug>Remote debug</a></li>
+</ul>
+</li>
+</ul>
+</nav>
+</aside>
+<h2 id=tutorial-how-to-run-a-specific-test-case>Tutorial: How to run a
specific test case</h2>
+<h3 id=preparation>Preparation</h3>
+<p>You have to compile Hive’s source code ahead of time.</p>
+<div class=highlight><pre tabindex=0
style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code
class=language-sh data-lang=sh>$ mvn clean install -Dmaven.javadoc.skip<span
style=color:#f92672>=</span>true -DskipTests -Pitests
+</code></pre></div><h3 id=run-a-test-case>Run a test case</h3>
+<p>Let’s try to run <a
href=https://github.com/apache/hive/blob/master/ql/src/test/queries/clientpositive/alter1.q>alter1.q</a>.</p>
+<div class=highlight><pre tabindex=0
style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code
class=language-sh data-lang=sh>$ mvn test -Pitests -pl itests/qtest
-Dtest<span style=color:#f92672>=</span>TestMiniLlapLocalCliDriver -Dqfile<span
style=color:#f92672>=</span>alter1.q
+</code></pre></div><p>The test should finish successfully.</p>
+<div class=highlight><pre tabindex=0
style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code
class=language-sh data-lang=sh><span style=color:#f92672>[</span>INFO<span
style=color:#f92672>]</span> Results:
+<span style=color:#f92672>[</span>INFO<span style=color:#f92672>]</span>
+<span style=color:#f92672>[</span>INFO<span style=color:#f92672>]</span> Tests
run: 1, Failures: 0, Errors: 0, Skipped: <span style=color:#ae81ff>0</span>
+</code></pre></div><h2 id=tutorial-how-to-add-a-new-test-case>Tutorial: How to
add a new test case</h2>
+<h3 id=add-a-qfile>Add a QFile</h3>
+<p>A QFile contains the set of SQL statements that you want to test. Typically, a new file goes in <code>ql/src/test/queries/clientpositive</code>.</p>
+<p>Let’s say you created the following file.</p>
+<div class=highlight><pre tabindex=0
style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code
class=language-sh data-lang=sh>$ cat ql/src/test/queries/clientpositive/aaa.q
+SELECT 1;
+</code></pre></div><h3 id=generate-a-result-file>Generate a result file</h3>
+<p>You can generate the expected output with
<code>-Dtest.output.overwrite=true</code>.</p>
+<div class=highlight><pre tabindex=0
style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code
class=language-sh data-lang=sh>$ mvn test -Pitests -pl itests/qtest
-Dtest<span style=color:#f92672>=</span>TestMiniLlapLocalCliDriver
-Dtest.output.overwrite<span style=color:#f92672>=</span>true -Dqfile<span
style=color:#f92672>=</span>aaa.q
+...
+$ cat ql/src/test/results/clientpositive/llap/aaa.q.out
+PREHOOK: query: SELECT <span style=color:#ae81ff>1</span>
+PREHOOK: type: QUERY
+PREHOOK: Input: _dummy_database@_dummy_table
+<span style=color:#75715e>#### A masked pattern was here ####</span>
+POSTHOOK: query: SELECT <span style=color:#ae81ff>1</span>
+POSTHOOK: type: QUERY
+POSTHOOK: Input: _dummy_database@_dummy_table
+<span style=color:#75715e>#### A masked pattern was here ####</span>
+<span style=color:#ae81ff>1</span>
+</code></pre></div><h3 id=verify-the-new-result-file>Verify the new result
file</h3>
+<p>You can ensure the generated result file is correct by rerunning the test
case without <code>-Dtest.output.overwrite=true</code>.</p>
+<div class=highlight><pre tabindex=0
style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code
class=language-sh data-lang=sh>$ mvn test -Pitests -pl itests/qtest
-Dtest<span style=color:#f92672>=</span>TestMiniLlapLocalCliDriver -Dqfile<span
style=color:#f92672>=</span>aaa.q
+</code></pre></div><h2 id=commandline-options>Command-line options</h2>
+<h3 id=test-options>Test options</h3>
+<table>
+<thead>
+<tr>
+<th>Option</th>
+<th>Description</th>
+<th>Example</th>
+</tr>
+</thead>
+<tbody>
+<tr>
+<td>test</td>
+<td>The class name of the test driver</td>
+<td><code>-Dtest=TestMiniLlapLocalCliDriver</code></td>
+</tr>
+<tr>
+<td>qfile</td>
+<td>The name(s) of Query Files</td>
+<td><code>-Dqfile=alter1.q</code>, <code>-Dqfile=alter1.q,alter2.q</code></td>
+</tr>
+<tr>
+<td>qfile_regex</td>
+<td>The pattern to list Query Files</td>
+<td><code>-Dqfile_regex=alter[0-9]</code></td>
+</tr>
+<tr>
+<td>test.output.overwrite</td>
+<td>Whether you want to (re)generate result files or not</td>
+<td><code>-Dtest.output.overwrite=true</code></td>
+</tr>
+<tr>
+<td>test.metastore.db</td>
+<td>Which RDBMS to be used as a metastore backend</td>
+<td>See <a
href=#how-to-use-postgresqlmysqloracle-as-a-backend-database-for-hive-metastore>How
to use PostgreSQL/MySQL/Oracle as a backend database for Hive
Metastore</a></td>
+</tr>
+</tbody>
+</table>
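As a toy sketch of how <code>qfile_regex</code> selects files (hypothetical file names; shell glob matching stands in for the framework's Java regex, and the two agree for a simple pattern like <code>alter[0-9]</code>; the pattern is matched against the basename without the <code>.q</code> suffix):

```shell
# Toy illustration with hypothetical file names: select QFiles whose basename
# (without the .q suffix) matches alter[0-9]. Glob matching stands in for the
# framework's Java regex; both agree for this simple pattern.
qfiles="alter1.q alter2.q alter_table.q rename1.q"
selected=""
for f in $qfiles; do
  base="${f%.q}"            # strip the .q suffix before matching
  case "$base" in
    alter[0-9]) selected="$selected $f" ;;
  esac
done
echo "selected:$selected"   # → selected: alter1.q alter2.q
```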
+<h3 id=test-iceberg-accumulo-or-kudu>Test Iceberg, Accumulo, or Kudu</h3>
+<p>Most test drivers are available in the <code>itests/qtest</code> project. However, there are some exceptions.</p>
+<table>
+<thead>
+<tr>
+<th>Driver</th>
+<th>Project</th>
+</tr>
+</thead>
+<tbody>
+<tr>
+<td>TestAccumuloCliDriver</td>
+<td>itests/qtest-accumulo</td>
+</tr>
+<tr>
+<td>TestIcebergCliDriver</td>
+<td>itests/qtest-iceberg</td>
+</tr>
+<tr>
+<td>TestIcebergLlapLocalCliDriver</td>
+<td>itests/qtest-iceberg</td>
+</tr>
+<tr>
+<td>TestIcebergLlapLocalCompactorCliDriver</td>
+<td>itests/qtest-iceberg</td>
+</tr>
+<tr>
+<td>TestIcebergNegativeCliDriver</td>
+<td>itests/qtest-iceberg</td>
+</tr>
+<tr>
+<td>TestKuduCliDriver</td>
+<td>itests/qtest-kudu</td>
+</tr>
+<tr>
+<td>TestKuduNegativeCliDriver</td>
+<td>itests/qtest-kudu</td>
+</tr>
+</tbody>
+</table>
+<p>When you use <code>TestIcebergLlapLocalCliDriver</code>, you have to specify <code>-pl itests/qtest-iceberg</code>.</p>
+<div class=highlight><pre tabindex=0
style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code
class=language-sh data-lang=sh>$ mvn test -Pitests -pl itests/qtest-iceberg
-Dtest<span style=color:#f92672>=</span>TestIcebergLlapLocalCliDriver
-Dqfile_regex<span style=color:#f92672>=</span>iceberg_bucket_map_join_8
+</code></pre></div><h2
id=qtestoptionhandler-prepost-processor>QTestOptionHandler:
pre/post-processor</h2>
+<p>We extend JUnit by implementing <a
href=https://github.com/apache/hive/blob/master/itests/util/src/main/java/org/apache/hadoop/hive/ql/qoption/QTestOptionHandler.java>QTestOptionHandlers</a>,
which are custom pre-processors and post-processors. This section explains a
couple of typical processors.</p>
+<h3 id=using-test-data>Using test data</h3>
+<p>When you add <code>--! qt:dataset:{table name}</code>, <a href=https://github.com/apache/hive/blob/master/itests/util/src/main/java/org/apache/hadoop/hive/ql/dataset/QTestDatasetHandler.java>QTestDatasetHandler</a> automatically sets up the test table. You can find the table definitions <a href=https://github.com/apache/hive/tree/master/data/files/datasets>here</a>.</p>
+<div class=highlight><pre tabindex=0
style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code
class=language-sql data-lang=sql><span style=color:#75715e>--! qt:dataset:src
+</span><span style=color:#75715e></span><span
style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span
style=color:#66d9ef>FROM</span> src;
+</code></pre></div><h3 id=mask-non-deterministic-outputs>Mask
non-deterministic outputs</h3>
+<p>Some test cases produce non-deterministic results. <a href=https://github.com/apache/hive/blob/master/itests/util/src/main/java/org/apache/hadoop/hive/ql/qoption/QTestReplaceHandler.java>QTestReplaceHandler</a> masks such non-deterministic parts. You can use it with a special comment prefixed with <code>--! qt:replace:</code>.</p>
+<p>For example, the result of <code>CURRENT_DATE</code> changes every day. With the comment below, the output becomes <code>non-deterministic-output #Masked#</code>, which is stable across executions.</p>
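The directive is a <code>/pattern/replacement/</code> substitution applied to the produced output. As a rough analogy only (the framework applies a Java regex internally, not sed), its effect can be reproduced like this:

```shell
# sed analogy (illustrative, not the framework's implementation) of
# --! qt:replace:/(non-deterministic-output\s)[0-9]{4}-[0-9]{2}-[0-9]{2}/$1#Masked#/
line='non-deterministic-output 2025-04-03'
masked=$(printf '%s\n' "$line" \
  | sed -E 's/(non-deterministic-output )[0-9]{4}-[0-9]{2}-[0-9]{2}/\1#Masked#/')
echo "$masked"
```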
+<div class=highlight><pre tabindex=0
style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code
class=language-sql data-lang=sql><span style=color:#75715e>--!
qt:replace:/(non-deterministic-output\s)[0-9]{4}-[0-9]{2}-[0-9]{2}/$1#Masked#/
+</span><span style=color:#75715e></span><span
style=color:#66d9ef>SELECT</span> <span
style=color:#e6db74>'non-deterministic-output'</span>, <span
style=color:#66d9ef>CURRENT_DATE</span>();
+</code></pre></div><h2 id=advanced>Advanced</h2>
+<h3 id=locations-of-log-files>Locations of log files</h3>
+<p>The Query File Test framework writes log files to the following locations.</p>
+<ul>
+<li><code>itests/{qtest, qtest-accumulo, qtest-iceberg,
qtest-kudu}/target/surefire-reports</code></li>
+<li>From the root of the source tree: <code>find . -name hive.log</code></li>
+</ul>
+<h3 id=negative-tests>Negative tests</h3>
+<p>Negative drivers let us verify that a test case fails as expected. For example, the query in <a href=https://github.com/apache/hive/blob/master/ql/src/test/queries/clientnegative/strict_timestamp_to_numeric.q>strict_timestamp_to_numeric.q</a> must fail according to Hive’s specification. We can use <code>TestNegativeLlapLocalCliDriver</code>, <code>TestIcebergNegativeCliDriver</code>, and so on.</p>
+<div class=highlight><pre tabindex=0
style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code
class=language-sh data-lang=sh>$ mvn -Pitests -pl itests/qtest test
-Dtest<span style=color:#f92672>=</span>TestNegativeLlapLocalCliDriver
-Dqfile<span style=color:#f92672>=</span>strict_timestamp_to_numeric.q
+</code></pre></div><h3 id=how-to-specify-drivers>How to specify drivers</h3>
+<p>We define the default mapping of Query Files to test drivers using <a
href=https://github.com/apache/hive/blob/master/itests/src/test/resources/testconfiguration.properties>testconfiguration.properties</a>
and <a
href=https://github.com/apache/hive/blob/master/itests/util/src/main/java/org/apache/hadoop/hive/cli/control/CliConfigs.java>CliConfigs</a>.
For example, <code>TestMiniLlapLocalCliDriver</code> is the default driver for
query files stored in <code>ql/src/test/queries/clientp [...]
+<p>You can override the mapping through <a
href=https://github.com/apache/hive/blob/master/itests/src/test/resources/testconfiguration.properties>testconfiguration.properties</a>.
For example, if you want to test
<code>ql/src/test/queries/clientpositive/aaa.q</code> not by LLAP but by Tez,
you must include the file name in <code>minitez.query.files</code> and generate
the result file with <code>-Dtest=TestMiniTezCliDriver</code>.</p>
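A sketch of that override (the entry name <code>minitez.query.files</code> comes from the paragraph above; <code>existing_test.q</code> is a hypothetical placeholder for the entries already listed there):

```properties
# itests/src/test/resources/testconfiguration.properties (excerpt, sketch)
minitez.query.files=existing_test.q,\
  aaa.q
```

After editing the file, regenerate the result with <code>mvn test -Pitests -pl itests/qtest -Dtest=TestMiniTezCliDriver -Dtest.output.overwrite=true -Dqfile=aaa.q</code>.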
+<p>In most cases, we should use <code>TestMiniLlapLocalCliDriver</code> for
positive tests and <code>TestNegativeLlapLocalCliDriver</code> for negative
tests.</p>
+<h3
id=how-to-use-postgresqlmysqloracle-as-a-backend-database-for-hive-metastore>How
to use PostgreSQL/MySQL/Oracle as a backend database for Hive Metastore</h3>
+<p>To run a test against a specific backend database, add the <code>-Dtest.metastore.db</code> parameter as in the following commands:</p>
+<div class=highlight><pre tabindex=0
style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code
class=language-sh data-lang=sh>$ mvn test -Pitests -pl itests/qtest
-Dtest<span style=color:#f92672>=</span>TestCliDriver -Dqfile<span
style=color:#f92672>=</span>partition_params_postgres.q
-Dtest.metastore.db<span style=color:#f92672>=</span>postgres
+$ mvn test -Pitests -pl itests/qtest -Dtest<span
style=color:#f92672>=</span>TestCliDriver -Dqfile<span
style=color:#f92672>=</span>partition_params_postgres.q
-Dtest.metastore.db<span style=color:#f92672>=</span>mssql
+$ mvn test -Pitests -pl itests/qtest -Dtest<span
style=color:#f92672>=</span>TestCliDriver -Dqfile<span
style=color:#f92672>=</span>partition_params_postgres.q
-Dtest.metastore.db<span style=color:#f92672>=</span>mysql
+$ mvn test -Pitests -pl itests/qtest -Dtest<span
style=color:#f92672>=</span>TestCliDriver -Dqfile<span
style=color:#f92672>=</span>partition_params_postgres.q
-Dtest.metastore.db<span style=color:#f92672>=</span>oracle
-Ditest.jdbc.jars<span
style=color:#f92672>=</span>/path/to/your/oracle/jdbc/driver/ojdbc6.jar
+</code></pre></div><h3 id=remote-debug>Remote debug</h3>
+<p>Remote debugging through Query File Test is a powerful way to debug Hive. With the following command, Query File Test listens on port 5005 and waits for a debugger to attach.</p>
+<div class=highlight><pre tabindex=0
style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code
class=language-sh data-lang=sh>$ mvn -Pitests -pl itests/qtest
-Dmaven.surefire.debug test -Dtest<span
style=color:#f92672>=</span>TestMiniLlapLocalCliDriver -Dqfile<span
style=color:#f92672>=</span>alter1.q
+</code></pre></div>
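For reference, <code>-Dmaven.surefire.debug</code> makes Surefire fork the test JVM with JDWP options roughly like the fragment below (the exact flags depend on the Surefire version); you can then attach an IDE debugger, or <code>jdb</code>, to <code>localhost:5005</code>.

```
# Approximate JDWP options Surefire passes to the forked test JVM (version-dependent)
-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005
```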
+</div>
+</div>
+<footer class="black-background static-bottom" style=padding:30px>
+<div class=row>
+<div class=col-3>
+<a href=https://www.apache.org/>
+<img src=https://hive.apache.org/images/asf_logo.png width=270 height=100
alt="Apache Software Foundation"></a>
+</div>
+<div class=col-9>
+<p class=footer-text>Apache is a non-profit organization helping open-source
+software projects released under the Apache
+<a href=https://www.apache.org/licenses/>license</a>
+and managed with
+<a href=https://www.apache.org/foundation/how-it-works.html>
+open governance</a> and
+<a href=https://privacy.apache.org/policies/privacy-policy-public.html>
+privacy policy</a>. See upcoming
+<a href=https://www.apache.org/events/current-event>Apache Events</a>.
+If you discover any
+<a href=https://www.apache.org/security/>security</a> vulnerabilities, please
+report them privately. Finally,
+<a href=https://www.apache.org/foundation/sponsorship.html>thanks
+</a> to the sponsors who
+<a href=https://www.apache.org/foundation/contributing.html>
+donate</a> to the Apache Foundation.
+</p>
+</div>
+</div>
+<div class="copyright row">
+<a href=https://hive.apache.org style=color:grey>
+The contents of this website are © 2023 Apache Software Foundation under the
terms of the Apache License v2. Apache Hive and its logo are trademarks of the
Apache Software Foundation.
+</a>
+</div>
+</footer>
+<script src=https://hive.apache.org/js/bootstrap.bundle.min.js></script>
+</body>
+</html>
\ No newline at end of file
diff --git a/docs/latest/developerguide_27362074/index.html
b/docs/latest/developerguide_27362074/index.html
index 5d2e285..c82d1d2 100644
--- a/docs/latest/developerguide_27362074/index.html
+++ b/docs/latest/developerguide_27362074/index.html
@@ -121,35 +121,15 @@ ASF
<div class=content>
<div class=docs>
<h1 id=apache-hive--developerguide>Apache Hive : DeveloperGuide</h1>
-<h1 id=developer-guide>Developer Guide</h1>
-<ul>
-<li><a href=#developer-guide>Developer Guide</a>
+<aside class=table-of-contents>
+<nav id=TableOfContents>
<ul>
<li><a href=#code-organization-and-a-brief-architecture>Code Organization and
a Brief Architecture</a>
<ul>
<li><a href=#introduction>Introduction</a></li>
-<li><a href=#hive-serde>Hive SerDe</a>
-<ul>
-<li><a href=#how-to-write-your-own-serde>How to Write Your Own SerDe</a></li>
-<li><a href=#objectinspector>ObjectInspector</a></li>
-<li><a href=#registration-of-native-serdes>Registration of Native
SerDes</a></li>
-</ul>
-</li>
+<li><a href=#hive-serde>Hive SerDe</a></li>
<li><a href=#metastore>MetaStore</a></li>
-<li><a href=#query-processor>Query Processor</a>
-<ul>
-<li><a href=#compiler>Compiler</a></li>
-<li><a href=#parser>Parser</a></li>
-<li><a href=#typechecking>TypeChecking</a></li>
-<li><a href=#semantic-analysis>Semantic Analysis</a></li>
-<li><a href=#plan-generation>Plan generation</a></li>
-<li><a href=#task-generation>Task generation</a></li>
-<li><a href=#execution-engine>Execution Engine</a></li>
-<li><a href=#plan>Plan</a></li>
-<li><a href=#operators>Operators</a></li>
-<li><a href=#udfs-and-udafs>UDFs and UDAFs</a></li>
-</ul>
-</li>
+<li><a href=#query-processor>Query Processor</a></li>
</ul>
</li>
<li><a href=#compiling-and-running-hive>Compiling and Running Hive</a>
@@ -162,15 +142,7 @@ ASF
<li><a href=#unit-tests-and-debugging>Unit tests and debugging</a>
<ul>
<li><a href=#layout-of-the-unit-tests>Layout of the unit tests</a></li>
-<li><a href=#running-unit-tests>Running unit tests</a></li>
-<li><a href=#adding-new-unit-tests>Adding new unit tests</a></li>
-<li><a href=#debugging-hive-code>Debugging Hive Code</a>
-<ul>
-<li><a href=#debugging-client-side-code>Debugging Client-Side Code</a></li>
-<li><a href=#debugging-server-side-code>Debugging Server-Side Code</a></li>
-<li><a href=#debugging-without-ant-client-and-server-side>Debugging without
Ant (Client and Server Side)</a></li>
-</ul>
-</li>
+<li><a href=#debugging-hive-code>Debugging Hive Code</a></li>
</ul>
</li>
<li><a href=#pluggable-interfaces>Pluggable interfaces</a>
@@ -182,8 +154,8 @@ ASF
</ul>
</li>
</ul>
-</li>
-</ul>
+</nav>
+</aside>
<h2 id=code-organization-and-a-brief-architecture>Code Organization and a
Brief Architecture</h2>
<h3 id=introduction>Introduction</h3>
<p>Hive has 3 main components:</p>
@@ -391,70 +363,7 @@ ASF
</code></pre><p>Then you can run
‘<code>build/dist/bin/hive</code>’ and it will work against your
local file system.</p>
<h2 id=unit-tests-and-debugging>Unit tests and debugging</h2>
<h3 id=layout-of-the-unit-tests>Layout of the unit tests</h3>
-<p>Hive uses <a href=http://junit.org/>JUnit</a> for unit tests. Each of the 3
main components of Hive have their unit test implementations in the
corresponding src/test directory e.g. trunk/metastore/src/test has all the unit
tests for metastore, trunk/serde/src/test has all the unit tests for serde and
trunk/ql/src/test has all the unit tests for the query processor. The metastore
and serde unit tests provide the TestCase implementations for JUnit. The query
processor tests on the othe [...]
-<ul>
-<li>Test Queries:
-<ul>
-<li>queries/clientnegative - This directory contains the query files (.q
files) for the negative test cases. These are run through the CLI classes and
therefore test the entire query processor stack.</li>
-<li>queries/clientpositive - This directory contains the query files (.q
files) for the positive test cases. Thesre are run through the CLI classes and
therefore test the entire query processor stack.</li>
-<li>qureies/positive (Will be deprecated) - This directory contains the query
files (.q files) for the positive test cases for the compiler. These only test
the compiler and do not run the execution code.</li>
-<li>queries/negative (Will be deprecated) - This directory contains the query
files (.q files) for the negative test cases for the compiler. These only test
the compiler and do not run the execution code.</li>
-</ul>
-</li>
-<li>Test Results:
-<ul>
-<li>results/clientnegative - The expected results from the queries in
queries/clientnegative.</li>
-<li>results/clientpositive - The expected results from the queries in
queries/clientpositive.</li>
-<li>results/compiler/errors - The expected results from the queries in
queries/negative.</li>
-<li>results/compiler/parse - The expected Abstract Syntax Tree output for the
queries in queries/positive.</li>
-<li>results/compiler/plan - The expected query plans for the queries in
queries/positive.</li>
-</ul>
-</li>
-<li>Velocity Templates to Generate the Tests:
-<ul>
-<li>templates/TestCliDriver.vm - Generates the tests from
queries/clientpositive.</li>
-<li>templates/TestNegativeCliDriver.vm - Generates the tests from
queries/clientnegative.</li>
-<li>templates/TestParse.vm - Generates the tests from queries/positive.</li>
-<li>templates/TestParseNegative.vm - Generates the tests from
queries/negative.</li>
-</ul>
-</li>
-</ul>
-<h3 id=running-unit-tests>Running unit tests</h3>
-<p>Ant to Maven</p>
-<p>As of version <a
href=https://issues.apache.org/jira/browse/HIVE-5107>0.13</a> Hive uses Maven
instead of Ant for its build. The following instructions are not up to date.</p>
-<p>See the <a href=#hive-developer-faq>Hive Developer FAQ</a> for updated
instructions.</p>
-<p>Run all tests:</p>
-<pre tabindex=0><code>ant package test
-
-</code></pre><p>Run all positive test queries:</p>
-<pre tabindex=0><code>ant test -Dtestcase=TestCliDriver
-
-</code></pre><p>Run a specific positive test query:</p>
-<pre tabindex=0><code>ant test -Dtestcase=TestCliDriver -Dqfile=groupby1.q
-
-</code></pre><p>The above test produces the following files:</p>
-<ul>
-<li><code>build/ql/test/TEST-org.apache.hadoop.hive.cli.TestCliDriver.txt</code>
- Log output for the test. This can be helpful when examining test
failures.</li>
-<li><code>build/ql/test/logs/groupby1.q.out</code> - Actual query result for
the test. This result is compared to the expected result as part of the
test.</li>
-</ul>
-<p>Run the set of unit tests matching a regex, e.g. partition_wise_fileformat
tests 10-16:</p>
-<pre tabindex=0><code>ant test -Dtestcase=TestCliDriver
-Dqfile_regex=partition_wise_fileformat1[0-6]
-
-</code></pre><p>Note that this option matches against the basename of the test
without the .q suffix.</p>
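The basename matching described above can be illustrated with a small shell sketch. The file names are made up, and the shell `case` glob is only an approximation of the Java regex for this simple pattern:

```shell
# Toy demo: -Dqfile_regex matches against the test's basename with the .q
# suffix removed. File names below are illustrative; the shell "case" glob
# merely approximates the Java regex for this simple pattern.
selected=""
skipped=""
for f in partition_wise_fileformat10.q partition_wise_fileformat16.q partition_wise_fileformat17.q; do
  base=${f%.q}                           # strip the .q suffix, as the driver does
  case $base in
    partition_wise_fileformat1[0-6]) echo "selected: $f"; selected="$selected $f" ;;
    *)                               echo "skipped:  $f"; skipped="$skipped $f" ;;
  esac
done
```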
-<p>Apparently the Hive tests do not run successfully after a clean unless you
run <code>ant package</code> first. Not sure why build.xml doesn’t encode
this dependency.</p>
-<h3 id=adding-new-unit-tests>Adding new unit tests</h3>
-<p>Ant to Maven</p>
-<p>As of version <a
href=https://issues.apache.org/jira/browse/HIVE-5107>0.13</a> Hive uses Maven
instead of Ant for its build. The following instructions are not up to date.</p>
-<p>See the <a href=#hive-developer-faq>Hive Developer FAQ</a> for updated
instructions. See also <a
href=https://hive.apache.org/docs/latest/tipsforaddingnewtests_27362060/>Tips
for Adding New Tests in Hive</a> and <a
href=#how-to-contribute:-add-a-unit-test>How to Contribute: Add a Unit
Test</a>.</p>
-<p>First, write a new myname.q in ql/src/test/queries/clientpositive.</p>
-<p>Then, run the test with the query and overwrite the result (useful when you
add a new test).</p>
-<pre tabindex=0><code>ant test -Dtestcase=TestCliDriver -Dqfile=myname.q
-Doverwrite=true
-
-</code></pre><p>Then we can create a patch by:</p>
-<pre tabindex=0><code>svn add ql/src/test/queries/clientpositive/myname.q
ql/src/test/results/clientpositive/myname.q.out
-svn diff > patch.txt
-
-</code></pre><p>Similarly, to add negative client tests, write a new query
input file in ql/src/test/queries/clientnegative and run the same command, this
time specifying the testcase name as TestNegativeCliDriver instead of
TestCliDriver. Note that for negative client tests, the output file if created
using the overwrite flag can be be found in the directory
ql/src/test/results/clientnegative.</p>
+<p>Hive uses <a href=http://junit.org/>JUnit</a> for unit tests. Each of the
three main components of Hive has its unit tests in the corresponding src/test
directory, e.g., trunk/metastore/src/test holds the unit tests for the
metastore, trunk/serde/src/test those for serde, and trunk/ql/src/test those
for the query processor. The metastore and serde unit tests provide the
TestCase implementations for JUnit.</p>
<h3 id=debugging-hive-code>Debugging Hive Code</h3>
<p>Hive code includes both client-side code (e.g., compiler, semantic
analyzer, and optimizer of HiveQL) and server-side code (e.g.,
operator/task/SerDe implementations). Debugging is different for client-side
and server-side code, as described below.</p>
<h4 id=debugging-client-side-code>Debugging Client-Side Code</h4>
diff --git a/docs/latest/hivedeveloperfaq_27823747/index.html
b/docs/latest/hivedeveloperfaq_27823747/index.html
index cf08514..6428889 100644
--- a/docs/latest/hivedeveloperfaq_27823747/index.html
+++ b/docs/latest/hivedeveloperfaq_27823747/index.html
@@ -121,9 +121,8 @@ ASF
<div class=content>
<div class=docs>
<h1 id=apache-hive--hivedeveloperfaq>Apache Hive : HiveDeveloperFAQ</h1>
-<h1 id=hive-developer-faq>Hive Developer FAQ</h1>
-<ul>
-<li><a href=#hive-developer-faq>Hive Developer FAQ</a>
+<aside class=table-of-contents>
+<nav id=TableOfContents>
<ul>
<li><a href=#developing>Developing</a>
<ul>
@@ -139,7 +138,10 @@ ASF
<li><a href=#how-to-generate-tarball>How to generate tarball?</a></li>
<li><a href=#how-to-generate-protobuf-code>How to generate protobuf
code?</a></li>
<li><a href=#how-to-generate-thrift-code>How to generate Thrift code?</a></li>
-<li><a href=#how-to-run-findbugs-after-a-change?>How to run findbugs after a
change?</a></li>
+</ul>
+</li>
+<li><a
href=#httpsissuesapacheorgjirasecureviewavatarsizexsmallavatarid21141avatartypeissuetypehive-26769httpsissuesapacheorgjirabrowsehive-26769srcconfmacro><a
href="https://issues.apache.org/jira/browse/HIVE-26769?src=confmacro"><img
src="https://issues.apache.org/jira/secure/viewavatar?size=xsmall&avatarId=21141&avatarType=issuetype"
alt>HIVE-26769</a></a>
+<ul>
<li><a href=#how-to-compile-odbc>How to compile ODBC?</a></li>
<li><a href=#how-do-i-publish-hive-artifacts-to-my-local-maven-repository>How
do I publish Hive artifacts to my local Maven repository?</a></li>
</ul>
@@ -152,29 +154,19 @@ ASF
<li><a href=#how-do-i-run-all-of-the-unit-tests>How do I run all of the unit
tests?</a></li>
<li><a
href=#how-do-i-run-all-of-the-unit-tests-except-for-a-certain-few-tests>How do
I run all of the unit tests except for a certain few tests?</a></li>
<li><a href=#how-do-i-run-the-clientpositiveclientnegative-unit-tests>How do I
run the clientpositive/clientnegative unit tests?</a></li>
-<li><a href=#how-do-i-run-with-postgremysqloracle>How do I run with
Postgre/MySQL/Oracle?</a></li>
-<li><a href=#how-do-i-remote-debug-a-qtest>How do I remote debug a
qtest?</a></li>
-<li><a href=#how-do-i-modify-the-init-script-when-testing>How do I modify the
init script when testing?</a></li>
-<li><a href=#how-do-i-update-the-output-of-a-clidriver-testcase>How do I
update the output of a CliDriver testcase?</a></li>
-<li><a href=#how-do-i-update-the-results-of-many-test-cases>How do I update
the results of many test cases?</a></li>
-<li><a href=#where-is-the-log-output-of-a-test>Where is the log output of a
test?</a></li>
-<li><a href=#how-do-i-add-a-test-case>How do I add a test case?</a></li>
<li><a href=#why-isnt-the-itests-pom-connected-to-the-root-pom>Why isn’t
the itests pom connected to the root pom?</a></li>
<li><a
href=#why-does-a-test-fail-with-a-nullpointerexception-in-minidfscluster>Why
does a test fail with a NullPointerException in MiniDFSCluster?</a></li>
-<li><a href=#why-do-spark-unit-tests-fail-with-a-securityexception>Why do
Spark unit tests fail with a SecurityException?</a></li>
</ul>
</li>
<li><a href=#debugging>Debugging</a>
<ul>
-<li><a href=#how-do-i-debug-into-a-single-test-in-eclipse?>How do I debug into
a single test in Eclipse?</a></li>
+<li><a href=#how-do-i-debug-into-asingle-testin-eclipse>How do I debug into a
single test in Eclipse?</a></li>
<li><a href=#how-do-i-debug-my-queries-in-hive>How do I debug my queries in
Hive?</a></li>
</ul>
</li>
</ul>
-</li>
-</ul>
-<p>Maven</p>
-<p>Run the test and generate the output file using the appropriate
<code>-Dtest</code>Hive is using <a href=#maven>Maven</a> as its build tool.
Versions prior to 0.13 were using Ant.</p>
+</nav>
+</aside>
<h2 id=developing>Developing</h2>
<h3 id=how-do-i-move-some-files>How do I move some files?</h3>
<p>Post a patch for testing purposes which simply does adds and deletes. SVN
will not understand that these patches are actually moves, so you should
upload the following, <strong>in order</strong>, so that the last upload is
the patch for testing purposes:</p>
@@ -275,14 +267,9 @@ mvn clean install -DskipTests
</code></pre><h2 id=testing>Testing</h2>
<p>For general information, see <a href=#unit-tests-and-debugging>Unit Tests
and Debugging</a> in the Developer Guide.</p>
<h3 id=how-do-i-run-precommit-tests-on-a-patch>How do I run precommit tests on
a patch?</h3>
-<p>Hive precommit testing is triggered automatically when a file is uploaded
to the JIRA ticket:</p>
-<ol>
-<li>Attach the patch file to a JIRA ticket: in the ticket’s
“More” tab, select “Attach Files” and use “Choose
File” to upload the file, then add a descriptive comment.</li>
-<li>Put the patch in the review queue: click the “Submit Patch”
button. The button name will change to “Cancel Patch” and the
ticket status will change to Patch Available.</li>
-</ol>
-<p>See <a
href=https://hive.apache.org/docs/latest/hive-precommit-patch-testing_33295252/>Hive
PreCommit Patch Testing</a> for more detail.</p>
+<p>A Jenkins job will start when you create a pull request on GitHub.</p>
<h3 id=how-do-i-rerun-precommit-tests-over-the-same-patch>How do I rerun
precommit tests over the same patch?</h3>
-<p>For patch updates, our convention is to number them like HIVE-1856.1.patch,
HIVE-1856.2.patch, etc. And then click the “Submit Patch” button
again when a new one is uploaded; this makes sure it gets back into the review
queue.</p>
+<p>You can ask a committer to rerun it, or amend the tip commit with
<code>git commit --amend</code> and force-push the branch.</p>
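The amend-and-push flow above can be sketched in a throwaway repository; the JIRA id, commit messages, and the remote named in the trailing comment are hypothetical stand-ins:

```shell
# Throwaway-repo demo: amending rewrites the branch tip, so a subsequent
# force-push presents a new head commit to CI. The JIRA id and the remote
# in the trailing comment are hypothetical.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.email=dev@example.com -c user.name=dev \
    commit -q --allow-empty -m "HIVE-XXXXX: fix"
before=$(git rev-parse HEAD)
git -c user.email=dev@example.com -c user.name=dev \
    commit -q --amend --allow-empty -m "HIVE-XXXXX: fix (retrigger)"
after=$(git rev-parse HEAD)
[ "$before" != "$after" ] && echo "branch tip rewritten"
# Against a real PR branch you would then run:
#   git push --force-with-lease origin HEAD
```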
<h3 id=how-do-i-run-a-single-test>How do I run a single test?</h3>
<p>ITests</p>
<p>Note that any test in the itests directory needs to be executed from within
the itests directory. The pom is disconnected from the parent project for
technical reasons. Single test class:</p>
@@ -298,114 +285,16 @@ mvn clean install -DskipTests
<pre tabindex=0><code>mvn test
cd itests
mvn test
-
</code></pre><p>Note that you need to have previously built and installed the
jars:</p>
<pre tabindex=0><code>mvn clean install -DskipTests
cd itests
mvn clean install -DskipTests
-
</code></pre><h3
id=how-do-i-run-all-of-the-unit-tests-except-for-a-certain-few-tests>How do I
run all of the unit tests except for a certain few tests?</h3>
<p>Similar to running all tests, but define test.excludes.additional to
specify a test/pattern to exclude from the test run. For example the following
will run all tests except for the CliDriver tests:</p>
<pre tabindex=0><code>cd itests
mvn test -Dtest.excludes.additional='**/Test*CliDriver.java'
</code></pre><h3
id=how-do-i-run-the-clientpositiveclientnegative-unit-tests>How do I run the
clientpositive/clientnegative unit tests?</h3>
-<p>All of the below require that you have previously run <code>ant
package</code>.</p>
-<p>To run clientpositive tests</p>
-<pre tabindex=0><code>cd itests/qtest
-mvn test -Dtest=TestCliDriver
-</code></pre><p>To run a single clientnegative test alter1.q</p>
-<pre tabindex=0><code>cd itests/qtest
-mvn test -Dtest=TestNegativeCliDriver -Dqfile=alter1.q
-</code></pre><p>To run all of the clientpositive tests that match a regex, for
example the partition_wise_fileformat tests</p>
-<pre tabindex=0><code>cd itests/qtest
-mvn test -Dtest=TestCliDriver -Dqfile_regex=partition_wise_fileformat.*
-
-# Alternatively, you can specify comma separated list with "-Dqfile"
argument
-mvn test -Dtest=TestMiniLlapLocalCliDriver
-Dqfile='vectorization_0.q,vectorization_17.q,vectorization_8.q'
-</code></pre><p>To run a single contrib test alter1.q and overwrite the result
file</p>
-<pre tabindex=0><code>cd itests/qtest
-mvn test -Dtest=TestContribCliDriver -Dqfile=alter1.q
-Dtest.output.overwrite=true
-</code></pre><h3 id=how-do-i-run-with-postgremysqloracle>How do I run with
Postgre/MySQL/Oracle?</h3>
-<p>To run a test with a specified DB, add the
&ldquo;-Dtest.metastore.db&rdquo; parameter as in the following commands:</p>
-<pre tabindex=0><code>mvn test -Pitests -pl itests/qtest -Dtest=TestCliDriver
-Dqfile=partition_params_postgres.q -Dtest.metastore.db=postgres
-
-mvn test -Pitests -pl itests/qtest -Dtest=TestCliDriver
-Dqfile=partition_params_postgres.q -Dtest.metastore.db=mssql
-mvn test -Pitests -pl itests/qtest -Dtest=TestCliDriver
-Dqfile=partition_params_postgres.q -Dtest.metastore.db=mysql
-mvn test -Pitests -pl itests/qtest -Dtest=TestCliDriver
-Dqfile=partition_params_postgres.q -Dtest.metastore.db=oracle
-Ditest.jdbc.jars=/path/to/your/god/damn/oracle/jdbc/driver/ojdbc6.jar
-</code></pre><p>Without specifying -Dqfile, it will run all .q files.</p>
-<h3 id=how-do-i-remote-debug-a-qtest>How do I remote debug a qtest?</h3>
-<pre tabindex=0><code>cd itests/qtest
-mvn -Dmaven.surefire.debug="-Xdebug
-Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent
-Djava.compiler=NONE" test -Dtest=TestCliDriver -Dqfile=<test>.q
-</code></pre><h3 id=how-do-i-modify-the-init-script-when-testing>How do I
modify the init script when testing?</h3>
-<p>The option to skip the init script or supply a custom init script was added
in Hive 2.0 (see <a
href=https://issues.apache.org/jira/browse/HIVE-11538>HIVE-11538</a>).</p>
-<p>To skip initialization:</p>
-<pre tabindex=0><code>mvn test -Dtest=TestCliDriver -Phadoop-2
-Dqfile=test_to_run.q -DinitScript=
-</code></pre><p>To supply a custom script:</p>
-<pre tabindex=0><code>mvn test -Dtest=TestCliDriver -Phadoop-2
-Dtest.output.overwrite=true -Dqfile=test_to_run.q
-DinitScript=custom_script.sql
-</code></pre><h3 id=how-do-i-update-the-output-of-a-clidriver-testcase>How do
I update the output of a CliDriver testcase?</h3>
-<pre tabindex=0><code>cd itests/qtest
-mvn test -Dtest=TestCliDriver -Dqfile=alter1.q -Dtest.output.overwrite=true
-
-</code></pre><h3 id=how-do-i-update-the-results-of-many-test-cases>How do I
update the results of many test cases?</h3>
-<p>Assume that you have a file like below which you’d like to
re-generate output files for. Such a file could be created by copying the
output from the precommit tests.</p>
-<pre tabindex=0><code>head -2 /tmp/failed-TestCliDriver-file-tests
-org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_allcolref_in_udf
-org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_annotate_stats_join
-</code></pre><p>You can re-generate all those output files in batches of 30
with the command below:</p>
-<pre tabindex=0><code>egrep 'TestCliDriver'
/tmp/failed-TestCliDriver-file-tests | perl -pe 's@.*testCliDriver_@@g' | awk
'{print $1 ".q"}' | xargs -n 30 | perl -pe 's@ @,@g' | xargs -I{} mvn
test -Dtest=TestCliDriver -Dtest.output.overwrite=true -Dqfile={}
-</code></pre><p>To do the same from the output of a precommit result, with
multiple drivers, you can do</p>
-<pre tabindex=0><code>import re
-from itertools import groupby
-s = """
-org.apache.hadoop.hive.cli.TestBeeLineDriver.testCliDriver[drop_with_concurrency]
(batchId=231)
-org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[comments] (batchId=35)
-org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[vector_if_expr]
(batchId=141)
-"""
-PAT = re.compile("org.apache.hadoop.hive.cli.([^\.]*).*\[([^\]]*).*")
-l = [PAT.match(x.strip()) for x in s.split("\n") if x.strip()]
-for driver,q in groupby(sorted([a.groups() for a in l if a]), key=lambda
a:a[0]):
- print """mvn clean test -Dtest=%s '-Dqfile=%s'
-Dtest.output.overwrite=true""" % (driver,
",".join(["%s.q" % a[1] for a in q]))
-
-</code></pre><h3 id=where-is-the-log-output-of-a-test>Where is the log output
of a test?</h3>
-<p>Logs are put in a couple of locations:</p>
-<ul>
-<li>From the root of the source tree: <code>find . -name hive.log</code></li>
-<li><code>/tmp/$USER/</code> (Linux) or <code>$TMPDIR/$USER/</code>
(MacOS)</li>
-</ul>
-<p>See <a href=#hive-logging>Hive Logging</a> for details about
log files, including alternative configurations.</p>
-<h3 id=how-do-i-add-a-test-case>How do I add a test case?</h3>
-<p>First, add the test case to the qfile test suite:</p>
-<ul>
-<li>Copy the test to a new file under
<code>ql/src/test/queries/clientpositive/<filename>.q</code> (or
<code>/clientnegative</code> if it is a negative test).
-<ul>
-<li>If the new test creates any table, view, function, etc., make sure that
the name is unique across tests. For instance, name a table in the test file
<code>foo.q</code>, <code>foo_t1</code> instead of simply <code>t1</code>. This
will help reduce flakiness in the test runs, since Jenkins runs tests in
batches and currently does not restore the former state after running each
of the q files.</li>
-<li>If there is any interaction with file system, use unique folders for the
test to avoid any collision with other tests.</li>
-</ul>
-</li>
-<li>Add the <code><filename.q></code> to
<code>itests/src/test/resources/testconfiguration.properties</code> to the
appropriate variable (ex. <code>minimr.query.files</code>).</li>
-</ul>
-<p>Next, generate the golden (output) file the first time you run the test
case:</p>
-<ul>
-<li>Compile the Hive source from the top level Hive directory:</li>
-</ul>
-<pre tabindex=0><code>mvn clean install -DskipTests
-</code></pre><ul>
-<li>Compile the itests:</li>
-</ul>
-<pre tabindex=0><code>cd itests
-mvn clean install -DskipTests
-</code></pre><ul>
-<li>Run the test and generate the output file using the appropriate
<code>-Dtest</code> (ex. <code>TestCliDriver</code>; see
<code>itests/qtest/pom.xml</code> for the names of other test suites):</li>
-</ul>
-<pre tabindex=0><code>cd qtest
-mvn test -Dtest=TestCliDriver -Dqfile=<filename>.q
-Dtest.output.overwrite=true
-</code></pre><ul>
-<li>Add your output file to
<code>ql/src/test/results/clientpositive/<filename>.q.out</code> (or
<code>/clientnegative</code> if it is a negative test).</li>
-</ul>
-<p>With the above steps, you can <a href=#create-a-patch>create a patch</a>
which has a <code>.java</code> file, a <code>.q</code> file and a
<code>.q.out</code> file.</p>
+<p><a href=/development/qtest/>See this page</a>.</p>
<h3 id=why-isnt-the-itests-pom-connected-to-the-root-pom>Why isn’t the
itests pom connected to the root pom?</h3>
<p>It would be great to have it connected, but it would make it harder to use
<strong>mvn test</strong> locally. The best option would be to utilize the
failsafe plugin for integration testing; but it needs a bit different setup,
and it’s harder to use for now…. If you’d like to give that a
try, by all means, go ahead.</p>
<p>There is an option to attach all the itest subprojects to the main project
by enabling this with <strong>-Pitests</strong> (<a
href=https://issues.apache.org/jira/browse/HIVE-13490>HIVE-13490</a>).</p>
@@ -420,15 +309,7 @@ mvn test -Dtest=TestCliDriver -Dqfile=<filename>.q
-Dtest.output.overwrite
at
org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:284)
at
org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:124)
-</code></pre><h3 id=why-do-spark-unit-tests-fail-with-a-securityexception>Why
do Spark unit tests fail with a SecurityException?</h3>
-<p>If you get the following errors in the unit tests:</p>
-<pre tabindex=0><code>2016-01-07T09:51:49,365 ERROR [Driver[]]:
spark.SparkContext (Logging.scala:logError(96)) - Error initializing
SparkContext.
-java.lang.SecurityException: class
"javax.servlet.FilterRegistration"'s signer information does not
match signer information of other classes in the same package
-
-</code></pre><p>It happens because there are two conflicting versions of the
same classes “javax.servlet:servlet-api” and
“org.eclipse.jetty.orbit:javax-servlet”. Spark requires the eclipse
version, but most tools including Hadoop and Jetty depend on the javax one. To
avoid this problem, we need to exclude the javax version everywhere it comes
up. Fortunately, maven has a tool to do that with:</p>
-<pre tabindex=0><code>mvn dependency:tree -Dverbose
-</code></pre><p>which prints out the dependency tree. Go to each directory
with the failing unit tests and search the dependency tree for
“javax.servlet:servlet-api” and use exclusions in the pom.xml to
remove it. See <a
href=https://issues.apache.org/jira/browse/HIVE-12783>HIVE-12783</a> for an
example.</p>
-<h2 id=debugging>Debugging</h2>
+</code></pre><h2 id=debugging>Debugging</h2>
<h3 id=how-do-i-debug-into-asingle-testin-eclipse>How do I debug into a single
test in Eclipse?</h3>
<p>You can debug into a single JUnit test in <a
href=https://www.eclipse.org/users/>Eclipse</a> by first making sure
you’ve built the Eclipse files and imported the project into Eclipse as
described <a href=#here>here</a>. Then set one or more breakpoints, highlight
the method name of the JUnit test method you want to debug into, and do
<code>Run->Debug</code>.</p>
<p>Another useful method to debug these tests is to attach a remote debugger.
When you run the test, enable the debug mode for surefire by passing in
“<code>-Dmaven.surefire.debug</code>”. Additional details on how to
turning on debugging for surefire can be found <a
href=http://maven.apache.org/surefire/maven-surefire-plugin/examples/debugging.html>here</a>.
Now when you run the tests, it will wait with a message similar to</p>
diff --git a/docs/latest/howtocontribute_27362107/index.html
b/docs/latest/howtocontribute_27362107/index.html
index 61507d5..8eaa338 100644
--- a/docs/latest/howtocontribute_27362107/index.html
+++ b/docs/latest/howtocontribute_27362107/index.html
@@ -123,8 +123,8 @@ ASF
<h1 id=apache-hive--howtocontribute>Apache Hive : HowToContribute</h1>
<h1 id=how-to-contribute-to-apache-hive>How to Contribute to Apache Hive</h1>
<p>This page describes the mechanics of <em>how</em> to contribute software to
Apache Hive. For ideas about <em>what</em> you might contribute, please see
open tickets in <a href=https://issues.apache.org/jira/browse/HIVE>Jira</a>.</p>
-<ul>
-<li><a href=#how-to-contribute-to-apache-hive>How to Contribute to Apache
Hive</a>
+<aside class=table-of-contents>
+<nav id=TableOfContents>
<ul>
<li><a href=#getting-the-source-code>Getting the Source Code</a></li>
<li><a href=#becoming-a-contributor>Becoming a Contributor</a></li>
@@ -133,21 +133,9 @@ ASF
<li><a href=#coding-conventions>Coding Conventions</a></li>
<li><a href=#understanding-maven>Understanding Maven</a></li>
<li><a href=#understanding-hive-branches>Understanding Hive Branches</a></li>
-<li><a href=#hadoop-dependencies>Hadoop Dependencies</a>
-<ul>
-<li><a href=#branch-1>branch-1</a></li>
-<li><a href=#branch-2>branch-2</a></li>
-</ul>
-</li>
+<li><a href=#hadoop-dependencies>Hadoop Dependencies</a></li>
<li><a href=#unit-tests>Unit Tests</a></li>
-<li><a href=#add-a-unit-test>Add a Unit Test</a>
-<ul>
-<li><a href=#java-unit-test>Java Unit Test</a></li>
-<li><a href=#query-unit-test>Query Unit Test</a></li>
-<li><a href=#beeline-query-unit-test>Beeline Query Unit Test</a></li>
-</ul>
-</li>
-<li><a href=#debugging>Debugging</a></li>
+<li><a href=#add-a-unit-test>Add a Unit Test</a></li>
<li><a href=#submitting-a-pr>Submitting a PR</a></li>
<li><a href=#fetching-a-pr-from-github>Fetching a PR from Github</a></li>
</ul>
@@ -161,8 +149,8 @@ ASF
<li><a href=#generating-thrift-code>Generating Thrift Code</a></li>
<li><a href=#see-also>See Also</a></li>
</ul>
-</li>
-</ul>
+</nav>
+</aside>
<h2 id=getting-the-source-code>Getting the Source Code</h2>
<p>First of all, you need the Hive source code.</p>
<p>Get the source code on your local drive using git. See <a
href=#understanding-hive-branches>Understanding Hive Branches</a> below to
understand which branch you should be using.</p>
@@ -275,45 +263,8 @@ mvn test -Dtest=SomeTest
<li>Add a new class (name must start with <code>Test</code>) in the
component’s <code>*/src/test/java</code> directory.</li>
<li>To test only the new testcase, run <code>mvn test -Dtest=TestAbc</code>
(where <code>TestAbc</code> is the name of the new class), which will be faster
than <code>mvn test</code> which tests all testcases.</li>
</ul>
-<h4 id=query-unit-test>Query Unit Test</h4>
-<p>If the new feature can be tested using a Hive query in the command line, we
just need to add a new <code>*.q</code> file and a new <code>*.q.out</code>
file.</p>
-<p>If the feature is added in <code>ql</code> (query language):</p>
-<ul>
-<li>Add a new <code>XXXXXX.q</code> file in
<code>ql/src/test/queries/clientpositive</code>. (Optionally, add a new
<code>XXXXXX.q</code> file for a query that is expected to fail in
<code>ql/src/test/queries/clientnegative</code>.)</li>
-<li>Run <code>mvn test -Dtest=TestMiniLlapLocalCliDriver -Dqfile=XXXXXX.q
-Dtest.output.overwrite=true</code>. This will generate a new
<code>XXXXXX.q.out</code> file in
<code>ql/src/test/results/clientpositive</code>.
-<ul>
-<li>If you want to run multiple .q files in the test run, you can specify
comma-separated .q files, for example <code>-Dqfile="X1.q,X2.q"</code>. You can
also specify a Java regex, for example <code>-Dqfile_regex='join.*'</code>.
(Note that it takes a Java regex, i.e., <code>'join.*'</code> and not
<code>'join*'</code>.) The regex match first removes the <code>.q</code> from
the file name before matching, so specifying <code>join*.q</code> will
not work.</li>
-</ul>
-</li>
-</ul>
-<p>If the feature is added in <code>contrib</code>:</p>
-<ul>
-<li>Do the steps above, replacing <code>ql</code> with <code>contrib</code>,
and <code>TestCliDriver</code> with <code>TestContribCliDriver</code>.</li>
-</ul>
-<p>See the FAQ “<a href=#how-do-i-add-a-test-case?>How do I add a test
case?</a>” for more details.</p>
-<h4 id=beeline-query-unit-test>Beeline Query Unit Test</h4>
-<p>Legacy query test drivers (all of them except TestBeeLineDriver) use
HiveCli to run the tests. TestBeeLineDriver runs the tests using the <a
href=#beeline-client>Beeline client</a> and creates a specific database for
them, so the tests can run in parallel. When running the tests you have the
following configuration options:</p>
-<ul>
-<li><code>-Dqfile=XXXXXX.q</code> - To run one or more specific query file
tests. For the exact format, check the Query Unit Test paragraph. If not
provided only those query files from
<code>ql/src/test/queries/clientpositive</code> directory will be run which are
mentioned in
<code>itests/src/test/resources/testconfiguration.properties</code> in the
<code>beeline.positive.include</code> parameter.</li>
-<li><code>-Dtest.output.overwrite=true</code> - This will rewrite the output
of the q.out files in <code>ql/src/test/results/clientpositive/beeline</code>.
The default value is false, in which case the current output is checked
against the golden files.</li>
-<li><code>-Dtest.beeline.compare.portable</code> - If this parameter is true,
the generated and the golden query output files will be filtered before
comparing them. This way the existing query tests can be run against different
configurations using the same golden output files. The result of the following
commands will be filtered out from the output files: EXPLAIN, DESCRIBE,
DESCRIBE EXTENDED, DESCRIBE FORMATTED, SHOW TABLES, SHOW FORMATTED INDEXES and
SHOW DATABASES.<br>
-The default value is <code>false</code>.</li>
-<li><code>-Djunit.parallel.threads=1</code> - The number of parallel
threads running the tests. The default is <code>1</code>; there has been some
flakiness caused by parallelization.</li>
-<li><code>-Djunit.parallel.timeout=10</code> - The tests are terminated after
the given timeout. The parameter is set in minutes and the default is 10
minutes. (As of <a href=https://issues.apache.org/jira/browse/HIVE-17072>HIVE
3.0.0</a>.)</li>
-<li>The BeeLine tests can run against an existing cluster or, if one is not
provided, against a MiniHS2 cluster created during the tests.
-<ul>
-<li><code>-Dtest.beeline.url</code> - The jdbc url which should be used to
connect to the existing cluster. If not set then a MiniHS2 cluster will be
created instead.</li>
-<li><code>-Dtest.beeline.user</code> - The user which should be used to
connect to the cluster. If not set <code>"user"</code> will be used.</li>
-<li><code>-Dtest.beeline.password</code> - The password which should be used
to connect to the cluster. If not set <code>"password"</code> will be used.</li>
-<li><code>-Dtest.data.dir</code> - The test data directory on the cluster. If
not set <code><HIVEROOT>/data/files</code> will be used.</li>
-<li><code>-Dtest.results.dir</code> - The test results directory to compare
against. If not set the default configuration will be used.</li>
-<li><code>-Dtest.init.script</code> - The test init script. If not set the
default configuration will be used.</li>
-<li><code>-Dtest.beeline.shared.database</code> - If true, then the default
database will be used, otherwise a test-specific database will be created for
every run. The default value is false.</li>
-</ul>
-</li>
-</ul>
-<h3 id=debugging>Debugging</h3>
-<p>Please see <a href=#debugging-hive-code>Debugging Hive code</a> in
Development Guide.</p>
+<h4 id=query-file-testqtest>Query File Test(qtest)</h4>
+<p><a href=/development/qtest/>You can run end-to-end integration tests using
LLAP, Tez, Iceberg, etc.</a></p>
<h3 id=submitting-a-pr>Submitting a PR</h3>
<p>There are many excellent howtos about how to submit pull requests for
GitHub projects. The following is one way to approach it:</p>
<p>Set up a repo with two remotes; I would recommend using the GitHub user
name as the remote name, as that makes things easier if you need to add
someone else&rsquo;s repo as well.</p>
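A minimal sketch of that two-remote layout, using throwaway local repositories in place of github.com; the user name "myuser" and all paths are hypothetical stand-ins:

```shell
# Demo of the two-remote setup in throwaway local repos; "myuser" stands in
# for your GitHub fork and the temporary paths stand in for github.com.
set -e
tmp=$(mktemp -d)
git init -q "$tmp/upstream"
git -C "$tmp/upstream" -c user.email=dev@example.com -c user.name=dev \
    commit -q --allow-empty -m "init"
git clone -q "$tmp/upstream" "$tmp/work"
cd "$tmp/work"
git remote rename origin apache        # the canonical repo
git remote add myuser "$tmp/upstream"  # stand-in for your fork
remotes=$(git remote)
echo "$remotes"
```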
diff --git a/docs/latest/index.xml b/docs/latest/index.xml
index d9f15b8..679b1ca 100644
--- a/docs/latest/index.xml
+++ b/docs/latest/index.xml
@@ -47,7 +47,7 @@ T is a partitioned table by date and hour, and Tsignal is an
external table whic
Hive Architecture Hive Data Model Metastore Motivation Metadata Objects
Metastore Architecture Metastore Interface Hive Query Language Compiler
Optimizer Hive APIs Figure 1
Hive Architecture Figure 1 shows the major components of Hive and its
interactions with Hadoop. As shown in that figure, the main components of Hive
are:</description></item><item><title>Apache Hive :
DesignDocs</title><link>https://hive.apache.org/docs/latest/designdocs_27362075/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/designdocs_27362075/</guid><description>Apache
Hive : DesignDocs Hive Design Documents Proposals that appear in [...]
Completed Views (HIVE-1143) Partitioned Views (HIVE-1941) Storage Handlers
(HIVE-705) HBase Integration HBase Bulk Load Locking (HIVE-1293) Indexes
(HIVE-417) Bitmap Indexes (HIVE-1803) Filter Pushdown (HIVE-279) Table-level
Statistics (HIVE-1361) Dynamic Partitions Binary Data Type (HIVE-2380) Decimal
Precision and Scale Support HCatalog (formerly Howl) HiveServer2 (HIVE-2935)
Column Statistics in Hive (HIVE-1362) List Bucketing (HIVE-3026) Group By With
Rollup (HIVE-2397) Enhanced Aggr [...]
-Hive Developer Guide Code organization and architecture Compiling and running
Hive Unit tests Debugging Hive code Pluggable interfaces Hive Developer FAQ
Moving files Building Hive Testing Hive MiniDriver and Beeline tests Plugin
Developer Kit Writing UDTFs Hive on Spark: Getting
Started</description></item><item><title>Apache Hive :
DeveloperGuide</title><link>https://hive.apache.org/docs/latest/developerguide_27362074/</link><pubDate>Thu,
12 Dec 2024 00:00:00 +0000</pubDate><guid>https [...]
+Hive Developer Guide Code organization and architecture Compiling and running
Hive Unit tests Debugging Hive code Pluggable interfaces Hive Developer FAQ
Moving files Building Hive Testing Hive MiniDriver and Beeline tests Plugin
Developer Kit Writing UDTFs Hive on Spark: Getting
Started</description></item><item><title>Apache Hive :
DeveloperGuide</title><link>https://hive.apache.org/docs/latest/developerguide_27362074/</link><pubDate>Thu,
12 Dec 2024 00:00:00 +0000</pubDate><guid>https [...]
Meeting Minutes April 18, 2012 December 5, 2011 September 7, 2011 July 26,
2011 June 30, 2011 April 25, 2011 January 11, 2011 (forgot to take notes)
October 25, 2010 September 13, 2010 August 8, 2010 July 6, 2010 June 1,
2010</description></item><item><title>Apache Hive : Development
ContributorsMeetings
HiveContributorsMinutes100601</title><link>https://hive.apache.org/docs/latest/development-contributorsmeetings-hivecontributorsminutes100601_27362084/</link><pubDate>Thu,
12 Dec 2024 00 [...]
The following people were present:
Facebook (Paul Yang; Ning Zhang; Yongqiang He; Ahmed Aly; John Sichi; Ashish
Thusoo; Namit Jain) Netflix (Eva Tse; Jerome Boulon) Cloudera (Arvind
Prabhakar; Vinithra Varadharajan; Carl Steinbach) Yahoo (Alan Gates) The
following were the main meeting minutes:
@@ -143,7 +143,7 @@ Facebook (Paul Yang; Ning Zhang; Yongqiang He; Ahmed Aly;
John Sichi; Ashish Thu
We should have these meetings more often, say every month. Cloudera will host
the next meeting.</description></item><item><title>Apache Hive :
HiveContributorsMinutes100706</title><link>https://hive.apache.org/docs/latest/hivecontributorsminutes100706_27362065/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/hivecontributorsminutes100706_27362065/</guid><description>Apache
Hive : HiveContributorsMinutes100706 Attendees: Amr Awadallah, Joh [...]
bc Wong gave a live demo of Cloudera&rsquo;s Hue framework and the Beeswax
Hive web interface. Slides from this talk are available here:
http://www.slideshare.net/cwsteinbach/cloudera-huebeeswax Hue was recently
released as open source. The code is available on Github here:
http://github.com/cloudera/hue Olga Natkovich gave a whiteboard talk on
HOwl.</description></item><item><title>Apache Hive :
HiveCounters</title><link>https://hive.apache.org/docs/latest/hivecounters_67636835/</li
[...]
For Tez execution, %context is set to the mapper/reducer name. For other
execution engines it is not included in the counter name.
-Counter Name Description RECORDS_IN[_%context] Input records read
RECORDS_OUT[_%context] Output records written
RECORDS_OUT_INTERMEDIATE[_%context] Records written as intermediate records to
ReduceSink (which become input records to other tasks) CREATED_FILES Number of
files created DESERIALIZE_ERRORS Deserialization errors encountered while
reading data</description></item><item><title>Apache Hive :
HiveDerbyServerMode</title><link>https://hive.apache.org/docs/latest/hivederbyservermode
[...]
+Counter Name Description RECORDS_IN[_%context] Input records read
RECORDS_OUT[_%context] Output records written
RECORDS_OUT_INTERMEDIATE[_%context] Records written as intermediate records to
ReduceSink (which become input records to other tasks) CREATED_FILES Number of
files created DESERIALIZE_ERRORS Deserialization errors encountered while
reading data</description></item><item><title>Apache Hive :
HiveDerbyServerMode</title><link>https://hive.apache.org/docs/latest/hivederbyservermode
[...]
To see how the JDBC interface can be used, see sample code.
Integration with Pentaho Download pentaho report designer from the pentaho
website. Overwrite report-designer.</description></item><item><title>Apache
Hive :
HiveODBC</title><link>https://hive.apache.org/docs/latest/hiveodbc_27362099/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/hiveodbc_27362099/</guid><description>Apache
Hive : HiveODBC Hive ODBC Driver Hive ODBC Driver Introduction Suggested
Reading Software Requirements Driver Arch [...]
package com.example.hive.udf; import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text; public final class Lower extends UDF { public
Text evaluate(final Text s) { if (s == null) { return null; } return new
Text(s.toString().toLowerCase()); } } (Note that there&rsquo;s already a
built-in function for this, it&rsquo;s just an easy
example).</description></item><item><title>Apache Hive :
HiveQL</title><link>https://hive.apache.org/docs/latest/hiveql_27362097/</li
[...]
@@ -155,7 +155,7 @@ How-to article Provide step-by-step guidance for completing
a task.
Add how-to article</description></item><item><title>Apache Hive :
Howl</title><link>https://hive.apache.org/docs/latest/howl_27362109/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/howl_27362109/</guid><description>Apache
Hive : Howl This page collects some pointers to resources about Howl (an
effort to create a metastore for all of Hadoop) and how its first incarnation
is being built by reusing and extending Hive&rsquo;s metastore [...]
Howl wiki Yahoo group for Howl developers (including mailing list archive)
Howl source code at github Howl CLI functional spec Original plans for Owl
(predecessor to Howl)</description></item><item><title>Apache Hive :
HowToCommit</title><link>https://hive.apache.org/docs/latest/howtocommit_27362108/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/howtocommit_27362108/</guid><description>Apache
Hive : HowToCommit Guide for Hive Committers [...]
New committers New committers are encouraged to first read Apache&rsquo;s
generic committer documentation:</description></item><item><title>Apache Hive :
HowToContribute</title><link>https://hive.apache.org/docs/latest/howtocontribute_27362107/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/howtocontribute_27362107/</guid><description>Apache
Hive : HowToContribute How to Contribute to Apache Hive This page describes
the mechanics of [...]
-How to Contribute to Apache Hive Getting the Source Code Becoming a
Contributor Making Changes Coding Conventions Understanding Maven Understanding
Hive Branches Hadoop Dependencies branch-1 branch-2 Unit Tests Add a Unit Test
Java Unit Test Query Unit Test Beeline Query Unit Test Debugging Submitting a
PR Fetching a PR from Github Contributing Your Work JIRA Guidelines Generating
Thrift Code See Also Getting the Source Code First of all, you need the Hive
source code.</description></ite [...]
+Getting the Source Code Becoming a Contributor Making Changes Coding
Conventions Understanding Maven Understanding Hive Branches Hadoop Dependencies
Unit Tests Add a Unit Test Submitting a PR Fetching a PR from Github
Contributing Your Work JIRA Guidelines Generating Thrift Code See Also Getting
the Source Code First of all, you need the Hive source
code.</description></item><item><title>Apache Hive :
HowToRelease</title><link>https://hive.apache.org/docs/latest/howtorelease_27362106/</l
[...]
Storage API Release Storage API Prepare Master Branch Storage API Branching
Making Storage API Release Artifacts Publishing the Storage API Artifacts
Preparing Branch for further development Cleaning Up Storage API Artifacts Hive
Release Preparation Branching Updating Release Branch Building Voting Verifying
the Release Candidate Publishing Archive old releases Preparing Branch for
Future Maintenance Release See Also Hadoop Version
Warning</description></item><item><title>Apache Hive : H [...]
Materialized views with automatic rewriting can result in very similar
results. Hive 2.3.0 adds support for materialized views. Using columnar file
formats (Parquet, ORC) – they can do selective scanning; they may even skip
entire files/blocks.</description></item><item><title>Apache Hive : IndexDev
Bitmap</title><link>https://hive.apache.org/docs/latest/indexdev-bitmap_27362028/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/indexdev-bit [...]
Introduction Approach Proposal First implementation Second iteration Example
Introduction This document explains the proposed design for adding a bitmap
index handler (https://issues.apache.org/jira/browse/HIVE-1803).
diff --git a/index.json b/index.json
index 83880dc..dacf50f 100644
--- a/index.json
+++ b/index.json
@@ -1 +1 @@
-[{"categories":null,"contents":"Apache Hive : AboutThisWiki How to get
permission to edit How to edit the Hive wiki Advanced links How to find
documentation tasks How to export the Hive wiki History This page provides
information about the Hive wiki, including how to edit and export as well as a
brief history of the wiki. Note that the Hive website is separate from the Hive
wiki. See How to edit the website for information on editing the website.\nHow
to get permission to edit Crea [...]
\ No newline at end of file
+[{"categories":null,"contents":"Query File Test(qtest) Query File Test is a
JUnit-based integration test suite for Apache Hive. Developers write any SQL;
the testing framework runs it and verifies the result and output.\n Tutorial:
How to run a specific test case Preparation Run a test case Tutorial: How to
add a new test case Add a QFile Generate a result file Verify the new result
file Commandline options Test options Test Iceberg, Accumulo, or Kudu
QTestOptionHandler: pre/pos [...]
\ No newline at end of file
diff --git a/index.xml b/index.xml
index e5757d7..6654512 100644
--- a/index.xml
+++ b/index.xml
@@ -1,4 +1,5 @@
-<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0"
xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Apache Hive on Hive
Site</title><link>https://hive.apache.org/</link><description>Recent content in
Apache Hive on Hive Site</description><generator>Hugo --
gohugo.io</generator><language>en-us</language><lastBuildDate>Fri, 27 Jan 2023
19:16:15 +0530</lastBuildDate><atom:link
href="https://hive.apache.org/index.xml" rel="self"
type="application/rss+xml"/><ite [...]
+<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0"
xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Apache Hive on Hive
Site</title><link>https://hive.apache.org/</link><description>Recent content in
Apache Hive on Hive Site</description><generator>Hugo --
gohugo.io</generator><language>en-us</language><lastBuildDate>Fri, 27 Jan 2023
19:16:15 +0530</lastBuildDate><atom:link
href="https://hive.apache.org/index.xml" rel="self"
type="application/rss+xml"/><ite [...]
+Tutorial: How to run a specific test case Preparation Run a test case
Tutorial: How to add a new test case Add a QFile Generate a result file Verify
the new result file Commandline options Test options Test Iceberg, Accumulo, or
Kudu QTestOptionHandler: pre/post-processor Using test data Mask
non-deterministic outputs Advanced Locations of log files Negative tests How to
specify drivers How to use PostgreSQL/MySQL/Oracle as a backend database for
Hive Metastore Remote debug Tutorial: How [...]
Make Apache Hive’s data model and metadata services accessible to users of the
Apache Pig dataflow programming language as well as other Hadoop language
runtimes. Make it possible for Hive users and users of other Hadoop language
runtimes to share data stored in Hive’s HDFS data
warehouse.</description></item><item><title>Apache Hive :
AccumuloIntegration</title><link>https://hive.apache.org/docs/latest/accumulointegration_46633569/</link><pubDate>Thu,
12 Dec 2024 00:00:00 +0000</pubDate [...]
Version Note Introduction Changes From Hive 2 to Hive 3 General Configuration
RDBMS Option 1: Embedding Derby Option 2: External RDBMS Supported RDBMSs
Installing and Upgrading the Metastore Schema Running the Metastore Embedded
Mode Metastore Server High Availability Securing the Service Running the
Metastore Without Hive Performance Optimizations CachedStore Less Commonly
Changed Configuration Parameters Version Note This document applies only to the
Metastore in Hive 3.</description>< [...]
This information is versioned by Hive release version, allowing a user to
quickly identify features available to them.
@@ -48,7 +49,7 @@ T is a partitioned table by date and hour, and Tsignal is an
external table whic
Hive Architecture Hive Data Model Metastore Motivation Metadata Objects
Metastore Architecture Metastore Interface Hive Query Language Compiler
Optimizer Hive APIs Figure 1
Hive Architecture Figure 1 shows the major components of Hive and its
interactions with Hadoop. As shown in that figure, the main components of Hive
are:</description></item><item><title>Apache Hive :
DesignDocs</title><link>https://hive.apache.org/docs/latest/designdocs_27362075/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/designdocs_27362075/</guid><description>Apache
Hive : DesignDocs Hive Design Documents Proposals that appear in [...]
Completed Views (HIVE-1143) Partitioned Views (HIVE-1941) Storage Handlers
(HIVE-705) HBase Integration HBase Bulk Load Locking (HIVE-1293) Indexes
(HIVE-417) Bitmap Indexes (HIVE-1803) Filter Pushdown (HIVE-279) Table-level
Statistics (HIVE-1361) Dynamic Partitions Binary Data Type (HIVE-2380) Decimal
Precision and Scale Support HCatalog (formerly Howl) HiveServer2 (HIVE-2935)
Column Statistics in Hive (HIVE-1362) List Bucketing (HIVE-3026) Group By With
Rollup (HIVE-2397) Enhanced Aggr [...]
-Hive Developer Guide Code organization and architecture Compiling and running
Hive Unit tests Debugging Hive code Pluggable interfaces Hive Developer FAQ
Moving files Building Hive Testing Hive MiniDriver and Beeline tests Plugin
Developer Kit Writing UDTFs Hive on Spark: Getting
Started</description></item><item><title>Apache Hive :
DeveloperGuide</title><link>https://hive.apache.org/docs/latest/developerguide_27362074/</link><pubDate>Thu,
12 Dec 2024 00:00:00 +0000</pubDate><guid>https [...]
+Hive Developer Guide Code organization and architecture Compiling and running
Hive Unit tests Debugging Hive code Pluggable interfaces Hive Developer FAQ
Moving files Building Hive Testing Hive MiniDriver and Beeline tests Plugin
Developer Kit Writing UDTFs Hive on Spark: Getting
Started</description></item><item><title>Apache Hive :
DeveloperGuide</title><link>https://hive.apache.org/docs/latest/developerguide_27362074/</link><pubDate>Thu,
12 Dec 2024 00:00:00 +0000</pubDate><guid>https [...]
Meeting Minutes April 18, 2012 December 5, 2011 September 7, 2011 July 26,
2011 June 30, 2011 April 25, 2011 January 11, 2011 (forgot to take notes)
October 25, 2010 September 13, 2010 August 8, 2010 July 6, 2010 June 1,
2010</description></item><item><title>Apache Hive : Development
ContributorsMeetings
HiveContributorsMinutes100601</title><link>https://hive.apache.org/docs/latest/development-contributorsmeetings-hivecontributorsminutes100601_27362084/</link><pubDate>Thu,
12 Dec 2024 00 [...]
The following people were present:
Facebook (Paul Yang; Ning Zhang; Yongqiang He; Ahmed Aly; John Sichi; Ashish
Thusoo; Namit Jain) Netflix (Eva Tse; Jerome Boulon) Cloudera (Arvind
Prabhakar; Vinithra Varadharajan; Carl Steinbach) Yahoo (Alan Gates) The
following were the main meeting minutes:
@@ -144,7 +145,7 @@ Facebook (Paul Yang; Ning Zhang; Yongqiang He; Ahmed Aly;
John Sichi; Ashish Thu
We should have these meetings more often, say every month. Cloudera will host
the next meeting.</description></item><item><title>Apache Hive :
HiveContributorsMinutes100706</title><link>https://hive.apache.org/docs/latest/hivecontributorsminutes100706_27362065/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/hivecontributorsminutes100706_27362065/</guid><description>Apache
Hive : HiveContributorsMinutes100706 Attendees: Amr Awadallah, Joh [...]
bc Wong gave a live demo of Cloudera&rsquo;s Hue framework and the Beeswax
Hive web interface. Slides from this talk are available here:
http://www.slideshare.net/cwsteinbach/cloudera-huebeeswax Hue was recently
released as open source. The code is available on Github here:
http://github.com/cloudera/hue Olga Natkovich gave a whiteboard talk on
HOwl.</description></item><item><title>Apache Hive :
HiveCounters</title><link>https://hive.apache.org/docs/latest/hivecounters_67636835/</li
[...]
For Tez execution, %context is set to the mapper/reducer name. For other
execution engines it is not included in the counter name.
-Counter Name Description RECORDS_IN[_%context] Input records read
RECORDS_OUT[_%context] Output records written
RECORDS_OUT_INTERMEDIATE[_%context] Records written as intermediate records to
ReduceSink (which become input records to other tasks) CREATED_FILES Number of
files created DESERIALIZE_ERRORS Deserialization errors encountered while
reading data</description></item><item><title>Apache Hive :
HiveDerbyServerMode</title><link>https://hive.apache.org/docs/latest/hivederbyservermode
[...]
+Counter Name Description RECORDS_IN[_%context] Input records read
RECORDS_OUT[_%context] Output records written
RECORDS_OUT_INTERMEDIATE[_%context] Records written as intermediate records to
ReduceSink (which become input records to other tasks) CREATED_FILES Number of
files created DESERIALIZE_ERRORS Deserialization errors encountered while
reading data</description></item><item><title>Apache Hive :
HiveDerbyServerMode</title><link>https://hive.apache.org/docs/latest/hivederbyservermode
[...]
To see how the JDBC interface can be used, see sample code.
Integration with Pentaho Download pentaho report designer from the pentaho
website. Overwrite report-designer.</description></item><item><title>Apache
Hive :
HiveODBC</title><link>https://hive.apache.org/docs/latest/hiveodbc_27362099/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/hiveodbc_27362099/</guid><description>Apache
Hive : HiveODBC Hive ODBC Driver Hive ODBC Driver Introduction Suggested
Reading Software Requirements Driver Arch [...]
package com.example.hive.udf; import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text; public final class Lower extends UDF { public
Text evaluate(final Text s) { if (s == null) { return null; } return new
Text(s.toString().toLowerCase()); } } (Note that there&rsquo;s already a
built-in function for this, it&rsquo;s just an easy
example).</description></item><item><title>Apache Hive :
HiveQL</title><link>https://hive.apache.org/docs/latest/hiveql_27362097/</li
[...]
@@ -156,7 +157,7 @@ How-to article Provide step-by-step guidance for completing
a task.
Add how-to article</description></item><item><title>Apache Hive :
Howl</title><link>https://hive.apache.org/docs/latest/howl_27362109/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/howl_27362109/</guid><description>Apache
Hive : Howl This page collects some pointers to resources about Howl (an
effort to create a metastore for all of Hadoop) and how its first incarnation
is being built by reusing and extending Hive&rsquo;s metastore [...]
Howl wiki Yahoo group for Howl developers (including mailing list archive)
Howl source code at github Howl CLI functional spec Original plans for Owl
(predecessor to Howl)</description></item><item><title>Apache Hive :
HowToCommit</title><link>https://hive.apache.org/docs/latest/howtocommit_27362108/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/howtocommit_27362108/</guid><description>Apache
Hive : HowToCommit Guide for Hive Committers [...]
New committers New committers are encouraged to first read Apache&rsquo;s
generic committer documentation:</description></item><item><title>Apache Hive :
HowToContribute</title><link>https://hive.apache.org/docs/latest/howtocontribute_27362107/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/howtocontribute_27362107/</guid><description>Apache
Hive : HowToContribute How to Contribute to Apache Hive This page describes
the mechanics of [...]
-How to Contribute to Apache Hive Getting the Source Code Becoming a
Contributor Making Changes Coding Conventions Understanding Maven Understanding
Hive Branches Hadoop Dependencies branch-1 branch-2 Unit Tests Add a Unit Test
Java Unit Test Query Unit Test Beeline Query Unit Test Debugging Submitting a
PR Fetching a PR from Github Contributing Your Work JIRA Guidelines Generating
Thrift Code See Also Getting the Source Code First of all, you need the Hive
source code.</description></ite [...]
+Getting the Source Code Becoming a Contributor Making Changes Coding
Conventions Understanding Maven Understanding Hive Branches Hadoop Dependencies
Unit Tests Add a Unit Test Submitting a PR Fetching a PR from Github
Contributing Your Work JIRA Guidelines Generating Thrift Code See Also Getting
the Source Code First of all, you need the Hive source
code.</description></item><item><title>Apache Hive :
HowToRelease</title><link>https://hive.apache.org/docs/latest/howtorelease_27362106/</l
[...]
Storage API Release Storage API Prepare Master Branch Storage API Branching
Making Storage API Release Artifacts Publishing the Storage API Artifacts
Preparing Branch for further development Cleaning Up Storage API Artifacts Hive
Release Preparation Branching Updating Release Branch Building Voting Verifying
the Release Candidate Publishing Archive old releases Preparing Branch for
Future Maintenance Release See Also Hadoop Version
Warning</description></item><item><title>Apache Hive : H [...]
Materialized views with automatic rewriting can result in very similar
results. Hive 2.3.0 adds support for materialized views. Using columnar file
formats (Parquet, ORC) – they can do selective scanning; they may even skip
entire files/blocks.</description></item><item><title>Apache Hive : IndexDev
Bitmap</title><link>https://hive.apache.org/docs/latest/indexdev-bitmap_27362028/</link><pubDate>Thu,
12 Dec 2024 00:00:00
+0000</pubDate><guid>https://hive.apache.org/docs/latest/indexdev-bit [...]
Introduction Approach Proposal First implementation Second iteration Example
Introduction This document explains the proposed design for adding a bitmap
index handler (https://issues.apache.org/jira/browse/HIVE-1803).
diff --git a/qtest.html b/qtest.html
new file mode 100644
index 0000000..ff0e69b
--- /dev/null
+++ b/qtest.html
@@ -0,0 +1 @@
+<!doctype
html><html><head><title>https://hive.apache.org/development/qtest/</title><link
rel=canonical href=https://hive.apache.org/development/qtest/><meta name=robots
content="noindex"><meta charset=utf-8><meta http-equiv=refresh content="0;
url=https://hive.apache.org/development/qtest/"></head></html>
\ No newline at end of file
diff --git a/sitemap.xml b/sitemap.xml
index cfb5d85..07c475a 100644
--- a/sitemap.xml
+++ b/sitemap.xml
@@ -1 +1 @@
-<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset
xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>https://hive.apache.org/search/</loc><lastmod>2024-12-25T17:12:03+05:30</lastmod></url><url><loc>https://hive.apache.org/docs/latest/aboutthiswiki_27820116/</loc><lastmod>2024-12-12T00:00:00+00:00</lastmod></url><url><loc>https://hive.apache.org/docs/latest/accessserver-design-proposal_31823045/</loc><lastmod>2024-12-12T00
[...]
\ No newline at end of file
+<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset
xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>https://hive.apache.org/development/</loc><lastmod>2025-03-28T00:00:00+00:00</lastmod></url><url><loc>https://hive.apache.org/development/qtest/</loc><lastmod>2025-03-28T00:00:00+00:00</lastmod></url><url><loc>https://hive.apache.org/search/</loc><lastmod>2024-12-25T17:12:03+05:30</lastmod></url><url><loc>https://hive.apac
[...]
\ No newline at end of file