sync w/Jacques rn

Bridget's edits

1.1 update per JN's rn

dead link

ODBC driver link updates

AVG doc enhancement


Project: http://git-wip-us.apache.org/repos/asf/drill/repo
Commit: http://git-wip-us.apache.org/repos/asf/drill/commit/45ada885
Tree: http://git-wip-us.apache.org/repos/asf/drill/tree/45ada885
Diff: http://git-wip-us.apache.org/repos/asf/drill/diff/45ada885

Branch: refs/heads/gh-pages
Commit: 45ada885c3e2d973b11f58fdb69cd3bb1f9a8de3
Parents: 02af103
Author: Kristine Hahn <[email protected]>
Authored: Mon Jul 6 15:38:04 2015 -0700
Committer: Kristine Hahn <[email protected]>
Committed: Tue Jul 7 17:14:22 2015 -0700

----------------------------------------------------------------------
 ...ser-impersonation-with-hive-authorization.md | 10 +++--
 .../070-hive-storage-plugin.md                  | 13 ++++--
 .../090-mongodb-plugin-for-apache-drill.md      |  2 +-
 .../020-hive-to-drill-data-type-mapping.md      |  2 +
 _docs/getting-started/010-drill-introduction.md |  5 ++-
 .../010-installing-the-driver-on-linux.md       | 10 ++---
 .../020-installing-the-driver-on-mac-os-x.md    |  2 +-
 .../030-installing-the-driver-on-windows.md     |  4 +-
 .../060-querying-the-information-schema.md      |  2 +-
 .../050-aggregate-and-aggregate-statistical.md  | 44 +++++++++++++++++++-
 10 files changed, 71 insertions(+), 23 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/drill/blob/45ada885/_docs/configure-drill/076-configuring-user-impersonation-with-hive-authorization.md
----------------------------------------------------------------------
diff --git a/_docs/configure-drill/076-configuring-user-impersonation-with-hive-authorization.md b/_docs/configure-drill/076-configuring-user-impersonation-with-hive-authorization.md
index bd16195..678f0b3 100644
--- a/_docs/configure-drill/076-configuring-user-impersonation-with-hive-authorization.md
+++ b/_docs/configure-drill/076-configuring-user-impersonation-with-hive-authorization.md
@@ -54,7 +54,7 @@ Complete the following steps on each Drillbit node to enable user impersonation,
    * If the underlying file system has MapR security enabled, add the following line:
     `export MAPR_TICKETFILE_LOCATION=/opt/mapr/conf/mapruserticket`  
    * If you are implementing Hive SQL standard based authorization, and you are running Drill and Hive in a secure MapR cluster, add the following lines:  
-        `export DRILLBIT_JAVA_OPTS="$DRILLBIT_JAVA_OPTS -Dmapr_sec_enabled=true -Dhadoop.login=maprsasl -Dzookeeper.saslprovider=com.mapr.security.maprsasl.MaprSaslProvider -Dmapr.library.flatclass"`  
+        `export DRILL_JAVA_OPTS="$DRILL_JAVA_OPTS -Dmapr_sec_enabled=true -Dhadoop.login=maprsasl -Dzookeeper.saslprovider=com.mapr.security.maprsasl.MaprSaslProvider -Dmapr.library.flatclass"`  
        `export MAPR_IMPERSONATION_ENABLED=true`  
        `export MAPR_TICKETFILE_LOCATION=/opt/mapr/conf/mapruserticket`
 
@@ -194,14 +194,16 @@ Add the following required authorization parameters in hive-site.xml to configur
 
 Modify the Hive storage plugin instance in the Drill Web UI to include specific authorization settings. The Drillbit that you use to access the Web UI must be running. 
 
-Note: The metastore host port for MapR is typically 9083.  
+{% include startnote.html %}The metastore host port for MapR is typically 9083.{% include endnote.html %}  
 
 Complete the following steps to modify the Hive storage plugin:  
 
 1.  Navigate to `http://<drillbit_hostname>:8047`, and select the **Storage tab**.  
 2.  Click **Update** next to the hive instance.  
-3.  In the configuration window, add the configuration settings for the authorization type.  
-    * For storage based authorization, add the following settings:  
+3.  In the configuration window, add the configuration settings for the authorization type. If you are running Drill and Hive in a secure MapR cluster, do not include the line `"hive.metastore.sasl.enabled" : "false"`.  
+
+  
+   * For storage based authorization, add the following settings:  
 
               {
                type:"hive",

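For quick reference, the secure-MapR drill-env.sh additions that this file now documents, gathered into a single snippet; this is a restatement of the lines in the hunk above, not an independent recommendation:

    # <DRILL_HOME>/conf/drill-env.sh -- Hive SQL standard based authorization on a secure MapR cluster
    export DRILL_JAVA_OPTS="$DRILL_JAVA_OPTS -Dmapr_sec_enabled=true -Dhadoop.login=maprsasl -Dzookeeper.saslprovider=com.mapr.security.maprsasl.MaprSaslProvider -Dmapr.library.flatclass"
    export MAPR_IMPERSONATION_ENABLED=true
    export MAPR_TICKETFILE_LOCATION=/opt/mapr/conf/mapruserticket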
http://git-wip-us.apache.org/repos/asf/drill/blob/45ada885/_docs/connect-a-data-source/070-hive-storage-plugin.md
----------------------------------------------------------------------
diff --git a/_docs/connect-a-data-source/070-hive-storage-plugin.md b/_docs/connect-a-data-source/070-hive-storage-plugin.md
index 0e30fd8..2c55b37 100644
--- a/_docs/connect-a-data-source/070-hive-storage-plugin.md
+++ b/_docs/connect-a-data-source/070-hive-storage-plugin.md
@@ -21,7 +21,7 @@ metastore service communicates with the Hive database over JDBC. Point Drill
 to the Hive metastore service address, and provide the connection parameters
 in the Drill Web UI to configure a connection to Drill.
 
-{% include startnote.html %}Verify that the Hive metastore service is running before you register the Hive metastore.{% include endnote.html %}
+{% include startnote.html %}Verify that the Hive metastore service is running before you register the Hive metastore.{% include endnote.html %}  
 
 To register a remote Hive metastore with Drill, complete the following steps:
 
@@ -30,7 +30,7 @@ To register a remote Hive metastore with Drill, complete the following steps:
         hive --service metastore
   2. Navigate to [http://localhost:8047](http://localhost:8047/), and select the **Storage** tab.
   3. In the disabled storage plugins section, click **Update** next to the `hive` instance.
-  4. In the configuration window, add the `Thrift URI` and port to `hive.metastore.uris`.
+  4. In the configuration window, add the `Thrift URI` and port to `hive.metastore.uris`. 
 
      **Example**
      
@@ -41,8 +41,13 @@ To register a remote Hive metastore with Drill, complete the following steps:
             "hive.metastore.uris": "thrift://<localhost>:<port>",  
             "hive.metastore.sasl.enabled": "false"
           }
-        }       
-  5. Click **Enable**.
+        }  
+       
+  5. If you are running Drill and Hive in a secure MapR cluster, remove the following line from the configuration:  
+  `"hive.metastore.sasl.enabled" : "false"`
+  6. Click **Enable**.  
+  7. If you are running Drill and Hive in a secure MapR cluster, add the following line to `<DRILL_HOME>/conf/drill-env.sh` on each Drill node and then [restart the Drillbit service]({{site.baseurl}}/docs/starting-drill-in-distributed-mode/):  
+  `export DRILL_JAVA_OPTS="$DRILL_JAVA_OPTS -Dmapr_sec_enabled=true -Dhadoop.login=maprsasl -Dzookeeper.saslprovider=com.mapr.security.maprsasl.MaprSaslProvider -Dmapr.library.flatclass"`
 
 
 Once you have configured a storage plugin instance for a Hive data source, you

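To make step 4 concrete, here is a hedged sketch of what a complete remote-metastore `hive` plugin configuration might look like. Only `hive.metastore.uris` and `hive.metastore.sasl.enabled` come from the hunk above; the `configProps` wrapper and the filesystem/warehouse entries are assumptions added for illustration:

    {
      "type": "hive",
      "enabled": true,
      "configProps": {
        "hive.metastore.uris": "thrift://<metastore_host>:9083",
        "hive.metastore.sasl.enabled": "false",
        "fs.default.name": "hdfs://<namenode_host>/",
        "hive.metastore.warehouse.dir": "/user/hive/warehouse"
      }
    }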
http://git-wip-us.apache.org/repos/asf/drill/blob/45ada885/_docs/connect-a-data-source/090-mongodb-plugin-for-apache-drill.md
----------------------------------------------------------------------
diff --git a/_docs/connect-a-data-source/090-mongodb-plugin-for-apache-drill.md b/_docs/connect-a-data-source/090-mongodb-plugin-for-apache-drill.md
index 450d079..6a7b56d 100644
--- a/_docs/connect-a-data-source/090-mongodb-plugin-for-apache-drill.md
+++ b/_docs/connect-a-data-source/090-mongodb-plugin-for-apache-drill.md
@@ -4,7 +4,7 @@ parent: "Connect a Data Source"
 ---
 ## Overview
 
-Drill provides a mongodb format plugin to connect to MongoDB, and run queries
+Drill supports MongoDB 3.0, providing a mongodb format plugin to connect to MongoDB using MongoDB's latest Java driver. You can run queries
 to read, but not write, the Mongo data using Drill. Attempting to write data back to Mongo results in an error. You do not need any upfront schema definitions. 
 
 {% include startnote.html %}A local instance of Drill is used in this tutorial for simplicity. {% include endnote.html %}

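For orientation, connecting Drill to MongoDB is itself just a small storage plugin configuration in the Storage tab; a minimal sketch, assuming a local mongod on the default port:

    {
      "type": "mongo",
      "connection": "mongodb://localhost:27017/",
      "enabled": true
    }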
http://git-wip-us.apache.org/repos/asf/drill/blob/45ada885/_docs/data-sources-and-file-formats/020-hive-to-drill-data-type-mapping.md
----------------------------------------------------------------------
diff --git a/_docs/data-sources-and-file-formats/020-hive-to-drill-data-type-mapping.md b/_docs/data-sources-and-file-formats/020-hive-to-drill-data-type-mapping.md
index 20a921f..76b7640 100644
--- a/_docs/data-sources-and-file-formats/020-hive-to-drill-data-type-mapping.md
+++ b/_docs/data-sources-and-file-formats/020-hive-to-drill-data-type-mapping.md
@@ -10,6 +10,7 @@ Using Drill you can read tables created in Hive that use data types compatible w
 |--------------------|-----------|------------------------------------------------------------|
 | BIGINT             | BIGINT    | 8-byte signed integer                                        |
 | BOOLEAN            | BOOLEAN   | TRUE (1) or FALSE (0)                                        |
+| BYTE               | TINYINT   | 1-byte integer                                               |
 | CHAR               | CHAR      | Character string, fixed-length max 255                       |
 | DATE               | DATE      | Years months and days in the form in the form YYYY-­MM-­DD    |
 | DECIMAL*           | DECIMAL   | 38-digit precision                                           |
@@ -24,6 +25,7 @@ Using Drill you can read tables created in Hive that use data types compatible w
 | TIMESTAMP          | TIMESTAMP | JDBC timestamp in yyyy-mm-dd hh:mm:ss format                 |
 | None               | STRING    | Binary string (16)                                           |
 | VARCHAR            | VARCHAR   | Character string variable length                             |
+| VARBINARY          | BINARY    | Binary string                                                |
 
 \* In this release, Drill disables the DECIMAL data type, including casting to DECIMAL and reading DECIMAL types from Parquet and Hive. To enable the DECIMAL type, set the `planner.enable_decimal_data_type` option to `true`.
 
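The footnote above names the option that re-enables DECIMAL; for illustration, it can be set from any SQL client (shown here at SYSTEM scope; SESSION scope also works):

    ALTER SYSTEM SET `planner.enable_decimal_data_type` = true;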

http://git-wip-us.apache.org/repos/asf/drill/blob/45ada885/_docs/getting-started/010-drill-introduction.md
----------------------------------------------------------------------
diff --git a/_docs/getting-started/010-drill-introduction.md b/_docs/getting-started/010-drill-introduction.md
index 68a0ae7..fd06770 100644
--- a/_docs/getting-started/010-drill-introduction.md
+++ b/_docs/getting-started/010-drill-introduction.md
@@ -15,7 +15,8 @@ Many enhancements in Apache Drill 1.1 include the following key features:
 
 * [SQL window functions]({{site.baseurl}}/docs/sql-window-functions)
 * [Automatic partitioning]({{site.baseurl}}) using the new [PARTITION BY]({{site.baseurl}}/docs/partition-by-clause) clause in the CTAS command
-* [User impersonation with Hive authorization](({{site.baseurl}})
+* [Delegated Hive impersonation]({{site.baseurl}}/docs/configuring-user-impersonation-with-hive-authorization/)
+* Support for UNION ALL and better optimized plans that include UNION.
 
 ## What's New in Apache Drill 1.0
 
@@ -28,7 +29,7 @@ Apache Drill 1.0 offers the following new features:
 * New Errors tab in the Query Profiles UI that facilitates troubleshooting and distributed storing of profiles.
 * Support for a new storage plugin input format: [Avro](http://avro.apache.org/docs/current/spec.html)
 
-In this release, Drill disables the DECIMAL data type, including casting to DECIMAL and reading DECIMAL types from Parquet and Hive. You can [enable the DECIMAL type](docs/supported-data-types/#enabling-the-decimal-type), but this is not recommented.
+In this release, Drill disables the DECIMAL data type, including casting to DECIMAL and reading DECIMAL types from Parquet and Hive. You can [enable the DECIMAL type](docs/supported-data-types/#enabling-the-decimal-type), but this is not recommended.
 
 ## Apache Drill Key Features
 

http://git-wip-us.apache.org/repos/asf/drill/blob/45ada885/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/010-installing-the-driver-on-linux.md
----------------------------------------------------------------------
diff --git a/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/010-installing-the-driver-on-linux.md b/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/010-installing-the-driver-on-linux.md
old mode 100755
new mode 100644
index c74fb70..5bf2170
--- a/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/010-installing-the-driver-on-linux.md
+++ b/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/010-installing-the-driver-on-linux.md
@@ -37,14 +37,10 @@ If not, create an entry in `/etc/hosts` for each node in the following format:
 To install the driver, you need Administrator privileges on the computer.
 
 ## Step 1: Download the MapR Drill ODBC Driver
-Download either the 32- or 64-bit driver from the following sites:
+Download either the 32- or 64-bit driver:
 
-  * [MapR Drill ODBC Driver (32-bit)]
-
-    http://package.mapr.com/tools/MapR-ODBC/MapR_Drill/MapRDrill_odbc_v1.0.0.1001/MapRDrillODBC-32bit-1.0.0.i686.rpm
-  * [MapR Drill ODBC Driver (64-bit)]
-
-    http://package.mapr.com/tools/MapR-ODBC/MapR_Drill/MapRDrill_odbc_v1.0.0.1001/MapRDrillODBC-1.0.0.x86_64.rpm
+  * [MapR Drill ODBC Driver (32-bit)](http://package.mapr.com/tools/MapR-ODBC/MapR_Drill/MapRDrill_odbc_v1.1.0.1000/MapRDrillODBC-32bit-1.1.0.i686.rpm)
+  * [MapR Drill ODBC Driver (64-bit)](http://package.mapr.com/tools/MapR-ODBC/MapR_Drill/MapRDrill_odbc_v1.1.0.1000/MapRDrillODBC-1.1.0.x86_64.rpm)
 
 ## Step 2: Install the MapR Drill ODBC Driver
 
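Step 2 of this page is not part of the hunk; as a pointer, installing the downloaded package is typically a single command (a sketch, assuming the 64-bit RPM from the link above on an RPM-based distribution):

    sudo rpm -Uvh MapRDrillODBC-1.1.0.x86_64.rpm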

http://git-wip-us.apache.org/repos/asf/drill/blob/45ada885/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/020-installing-the-driver-on-mac-os-x.md
----------------------------------------------------------------------
diff --git a/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/020-installing-the-driver-on-mac-os-x.md b/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/020-installing-the-driver-on-mac-os-x.md
old mode 100755
new mode 100644
index ab24810..725b33c
--- a/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/020-installing-the-driver-on-mac-os-x.md
+++ b/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/020-installing-the-driver-on-mac-os-x.md
@@ -31,7 +31,7 @@ To install the driver, you need Administrator privileges on the computer.
 
 Click the following link to download the driver:  
 
-[MapR Drill ODBC Driver for Mac](http://package.mapr.com/tools/MapR-ODBC/MapR_Drill/MapRDrill_odbc_v1.0.0.1001/MapRDrillODBC.dmg)
+[MapR Drill ODBC Driver for Mac](http://package.mapr.com/tools/MapR-ODBC/MapR_Drill/MapRDrill_odbc_v1.1.0.1000/MapRDrillODBC.dmg)
 
 ----------
 

http://git-wip-us.apache.org/repos/asf/drill/blob/45ada885/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/030-installing-the-driver-on-windows.md
----------------------------------------------------------------------
diff --git a/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/030-installing-the-driver-on-windows.md b/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/030-installing-the-driver-on-windows.md
old mode 100755
new mode 100644
index e62147e..68219d7
--- a/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/030-installing-the-driver-on-windows.md
+++ b/_docs/odbc-jdbc-interfaces/installing-the-odbc-driver/030-installing-the-driver-on-windows.md
@@ -37,8 +37,8 @@ To install the driver, you need Administrator privileges on the computer.
 
 Download the installer that corresponds to the bitness of the client application from which you want to create an ODBC connection:
 
-* [MapR Drill ODBC Driver (32-bit)](http://package.mapr.com/tools/MapR-ODBC/MapR_Drill/MapRDrill_odbc_v1.0.0.1001/MapRDrillODBC32.msi)  
-* [MapR Drill ODBC Driver (64-bit)](http://package.mapr.com/tools/MapR-ODBC/MapR_Drill/MapRDrill_odbc_v1.0.0.1001/MapRDrillODBC64.msi)
+* [MapR Drill ODBC Driver (32-bit)](http://package.mapr.com/tools/MapR-ODBC/MapR_Drill/MapRDrill_odbc_v1.1.0.1000/MapRDrillODBC32.msi)  
+* [MapR Drill ODBC Driver (64-bit)](http://package.mapr.com/tools/MapR-ODBC/MapR_Drill/MapRDrill_odbc_v1.1.0.1000/MapRDrillODBC64.msi)
 
 ----------
 

http://git-wip-us.apache.org/repos/asf/drill/blob/45ada885/_docs/query-data/060-querying-the-information-schema.md
----------------------------------------------------------------------
diff --git a/_docs/query-data/060-querying-the-information-schema.md b/_docs/query-data/060-querying-the-information-schema.md
index 43d58e5..c4c1008 100644
--- a/_docs/query-data/060-querying-the-information-schema.md
+++ b/_docs/query-data/060-querying-the-information-schema.md
@@ -107,4 +107,4 @@ of those columns:
     | OrderTotal  | Decimal    |
     +-------------+------------+
 
-In this release, Drill disables the DECIMAL data type, including casting to DECIMAL and reading DECIMAL types from Parquet and Hive. You can [enable the DECIMAL type](docs/supported-data-types/#enabling-the-decimal-type), but this is not recommented.
\ No newline at end of file
+In this release, Drill disables the DECIMAL data type, including casting to DECIMAL and reading DECIMAL types from Parquet and Hive. You can [enable the DECIMAL type](docs/supported-data-types/#enabling-the-decimal-type), but this is not recommended.
\ No newline at end of file

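For context, a column listing like the one shown above can be produced with a query along these lines (a sketch; the `orders` table name is only an example):

    SELECT COLUMN_NAME, DATA_TYPE FROM INFORMATION_SCHEMA.`COLUMNS` WHERE TABLE_NAME = 'orders';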
http://git-wip-us.apache.org/repos/asf/drill/blob/45ada885/_docs/sql-reference/sql-functions/050-aggregate-and-aggregate-statistical.md
----------------------------------------------------------------------
diff --git a/_docs/sql-reference/sql-functions/050-aggregate-and-aggregate-statistical.md b/_docs/sql-reference/sql-functions/050-aggregate-and-aggregate-statistical.md
index b57743b..4e0566b 100644
--- a/_docs/sql-reference/sql-functions/050-aggregate-and-aggregate-statistical.md
+++ b/_docs/sql-reference/sql-functions/050-aggregate-and-aggregate-statistical.md
@@ -17,10 +17,52 @@ MAX(expression)| BINARY, DECIMAL, VARCHAR, DATE, TIME, or TIMESTAMP| same as arg
 MIN(expression)| BINARY, DECIMAL, VARCHAR, DATE, TIME, or TIMESTAMP| same as argument type
 SUM(expression)| SMALLINT, INTEGER, BIGINT, FLOAT, DOUBLE, DECIMAL, INTERVALDAY, or INTERVALYEAR| BIGINT for SMALLINT or INTEGER arguments, DECIMAL for BIGINT arguments, DOUBLE for floating-point arguments, otherwise the same as the argument data type
 
-\* In this release, Drill disables the DECIMAL data type, including casting to DECIMAL and reading DECIMAL types from Parquet and Hive. You can [enable the DECIMAL type](docs/supported-data-types/#enabling-the-decimal-type), but this is not recommented.
+\* In this release, Drill disables the DECIMAL data type, including casting to DECIMAL and reading DECIMAL types from Parquet and Hive. You can [enable the DECIMAL type](docs/supported-data-types/#enabling-the-decimal-type), but this is not recommended.
 
 MIN, MAX, COUNT, AVG, and SUM accept ALL and DISTINCT keywords. The default is ALL.
 
+## AVG 
+
+Returns the average of a column over all records, or over each group of records.
+
+### Syntax
+
+    SELECT AVG(aggregate_expression)
+    FROM tables
+    WHERE conditions;
+
+    SELECT expression1, expression2, ... expression_n,
+           AVG(aggregate_expression)
+    FROM tables
+    WHERE conditions
+    GROUP BY expression1, expression2, ... expression_n;
+
+Expressions that are not encapsulated within the AVG function must be included in the GROUP BY clause.
+
+### Examples
+
+    SELECT AVG(salary) FROM cp.`employee.json`;
+    +---------------------+
+    |       EXPR$0        |
+    +---------------------+
+    | 4019.6017316017314  |
+    +---------------------+
+    1 row selected (0.221 seconds)
+
+    SELECT education_level, AVG(salary) FROM cp.`employee.json` GROUP BY education_level;
+    +----------------------+---------------------+
+    |   education_level    |       EXPR$1        |
+    +----------------------+---------------------+
+    | Graduate Degree      | 4392.823529411765   |
+    | Bachelors Degree     | 4492.404181184669   |
+    | Partial College      | 4047.1180555555557  |
+    | High School Degree   | 3516.1565836298932  |
+    | Partial High School  | 3511.0852713178297  |
+    +----------------------+---------------------+
+    5 rows selected (0.495 seconds)
+
+## MIN, MAX, COUNT, and SUM
+
 ### Examples
 
     SELECT a2 FROM t2;

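The page notes that these aggregates accept ALL and DISTINCT; as one further illustration, a hedged example of the DISTINCT form, averaging only distinct salary values:

    SELECT AVG(DISTINCT salary) FROM cp.`employee.json`;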