Repository: incubator-hawq-docs
Updated Branches:
  refs/heads/develop 19c676a70 -> f6e8b8a23


remove references to Greenplum (gpfdist)


Project: http://git-wip-us.apache.org/repos/asf/incubator-hawq-docs/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-hawq-docs/commit/499d4f00
Tree: http://git-wip-us.apache.org/repos/asf/incubator-hawq-docs/tree/499d4f00
Diff: http://git-wip-us.apache.org/repos/asf/incubator-hawq-docs/diff/499d4f00

Branch: refs/heads/develop
Commit: 499d4f001a73182aaabad7b3c14e7eb67481a34b
Parents: 3df5448
Author: Lisa Owen <[email protected]>
Authored: Wed Oct 19 09:41:05 2016 -0700
Committer: Lisa Owen <[email protected]>
Committed: Wed Oct 19 09:41:05 2016 -0700

----------------------------------------------------------------------
 datamgmt/load/g-gpfdist-protocol.html.md.erb                       | 2 +-
 datamgmt/load/g-installing-gpfdist.html.md.erb                     | 2 +-
 datamgmt/load/g-loading-and-unloading-data.html.md.erb             | 2 +-
 ...-using-the-greenplum-parallel-file-server--gpfdist-.html.md.erb | 2 +-
 reference/sql/CREATE-EXTERNAL-TABLE.html.md.erb                    | 2 +-
 5 files changed, 5 insertions(+), 5 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-hawq-docs/blob/499d4f00/datamgmt/load/g-gpfdist-protocol.html.md.erb
----------------------------------------------------------------------
diff --git a/datamgmt/load/g-gpfdist-protocol.html.md.erb b/datamgmt/load/g-gpfdist-protocol.html.md.erb
index f41c946..d28ba72 100644
--- a/datamgmt/load/g-gpfdist-protocol.html.md.erb
+++ b/datamgmt/load/g-gpfdist-protocol.html.md.erb
@@ -8,7 +8,7 @@ The `gpfdist://` protocol is used in a URI to reference a running `gpfdist` inst
 
 Run `gpfdist` on the host where the external data files reside. `gpfdist` uncompresses `gzip` (`.gz`) and `bzip2` (`.bz2`) files automatically. You can use the wildcard character (\*) or other C-style pattern matching to denote multiple files to read. The files specified are assumed to be relative to the directory that you specified when you started the `gpfdist` instance.
 
-All virtual segments access the external file(s) in parallel, subject to the number of segments set in the `gp_external_max_segments` parameter, the length of the `gpfdist` location list, and the limits specified by the `hawq_rm_nvseg_perquery_limit` and `hawq_rm_nvseg_perquery_perseg_limit` parameters. Use multiple `gpfdist` data sources in a `CREATE EXTERNAL TABLE` statement to scale the external table's scan performance. For more information about configuring `gpfdist`, see [Using the Greenplum Parallel File Server (gpfdist)](g-using-the-greenplum-parallel-file-server--gpfdist-.html#topic13).
+All virtual segments access the external file(s) in parallel, subject to the number of segments set in the `gp_external_max_segments` parameter, the length of the `gpfdist` location list, and the limits specified by the `hawq_rm_nvseg_perquery_limit` and `hawq_rm_nvseg_perquery_perseg_limit` parameters. Use multiple `gpfdist` data sources in a `CREATE EXTERNAL TABLE` statement to scale the external table's scan performance. For more information about configuring `gpfdist`, see [Using the HAWQ File Server (gpfdist)](g-using-the-greenplum-parallel-file-server--gpfdist-.html#topic13).
 
 See the `gpfdist` reference documentation for more information about using `gpfdist` with external tables.
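For illustration, scaling a scan with multiple `gpfdist` data sources as described above might look like the following minimal sketch; the host names, ports, directory contents, and column definitions are hypothetical and not part of this commit:

```sql
-- Two gpfdist instances serve files matching a wildcard pattern;
-- HAWQ virtual segments read from both sources in parallel.
CREATE EXTERNAL TABLE ext_expenses (
    name   text,
    date   date,
    amount float4
)
LOCATION (
    'gpfdist://etl1:8081/*.txt',
    'gpfdist://etl2:8082/*.txt'
)
FORMAT 'TEXT' (DELIMITER '|');
```

The `*.txt` pattern is resolved relative to the directory given when each `gpfdist` instance was started.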
 

http://git-wip-us.apache.org/repos/asf/incubator-hawq-docs/blob/499d4f00/datamgmt/load/g-installing-gpfdist.html.md.erb
----------------------------------------------------------------------
diff --git a/datamgmt/load/g-installing-gpfdist.html.md.erb b/datamgmt/load/g-installing-gpfdist.html.md.erb
index 25f923e..85549df 100644
--- a/datamgmt/load/g-installing-gpfdist.html.md.erb
+++ b/datamgmt/load/g-installing-gpfdist.html.md.erb
@@ -2,6 +2,6 @@
 title: Installing gpfdist
 ---
 
-`gpfdist` is installed in `$GPHOME/bin` of your HAWQ master host installation. Run `gpfdist` from a machine other than the HAWQ master, such as on a machine devoted to ETL processing. If you want to install `gpfdist` on your ETL server, get it from the *Greenplum Load Tools* package and follow its installation instructions.
+You may choose to run `gpfdist` from a machine other than the HAWQ master, such as on a machine devoted to ETL processing. To install `gpfdist` on your ETL server, refer to [Client-Based HAWQ Load Tools](client-loadtools.html) for information related to Linux and Windows load tools installation and configuration.
 
 

http://git-wip-us.apache.org/repos/asf/incubator-hawq-docs/blob/499d4f00/datamgmt/load/g-loading-and-unloading-data.html.md.erb
----------------------------------------------------------------------
diff --git a/datamgmt/load/g-loading-and-unloading-data.html.md.erb b/datamgmt/load/g-loading-and-unloading-data.html.md.erb
index 6d27685..671e899 100644
--- a/datamgmt/load/g-loading-and-unloading-data.html.md.erb
+++ b/datamgmt/load/g-loading-and-unloading-data.html.md.erb
@@ -24,7 +24,7 @@ HAWQ leverages the parallel architecture of the Hadoop Distributed File System t
 
 -   **[Working with File-Based External Tables](../../datamgmt/load/g-working-with-file-based-ext-tables.html)**
 
--   **[Using the Greenplum Parallel File Server (gpfdist)](../../datamgmt/load/g-using-the-greenplum-parallel-file-server--gpfdist-.html)**
+-   **[Using the HAWQ File Server (gpfdist)](../../datamgmt/load/g-using-the-greenplum-parallel-file-server--gpfdist-.html)**
 
 -   **[Creating and Using Web External Tables](../../datamgmt/load/g-creating-and-using-web-external-tables.html)**
 

http://git-wip-us.apache.org/repos/asf/incubator-hawq-docs/blob/499d4f00/datamgmt/load/g-using-the-greenplum-parallel-file-server--gpfdist-.html.md.erb
----------------------------------------------------------------------
diff --git a/datamgmt/load/g-using-the-greenplum-parallel-file-server--gpfdist-.html.md.erb b/datamgmt/load/g-using-the-greenplum-parallel-file-server--gpfdist-.html.md.erb
index f357b2c..0c68b2c 100644
--- a/datamgmt/load/g-using-the-greenplum-parallel-file-server--gpfdist-.html.md.erb
+++ b/datamgmt/load/g-using-the-greenplum-parallel-file-server--gpfdist-.html.md.erb
@@ -1,5 +1,5 @@
 ---
-title: Using the Greenplum Parallel File Server (gpfdist)
+title: Using the HAWQ File Server (gpfdist)
 ---
 
 The `gpfdist` protocol provides the best performance and is the easiest to set up. `gpfdist` ensures optimum use of all segments in your HAWQ system for external table reads.

http://git-wip-us.apache.org/repos/asf/incubator-hawq-docs/blob/499d4f00/reference/sql/CREATE-EXTERNAL-TABLE.html.md.erb
----------------------------------------------------------------------
diff --git a/reference/sql/CREATE-EXTERNAL-TABLE.html.md.erb b/reference/sql/CREATE-EXTERNAL-TABLE.html.md.erb
index 2f19eab..2a76d27 100644
--- a/reference/sql/CREATE-EXTERNAL-TABLE.html.md.erb
+++ b/reference/sql/CREATE-EXTERNAL-TABLE.html.md.erb
@@ -119,7 +119,7 @@ Regular readable external tables can access static flat files or, by using HAWQ
 
 Web external tables access dynamic data sources – either on a web server or by executing OS commands or scripts.
 
-The LOCATION clause specifies the location of the external data. The location string begins with a protocol string that specifies the storage type and protocol used to access the data. The `gpfdist://` protocol specifies data files served by one or more instances of the Greenplum parallel file distribution server `gpfdist`. The `http://` protocol specifies one or more HTTP URLs and is used with web tables. The `pxf://` protocol specifies data accessed through the PXF service, which provides access to data in a Hadoop system. Using the PXF API, you can create PXF plug-ins to provide HAWQ access to any other data source.
+The LOCATION clause specifies the location of the external data. The location string begins with a protocol string that specifies the storage type and protocol used to access the data. The `gpfdist://` protocol specifies data files served by one or more instances of the HAWQ file server `gpfdist`. The `http://` protocol specifies one or more HTTP URLs and is used with web tables. The `pxf://` protocol specifies data accessed through the PXF service, which provides access to data in a Hadoop system. Using the PXF API, you can create PXF plug-ins to provide HAWQ access to any other data source.
 
 **Note:** The `file://` protocol is deprecated. Instead, use the `gpfdist://`, `gpfdists://`, or `pxf://` protocol, or the `COPY` command instead.
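As a sketch of how the protocol strings described in the LOCATION clause appear in practice, the following hypothetical definitions contrast the `gpfdist://` and `pxf://` protocols; the host names, ports, paths, and PXF profile are illustrative assumptions, not part of this commit:

```sql
-- gpfdist:// protocol: files served by a gpfdist instance.
CREATE EXTERNAL TABLE ext_sales (id int, total numeric)
LOCATION ('gpfdist://etlhost:8081/sales/*.csv')
FORMAT 'CSV';

-- pxf:// protocol: data accessed through the PXF service
-- (here reading delimited text from HDFS).
CREATE EXTERNAL TABLE ext_hdfs_sales (id int, total numeric)
LOCATION ('pxf://namenode:51200/data/sales?PROFILE=HdfsTextSimple')
FORMAT 'TEXT' (DELIMITER ',');
```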
 
