Github user dyozie commented on a diff in the pull request:

    https://github.com/apache/incubator-hawq-docs/pull/17#discussion_r81382521
  
    --- Diff: datamgmt/load/g-register_files.html.md.erb ---
    @@ -0,0 +1,213 @@
    +---
    +title: Registering Files into HAWQ Internal Tables
    +---
    +
    +The `hawq register` utility loads and registers HDFS data files or folders 
into HAWQ internal tables. Files can be read directly, rather than having to be 
copied or loaded, resulting in higher performance and more efficient 
transaction processing.
    +
    +Data from the file or directory specified by \<hdfsfilepath\> is loaded 
into the appropriate HAWQ table directory in HDFS, and the utility updates the 
corresponding HAWQ metadata for the files. Files in either AO or Parquet 
format in HDFS can be loaded into a corresponding table in HAWQ.
    +
    +You can use `hawq register` either to:
    +
    +-  Load and register external Parquet-formatted file data generated by an 
external system such as Hive or Spark.
    +-  Recover cluster data from a backup cluster for disaster recovery. 
    +
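    +For example, a Parquet file produced by an external system such as Hive 
could be registered into an existing HAWQ table as sketched below (the 
database name, HDFS path, and table name are hypothetical, and exact options 
may vary by HAWQ version):
    +
    +``` shell
    +hawq register -d postgres -f hdfs://localhost:8020/temp/hive_out.paq parquet_table
    +```
    +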
    +Requirements for running `hawq register` on the client server are:
    --- End diff --
    
    Need to change "client server" to one or the other, I think.

