Repository: sqoop
Updated Branches:
  refs/heads/sqoop2 fd8299cba -> a00c94100
SQOOP-2856: Sqoop2: Enrich HDFS Connector resource file
(Jarek Jarcec Cecho via Abraham Fine)

Project: http://git-wip-us.apache.org/repos/asf/sqoop/repo
Commit: http://git-wip-us.apache.org/repos/asf/sqoop/commit/a00c9410
Tree: http://git-wip-us.apache.org/repos/asf/sqoop/tree/a00c9410
Diff: http://git-wip-us.apache.org/repos/asf/sqoop/diff/a00c9410

Branch: refs/heads/sqoop2
Commit: a00c94100ce6d637c3758e5a696a943917e5137e
Parents: fd8299c
Author: Abraham Fine <[email protected]>
Authored: Mon Mar 7 12:56:45 2016 -0800
Committer: Abraham Fine <[email protected]>
Committed: Mon Mar 7 12:56:45 2016 -0800

----------------------------------------------------------------------
 .../resources/hdfs-connector-config.properties | 97 +++++++++++---------
 1 file changed, 54 insertions(+), 43 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/sqoop/blob/a00c9410/connector/connector-hdfs/src/main/resources/hdfs-connector-config.properties
----------------------------------------------------------------------
diff --git a/connector/connector-hdfs/src/main/resources/hdfs-connector-config.properties b/connector/connector-hdfs/src/main/resources/hdfs-connector-config.properties
index 69f50c1..29efced 100644
--- a/connector/connector-hdfs/src/main/resources/hdfs-connector-config.properties
+++ b/connector/connector-hdfs/src/main/resources/hdfs-connector-config.properties
@@ -13,78 +13,89 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-# Generic HDFS Connector Resources
-
-############################
+connector.name = HDFS Connector
 
 # Link Config
-linkConfig.label = Link configuration
-linkConfig.help = Here you supply information necessary to connect to HDFS
+linkConfig.label = HDFS cluster
+linkConfig.help = Contains configuration required to connect to your HDFS cluster.
-linkConfig.uri.label = HDFS URI
-linkConfig.uri.help = HDFS URI used to connect to HDFS
+linkConfig.uri.label = URI
+linkConfig.uri.example = hdfs://nn1.example.com/
+linkConfig.uri.help = Namenode URI for your cluster.
 
-linkConfig.confDir.label = Hadoop conf directory
-linkConfig.confDir.help = Directory with Hadoop configuration files. The connector will load all -site.xml files.
+linkConfig.confDir.label = Conf directory
+linkConfig.confDir.example = /etc/hadoop/conf/
+linkConfig.confDir.help = Directory on the Sqoop server machine with HDFS configuration files (hdfs-site.xml, \
+  ...). This connector will load all files ending with -site.xml.
 
-linkConfig.configOverrides.label = Override configuration
-linkConfig.configOverrides.help = Map of properties that that should be set for the Hadoop's configuration object on top of the files loaded from configuration directory.
+linkConfig.configOverrides.label = Additional configs
+linkConfig.configOverrides.example = custom.key=custom.value
+linkConfig.configOverrides.help = Additional configuration that will be set on the HDFS Configuration object, \
+  possibly overriding any keys loaded from configuration files.
 
 # To Job Config
-#
-toJobConfig.label = To HDFS configuration
-toJobConfig.help = You must supply the information requested in order to \
-  get information where you want to store your data.
-
-toJobConfig.storageType.label = Storage type
-toJobConfig.storageType.help = Target on Hadoop ecosystem where to store data
+toJobConfig.label = Target configuration
+toJobConfig.help = Configuration describing where and how the resulting data should be stored.
 
-toJobConfig.outputFormat.label = Output format
-toJobConfig.outputFormat.help = Format in which data should be serialized
+toJobConfig.outputFormat.label = File format
+toJobConfig.outputFormat.example = PARQUET_FILE
+toJobConfig.outputFormat.help = File format that should be used for transferred data.
-toJobConfig.compression.label = Compression format
-toJobConfig.compression.help = Compression that should be used for the data
+toJobConfig.compression.label = Compression codec
+toJobConfig.compression.example = SNAPPY
+toJobConfig.compression.help = Compression codec that should be used to compress transferred data.
 
-toJobConfig.customCompression.label = Custom compression format
-toJobConfig.customCompression.help = Full class name of the custom compression
+toJobConfig.customCompression.label = Custom codec
+toJobConfig.customCompression.example = org.apache.hadoop.io.compress.SnappyCodec
+toJobConfig.customCompression.help = Fully qualified class name of a Hadoop codec implementation that should be \
+  used if none of the built-in options are suitable.
 
 toJobConfig.outputDirectory.label = Output directory
-toJobConfig.outputDirectory.help = Output directory for final data
+toJobConfig.outputDirectory.example = /user/jarcec/output-dir
+toJobConfig.outputDirectory.help = HDFS directory where transferred data will be written.
 
 toJobConfig.appendMode.label = Append mode
-toJobConfig.appendMode.help = Append new files to existing directory if the output directory already exists
+toJobConfig.appendMode.example = true
+toJobConfig.appendMode.help = If set to false, the job will fail if the output directory already exists. If set to true, \
+  imported data will be stored in an already existing and possibly non-empty directory.
 
 toJobConfig.overrideNullValue.label = Override null value
-toJobConfig.overrideNullValue.help = If set to true, then the null value will \
-  be overridden with the value set in \
-  toJobConfig.nullValue.
+toJobConfig.overrideNullValue.example = true
+toJobConfig.overrideNullValue.help = If set to true, then the null value will be overridden with the value set in \
+  Null value.
 toJobConfig.nullValue.label = Null value
-toJobConfig.nullValue.help = Use this particular character or sequence of characters \
-  as a value representing null when outputting to a file.
+toJobConfig.nullValue.example = \N
+toJobConfig.nullValue.help = For file formats that don't have a native representation of NULL (for example text files), \
+  use this particular string to encode NULL values.
+
 incremental.label = Incremental import
-incremental.help = Information relevant for incremental import from HDFS
+incremental.help = Information relevant for incremental reading from HDFS.
 
 incremental.incrementalType.label = Incremental type
-incremental.incrementalType.help = Type of incremental import
+incremental.incrementalType.example = NEW_FILES
+incremental.incrementalType.help = Type of incremental import.
 
 incremental.lastImportedDate.label = Last imported date
-incremental.lastImportedDate.help = Date when last import happened
+incremental.lastImportedDate.example = 1987-02-02 13:15:27
+incremental.lastImportedDate.help = Datetime stamp of the last read file. The next job execution will read only files that have been \
+  created after this point in time.
 
 # From Job Config
-#
-fromJobConfig.label = From HDFS configuration
-fromJobConfig.help = Specifies information required to get data from Hadoop ecosystem
+fromJobConfig.label = Input configuration
+fromJobConfig.help = Specifies information required to get data from HDFS.
 
 fromJobConfig.inputDirectory.label = Input directory
-fromJobConfig.inputDirectory.help = Directory that should be exported
+fromJobConfig.inputDirectory.example = /user/jarcec/input-dir
+fromJobConfig.inputDirectory.help = Input directory containing files that should be transferred.
 
 fromJobConfig.overrideNullValue.label = Override null value
-fromJobConfig.overrideNullValue.help = If set to true, then the null value will \
-  be overridden with the value set in \
-  toJobConfig.nullValue.
+fromJobConfig.overrideNullValue.example = true
+fromJobConfig.overrideNullValue.help = If set to true, then the null value will be overridden with the value set in \
+  Null value.
 
 fromJobConfig.nullValue.label = Null value
-fromJobConfig.nullValue.help = Use this particular character or sequence of characters \
-  as a value representing null when outputting to a file.
+fromJobConfig.nullValue.example = \N
+fromJobConfig.nullValue.help = For file formats that don't have a native representation of NULL (for example text files), \
+  use this particular string to decode NULL values.
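For context, the convention this commit standardizes is a `.label`/`.example`/`.help` key triplet per config input. The sketch below exercises that pattern with plain `java.util.Properties`; it is an illustration only (the class name `ResourceLookupDemo` and the `render` helper are hypothetical, not Sqoop's actual resource-loading code):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class ResourceLookupDemo {

    // Combine the .label, .example and .help resources for one config input
    // into a single display string, the way a UI layer might present them.
    public static String render(Properties resources, String input) {
        return resources.getProperty(input + ".label")
                + " (e.g. " + resources.getProperty(input + ".example") + "): "
                + resources.getProperty(input + ".help");
    }

    public static void main(String[] args) throws IOException {
        // A small excerpt of the resource file from this commit; note the
        // new ".example" key alongside each ".label"/".help" pair.
        String excerpt =
                "linkConfig.uri.label = URI\n"
              + "linkConfig.uri.example = hdfs://nn1.example.com/\n"
              + "linkConfig.uri.help = Namenode URI for your cluster.\n";

        Properties resources = new Properties();
        resources.load(new StringReader(excerpt));

        // Prints: URI (e.g. hdfs://nn1.example.com/): Namenode URI for your cluster.
        System.out.println(render(resources, "linkConfig.uri"));
    }
}
```

Keeping the example text in a dedicated `.example` key, rather than embedded in `.help`, lets each consumer decide how (or whether) to show it.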