remove kettle from configuration file

Project: http://git-wip-us.apache.org/repos/asf/incubator-carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-carbondata/commit/0ef41ae2
Tree: http://git-wip-us.apache.org/repos/asf/incubator-carbondata/tree/0ef41ae2
Diff: http://git-wip-us.apache.org/repos/asf/incubator-carbondata/diff/0ef41ae2

Branch: refs/heads/12-dev
Commit: 0ef41ae2bc21f2b19f4b7d8efa34d32e4aca6b99
Parents: 81cebdb
Author: Vinod Rohilla <vi...@knoldus.in>
Authored: Thu Apr 6 11:47:11 2017 +0530
Committer: chenliang613 <chenliang...@huawei.com>
Committed: Thu Apr 6 12:28:32 2017 +0530

----------------------------------------------------------------------
 docs/configuration-parameters.md | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-carbondata/blob/0ef41ae2/docs/configuration-parameters.md
----------------------------------------------------------------------
diff --git a/docs/configuration-parameters.md b/docs/configuration-parameters.md
index 774734a..4f454e6 100644
--- a/docs/configuration-parameters.md
+++ b/docs/configuration-parameters.md
@@ -36,8 +36,7 @@ This section provides the details of all the configurations required for the Car
 | carbon.storelocation | /user/hive/warehouse/carbon.store | Location where CarbonData will create the store, and write the data in its own format. NOTE: Store location should be in HDFS. |
 | carbon.ddl.base.hdfs.url | hdfs://hacluster/opt/data | This property is used to configure the HDFS relative path; the path configured in carbon.ddl.base.hdfs.url will be appended to the HDFS path configured in fs.defaultFS. If this path is configured, the user need not pass the complete path while loading data. For example: if the absolute path of the CSV file is hdfs://10.18.101.155:54310/data/cnbc/2016/xyz.csv, the path "hdfs://10.18.101.155:54310" will come from the property fs.defaultFS and the user can configure /data/cnbc/ as carbon.ddl.base.hdfs.url. Then, while loading data, the user can specify the CSV path as /2016/xyz.csv. |
 | carbon.badRecords.location | /opt/Carbon/Spark/badrecords | Path where the bad records are stored. |
-| carbon.kettle.home | $SPARK_HOME/carbonlib/carbonplugins | Configuration for loading the data with kettle. |
-| carbon.data.file.version | 2 | If this parameter value is set to 1, then CarbonData will support the data load which is in old format(0.x version). If the value is set to 2(1.x onwards version), then CarbonData will support the data load of new format only.|                    
+| carbon.data.file.version | 2 | If this parameter value is set to 1, then CarbonData will support the data load which is in old format(0.x version). If the value is set to 2(1.x onwards version), then CarbonData will support the data load of new format only.|
 
 ##  Performance Configuration
 This section provides the details of all the configurations required for CarbonData Performance Optimization.
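
For reference (not part of this commit), here is a minimal sketch of how the parameters kept in the table above might be set in carbon.properties, using the example values from their descriptions:

```
# carbon.properties (example values taken from the table above)
carbon.storelocation=/user/hive/warehouse/carbon.store
carbon.badRecords.location=/opt/Carbon/Spark/badrecords
carbon.data.file.version=2
# fs.defaultFS (Hadoop configuration) is assumed to be hdfs://10.18.101.155:54310;
# this relative base path is appended to it when resolving load paths
carbon.ddl.base.hdfs.url=/data/cnbc/
```

With that base path configured, a data load can pass only the remainder of the path; the table name xyz_table below is a hypothetical placeholder:

```
-- resolves to hdfs://10.18.101.155:54310/data/cnbc/2016/xyz.csv
LOAD DATA INPATH '/2016/xyz.csv' INTO TABLE xyz_table;
```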
