Repository: hive
Updated Branches:
  refs/heads/master 896f10dba -> 5971e691d


HIVE-20377 addendum: fix spaces for readMe.md file. (Slim B)


Project: http://git-wip-us.apache.org/repos/asf/hive/repo
Commit: http://git-wip-us.apache.org/repos/asf/hive/commit/5971e691
Tree: http://git-wip-us.apache.org/repos/asf/hive/tree/5971e691
Diff: http://git-wip-us.apache.org/repos/asf/hive/diff/5971e691

Branch: refs/heads/master
Commit: 5971e691d5e04496152a33f42cf22c52b163f9e6
Parents: 896f10d
Author: Slim Bouguerra <[email protected]>
Authored: Tue Dec 11 15:47:37 2018 -0800
Committer: Slim Bouguerra <[email protected]>
Committed: Tue Dec 11 15:47:37 2018 -0800

----------------------------------------------------------------------
 kafka-handler/README.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/hive/blob/5971e691/kafka-handler/README.md
----------------------------------------------------------------------
diff --git a/kafka-handler/README.md b/kafka-handler/README.md
index 11b893c..c986d85 100644
--- a/kafka-handler/README.md
+++ b/kafka-handler/README.md
@@ -1,4 +1,4 @@
-#Kafka Storage Handler Module
+# Kafka Storage Handler Module
 
 Storage Handler that allows users to connect to, analyze, and transform Kafka topics.
 The workflow is as follows: first, the user creates an external table that is a view over one Kafka topic,
@@ -114,7 +114,7 @@ left join wiki_kafka_hive as future_activity on
 
 ```
 
-#Configuration
+# Configuration
 
 ## Table Properties
 
@@ -139,7 +139,7 @@ and will inject `max.poll.records=5000` to the Kafka Consumer.
 ALTER TABLE kafka_table SET TBLPROPERTIES ("kafka.consumer.max.poll.records"="5000");
 ```
 
-#Kafka to Hive ETL PIPE LINE
+# Kafka to Hive ETL PIPE LINE
 
 Load from Kafka every record exactly once.
 The goal is to read data and commit both the data and its offsets in a single transaction.
@@ -193,9 +193,9 @@ Insert overwrite table kafka_table_offsets select
 `__partition`, max(`__offset`), CURRENT_TIMESTAMP group by `__partition`, CURRENT_TIMESTAMP;
 ```
 
-#ETL from Hive to Kafka
+# ETL from Hive to Kafka
 
-##INSERT INTO
+## INSERT INTO
 First create the table in Hive that will be the target table. Now all the inserts will go to the topic mapped by this table.
 
 ```sql

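For context, the "external table that is a view over one Kafka topic" step described in this README looks roughly like the following sketch. The column schema, topic name, and broker address are hypothetical placeholders; only the storage handler class and the `kafka.topic`/`kafka.bootstrap.servers` table properties come from the kafka-handler module itself.

```sql
-- Sketch: map a Kafka topic to an external Hive table.
-- Topic name, broker address, and columns below are illustrative, not from the commit.
CREATE EXTERNAL TABLE kafka_table (
  `timestamp` timestamp,
  `page` string,
  `added` int
)
STORED BY 'org.apache.hadoop.hive.kafka.KafkaStorageHandler'
TBLPROPERTIES (
  "kafka.topic" = "test-topic",
  "kafka.bootstrap.servers" = "localhost:9092"
);
```

Consumer settings can then be tuned per table with `ALTER TABLE ... SET TBLPROPERTIES`, as shown in the diff above for `kafka.consumer.max.poll.records`.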