This is an automated email from the ASF dual-hosted git repository.
unknown pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-pinot.wiki.git
The following commit(s) were added to refs/heads/master by this push:
new 2b39552 Updated links and Fully Qualified Class Name for KafkaConsumerFactory
2b39552 is described below
commit 2b39552561ce270f211287a2d64c03b6d08c332e
Author: Shaun Schembri <[email protected]>
AuthorDate: Wed Feb 13 16:06:56 2019 +0100
Updated links and Fully Qualified Class Name for KafkaConsumerFactory
---
Pluggable-Streams-in-Realtime.md | 22 +++++++++++-----------
1 file changed, 11 insertions(+), 11 deletions(-)
diff --git a/Pluggable-Streams-in-Realtime.md b/Pluggable-Streams-in-Realtime.md
index 4c75f71..35b5c25 100644
--- a/Pluggable-Streams-in-Realtime.md
+++ b/Pluggable-Streams-in-Realtime.md
@@ -1,6 +1,6 @@
-Prior to commit [ba9f2d](https://github.com/linkedin/pinot/commit/ba9f2ddfc0faa42fadc2cc48df1d77fec6b174fb), Pinot was only able to support reading from [Kafka](https://kafka.apache.org/documentation/) stream.
+Prior to commit [ba9f2d](https://github.com/apache/incubator-pinot/commit/ba9f2ddfc0faa42fadc2cc48df1d77fec6b174fb), Pinot was only able to support reading from [Kafka](https://kafka.apache.org/documentation/) stream.
<br>Pinot now enables its users to write plug-ins to read from pub-sub streams
-other than Kafka. (Please refer to Issue [#2583](https://github.com/linkedin/pinot/issues/2583))
+other than Kafka. (Please refer to Issue [#2583](https://github.com/apache/incubator-pinot/issues/2583))
<br>Some of the streams for which plug-ins can be added are:
* [Amazon kinesis](https://docs.aws.amazon.com/streams/latest/dev/building-enhanced-consumers-kcl.html)
* [Azure Event Hubs](https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-java-get-started-receive-eph)
@@ -13,7 +13,7 @@ You may encounter some limitations either in Pinot or in the stream system while
## Pinot Stream Consumers
Pinot consumes rows from event streams and serves queries on the data consumed. Rows may be consumed either at stream level (also referred to as high level) or at partition level (also referred to as low level).
-<img src="https://github.com/linkedin/pinot/wiki/High-level-stream.png" alt="Stream Level Consumer" width="500" height="300">
+<img src="https://github.com/apache/incubator-pinot/wiki/High-level-stream.png" alt="Stream Level Consumer" width="500" height="300">
<br>Stream Level Consumer
<br><br>The stream should provide the following guarantees:
@@ -26,7 +26,7 @@ Pinot consumes rows from event streams and serves queries on the data consumed.
* earliest available data
* last saved checkpoint
-<img src="https://github.com/linkedin/pinot/wiki/Low-level-stream.png" alt="Partition Level Consumer" width="500" height="300">
+<img src="https://github.com/apache/incubator-pinot/wiki/Low-level-stream.png" alt="Partition Level Consumer" width="500" height="300">
<br>Partition Level Consumer
<br><br>While consuming rows at a partition level, the stream should support the following
@@ -47,16 +47,16 @@ reduced over time.
## Stream plug-in implementation
In order to add a new type of stream (say, Foo), implement the following classes:
-1. FooConsumerFactory extends [StreamConsumerFactory](https://github.com/linkedin/pinot/blob/master/pinot-core/src/main/java/com/linkedin/pinot/core/realtime/stream/StreamConsumerFactory.java)
-2. FooPartitionLevelConsumer implements [PartitionLevelConsumer](https://github.com/linkedin/pinot/blob/master/pinot-core/src/main/java/com/linkedin/pinot/core/realtime/stream/PartitionLevelConsumer.java)
-3. FooStreamLevelConsumer implements [StreamLevelConsumer](https://github.com/linkedin/pinot/blob/master/pinot-core/src/main/java/com/linkedin/pinot/core/realtime/stream/StreamLevelConsumer.java)
-4. FooMetadataProvider implements [StreamMetadataProvider](https://github.com/linkedin/pinot/blob/master/pinot-core/src/main/java/com/linkedin/pinot/core/realtime/stream/StreamMetadataProvider.java)
-5. FooMessageDecoder implements [StreamMessageDecoder](https://github.com/linkedin/pinot/blob/master/pinot-core/src/main/java/com/linkedin/pinot/core/realtime/stream/StreamMessageDecoder.java)
+1. FooConsumerFactory extends [StreamConsumerFactory](https://github.com/apache/incubator-pinot/blob/master/pinot-core/src/main/java/org/apache/pinot/core/realtime/stream/StreamConsumerFactory.java)
+2. FooPartitionLevelConsumer implements [PartitionLevelConsumer](https://github.com/apache/incubator-pinot/blob/master/pinot-core/src/main/java/org/apache/pinot/core/realtime/stream/PartitionLevelConsumer.java)
+3. FooStreamLevelConsumer implements [StreamLevelConsumer](https://github.com/apache/incubator-pinot/blob/master/pinot-core/src/main/java/org/apache/pinot/core/realtime/stream/StreamLevelConsumer.java)
+4. FooMetadataProvider implements [StreamMetadataProvider](https://github.com/apache/incubator-pinot/blob/master/pinot-core/src/main/java/org/apache/pinot/core/realtime/stream/StreamMetadataProvider.java)
+5. FooMessageDecoder implements [StreamMessageDecoder](https://github.com/apache/incubator-pinot/blob/master/pinot-core/src/main/java/org/apache/pinot/core/realtime/stream/StreamMessageDecoder.java)
Depending on stream level or partition level, your implementation needs to include StreamLevelConsumer or PartitionLevelConsumer.
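To make the shape of such a plug-in concrete, here is a minimal sketch of a partition-level "foo" plug-in. The interfaces below are simplified stand-ins for the Pinot interfaces linked above (the real ones carry additional methods for checkpointing, metadata, and decoding), and all `Foo*` names are illustrative:

```java
import java.util.Collections;
import java.util.List;

// Simplified stand-ins for Pinot's StreamConsumerFactory and
// PartitionLevelConsumer; the real interfaces have more methods.
interface PartitionLevelConsumer {
    List<byte[]> fetchMessages(long startOffset, long endOffset);
}

interface StreamConsumerFactory {
    PartitionLevelConsumer createPartitionLevelConsumer(String clientId, int partition);
}

// Hypothetical consumer for one partition of the "foo" stream.
class FooPartitionLevelConsumer implements PartitionLevelConsumer {
    private final int partition;

    FooPartitionLevelConsumer(int partition) {
        this.partition = partition;
    }

    @Override
    public List<byte[]> fetchMessages(long startOffset, long endOffset) {
        // A real implementation would fetch rows from this partition of the
        // "foo" stream between the two offsets; this sketch returns nothing.
        return Collections.emptyList();
    }
}

// Hypothetical factory: low-level consumption creates one consumer per partition.
class FooConsumerFactory implements StreamConsumerFactory {
    @Override
    public PartitionLevelConsumer createPartitionLevelConsumer(String clientId, int partition) {
        return new FooPartitionLevelConsumer(partition);
    }
}

public class FooPluginSketch {
    public static void main(String[] args) {
        StreamConsumerFactory factory = new FooConsumerFactory();
        PartitionLevelConsumer consumer = factory.createPartitionLevelConsumer("client-0", 0);
        System.out.println(consumer.fetchMessages(0L, 100L).size()); // prints 0
    }
}
```

A stream-level plug-in would follow the same pattern with a StreamLevelConsumer in place of the partition-level one.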
-The properties for the stream implementation are to be set in the table configuration, inside [streamConfigs](https://github.com/linkedin/pinot/blob/master/pinot-core/src/main/java/com/linkedin/pinot/core/realtime/stream/StreamConfig.java) section.
+The properties for the stream implementation are to be set in the table configuration, inside [streamConfigs](https://github.com/apache/incubator-pinot/blob/master/pinot-core/src/main/java/org/apache/pinot/core/realtime/stream/StreamConfig.java) section.
<br>Use the **streamType** property to define the stream type. For example, for the implementation of stream "foo", set the property `"streamType" : "foo"`.<br>
The rest of the configuration properties for your stream should be set with the prefix **"stream.foo"**. Be sure to use the same suffix for: (see examples below)
* topic
@@ -97,4 +97,4 @@ The properties for the thresholds are as follows:
"realtime.segment.flush.threshold.time" : "6h"
```
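Putting the pieces together, a hypothetical `streamConfigs` section for the "foo" stream might look like the following. All `stream.foo.*` property names and the class names are illustrative, patterned on the `streamType`/`stream.foo` prefix conventions described above:

```json
"streamConfigs": {
  "streamType": "foo",
  "stream.foo.topic.name": "fooTopic",
  "stream.foo.consumer.type": "lowLevel",
  "stream.foo.consumer.factory.class.name": "com.example.foo.FooConsumerFactory",
  "stream.foo.decoder.class.name": "com.example.foo.FooMessageDecoder",
  "realtime.segment.flush.threshold.time": "6h"
}
```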
-An example of this implementation can be found in the [KafkaConsumerFactory](com.linkedin.pinot.core.realtime.impl.kafka.KafkaConsumerFactory), which is an implementation for the kafka stream.
+An example of this implementation can be found in the [KafkaConsumerFactory](org.apache.pinot.core.realtime.impl.kafka.KafkaConsumerFactory), which is an implementation for the Kafka stream.
\ No newline at end of file
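The fully qualified class name above is what a table configuration would reference from `streamConfigs`. A sketch of that wiring, where the property key is an assumption following the `stream.<type>` prefix convention described in the page:

```json
"streamType": "kafka",
"stream.kafka.consumer.factory.class.name": "org.apache.pinot.core.realtime.impl.kafka.KafkaConsumerFactory"
```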
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]