I always like to simplify things. If I were you, I would use the well-known
and widely used Kafka spout to ingest data into your Storm cluster. Simply
write a Kafka producer that uses the PostgreSQL JDBC driver to pull out
your required data and send it as a message. You'll find Kafka producers
pretty easy to write. Check out my project with some simple producers and
mirror it for your PostgreSQL producer:

https://github.com/leerobert/kafka-producers
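
For what it's worth, a minimal sketch of such a producer might look like
the following (using the 0.8-era Kafka producer API and the PostgreSQL
JDBC driver; the broker address, connection URL, table, and topic names
are placeholders to swap for your own):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Properties;

    import kafka.javaapi.producer.Producer;
    import kafka.producer.KeyedMessage;
    import kafka.producer.ProducerConfig;

    public class PostgresKafkaProducer {
        public static void main(String[] args) throws Exception {
            // Kafka 0.8 producer config -- broker list is a placeholder
            Properties props = new Properties();
            props.put("metadata.broker.list", "localhost:9092");
            props.put("serializer.class", "kafka.serializer.StringEncoder");
            Producer<String, String> producer =
                    new Producer<String, String>(new ProducerConfig(props));

            // JDBC connection -- URL, credentials, table, and topic are
            // placeholders
            Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/mydb", "user", "password");
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT payload FROM events");
            while (rs.next()) {
                // One Kafka message per row; the Kafka spout then consumes
                // these on the Storm side
                producer.send(new KeyedMessage<String, String>(
                        "events-topic", rs.getString("payload")));
            }

            rs.close();
            stmt.close();
            conn.close();
            producer.close();
        }
    }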


On Fri, Jun 27, 2014 at 2:49 PM, Sa Li <[email protected]> wrote:

> Thanks a lot, John. The entire project gets data from PostgreSQL and
> finally emits to and updates Cassandra tables. With the help of Robert in
> this group, I think I have some resources on storm-cassandra integration.
> However, there really aren't many tutorials on Postgres with Storm;
> '*storm-rdbms*' is the only example I can find of db->storm. It would be
> great if someone could contribute more example code on postgres-storm.
> Sorry for the shameless request from a new Storm user.
>
>
> thanks
>
> Alec
> On Jun 27, 2014, at 5:53 AM, John Welcher <[email protected]> wrote:
>
> Hi
>
> We use Postgres notifications. The spout open method registers for
> database notifications (add, update, delete). Each time the spout's
> nextTuple method is called, we check for pending notifications and
> process them accordingly.
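>
> A rough sketch of that pattern, for concreteness (assuming the pgjdbc
> driver's LISTEN/NOTIFY support via PGNotification; the connection
> details and channel name are placeholders):
>
>     import java.sql.Connection;
>     import java.sql.DriverManager;
>     import java.sql.Statement;
>     import java.util.Map;
>
>     import org.postgresql.PGConnection;
>     import org.postgresql.PGNotification;
>
>     import backtype.storm.spout.SpoutOutputCollector;
>     import backtype.storm.task.TopologyContext;
>     import backtype.storm.topology.OutputFieldsDeclarer;
>     import backtype.storm.topology.base.BaseRichSpout;
>     import backtype.storm.tuple.Fields;
>     import backtype.storm.tuple.Values;
>
>     public class NotificationSpout extends BaseRichSpout {
>         private transient Connection conn;
>         private SpoutOutputCollector collector;
>
>         @Override
>         public void open(Map conf, TopologyContext context,
>                          SpoutOutputCollector collector) {
>             this.collector = collector;
>             try {
>                 conn = DriverManager.getConnection(
>                         "jdbc:postgresql://localhost:5432/mydb",
>                         "user", "password");
>                 // Subscribe to a channel that your triggers NOTIFY on
>                 // insert/update/delete
>                 Statement stmt = conn.createStatement();
>                 stmt.execute("LISTEN table_changes");
>                 stmt.close();
>             } catch (Exception e) {
>                 throw new RuntimeException("could not open connection", e);
>             }
>         }
>
>         @Override
>         public void nextTuple() {
>             try {
>                 // A dummy query makes the driver read any pending
>                 // notifications off the socket before we ask for them
>                 Statement stmt = conn.createStatement();
>                 stmt.executeQuery("SELECT 1").close();
>                 stmt.close();
>                 PGNotification[] pending =
>                         ((PGConnection) conn).getNotifications();
>                 if (pending != null) {
>                     for (PGNotification n : pending) {
>                         // Emit channel name and NOTIFY payload downstream
>                         collector.emit(new Values(n.getName(),
>                                 n.getParameter()));
>                     }
>                 }
>             } catch (Exception e) {
>                 throw new RuntimeException("could not poll notifications", e);
>             }
>         }
>
>         @Override
>         public void declareOutputFields(OutputFieldsDeclarer declarer) {
>             declarer.declare(new Fields("channel", "payload"));
>         }
>     }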
>
> Good Luck
>
> John
>
>
> On Fri, Jun 27, 2014 at 12:07 AM, Sa Li <[email protected]> wrote:
>
>> Dear all
>>
>> I am implementing a spout whose stream comes from a PostgreSQL ingress
>> API (an in-house project). All I know for now is to get the spout
>> connected to PostgreSQL, retrieve the data periodically, store it in a
>> queue, and then emit it to the topology. If anyone has done a similar
>> job, I would appreciate some instructions and details.
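>>
>> A rough sketch of that polling idea, so it is concrete (hypothetical
>> table, column, and connection details; a real spout would also need
>> ack/fail handling):
>>
>>     import java.sql.Connection;
>>     import java.sql.DriverManager;
>>     import java.sql.PreparedStatement;
>>     import java.sql.ResultSet;
>>     import java.util.Map;
>>     import java.util.concurrent.LinkedBlockingQueue;
>>
>>     import backtype.storm.spout.SpoutOutputCollector;
>>     import backtype.storm.task.TopologyContext;
>>     import backtype.storm.topology.OutputFieldsDeclarer;
>>     import backtype.storm.topology.base.BaseRichSpout;
>>     import backtype.storm.tuple.Fields;
>>     import backtype.storm.tuple.Values;
>>
>>     public class PollingSpout extends BaseRichSpout {
>>         private SpoutOutputCollector collector;
>>         private LinkedBlockingQueue<String> queue;
>>
>>         @Override
>>         public void open(Map conf, TopologyContext context,
>>                          SpoutOutputCollector collector) {
>>             this.collector = collector;
>>             this.queue = new LinkedBlockingQueue<String>(10000);
>>             // Background thread polls Postgres and feeds the queue so
>>             // that nextTuple never blocks
>>             Thread poller = new Thread(new Runnable() {
>>                 public void run() {
>>                     try {
>>                         Connection conn = DriverManager.getConnection(
>>                                 "jdbc:postgresql://localhost:5432/mydb",
>>                                 "user", "password");
>>                         PreparedStatement ps = conn.prepareStatement(
>>                                 "SELECT id, payload FROM events"
>>                                 + " WHERE id > ? ORDER BY id");
>>                         long lastId = 0;  // high-water mark so each row
>>                                           // is read only once
>>                         while (true) {
>>                             ps.setLong(1, lastId);
>>                             ResultSet rs = ps.executeQuery();
>>                             while (rs.next()) {
>>                                 lastId = rs.getLong("id");
>>                                 queue.put(rs.getString("payload"));
>>                             }
>>                             rs.close();
>>                             Thread.sleep(5000);  // poll period
>>                         }
>>                     } catch (Exception e) {
>>                         throw new RuntimeException(e);
>>                     }
>>                 }
>>             });
>>             poller.setDaemon(true);
>>             poller.start();
>>         }
>>
>>         @Override
>>         public void nextTuple() {
>>             // Non-blocking drain, as nextTuple requires
>>             String payload = queue.poll();
>>             if (payload != null) {
>>                 collector.emit(new Values(payload));
>>             }
>>         }
>>
>>         @Override
>>         public void declareOutputFields(OutputFieldsDeclarer declarer) {
>>             declarer.declare(new Fields("payload"));
>>         }
>>     }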
>>
>>
>> thanks
>>
>> Alec
>>
>>
>>
>
>
