Hi Joe,
I understand that Oracle GoldenGate is a data replication tool which uses log-based technology to stream all changes to a database from source to target. Can you please help me understand what role Kafka plays after OGG hands the data over to it?

Also, if we just need to replicate our Oracle DB, are there other ways we can do it without Oracle GoldenGate? Is Apache Storm/Flink a prospect for us to look into (though they are streaming tools)?

Thanks & Regards,
Kailash Kota
Product Development | JDA Software Pvt Ltd.
Ph: +91 80 6101 8649

-----Original Message-----
From: Joe Ammann <j...@pyx.ch>
Sent: 04 June 2019 18:23
To: users@kafka.apache.org
Subject: Re: Live data streaming from Oracle to oracle using Kafka

Hi Kailash

On 6/4/19 12:35 PM, Kailash Kota wrote:
> We want to live stream data from Oracle DB and to Oracle DB as a target using
> Kafka and we do not want to use Oracle Golden Gate because of the extra
> license cost.
> Is there a way we can read the redo logs and achieve this ?

Debezium (https://debezium.io/) could be an option, but the current Oracle implementation is based on XStream, which also requires a GoldenGate license. There is work underway to use Oracle LogMiner directly (https://github.com/debezium/debezium-incubator/pull/87), but it is not yet ready.

> If not, can you help us on the other ways that we can stream data. Please
> provide any documentation and procedure is available.

Other options involve Kafka Connect with the JDBC connector, or DB-based approaches like triggers or materialized views. Robin Moffatt has an excellent overview here: https://rmoff.net/2018/12/12/streaming-data-from-oracle-into-kafka-december-2018/

--
CU, Joe
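For reference, a minimal sketch of the Kafka Connect JDBC route Joe mentions could look like the two standalone .properties files below. This is an untested illustration, not a recommended setup: the connection URLs, credentials, the ORDERS table, and the ID/LAST_MODIFIED columns are placeholder assumptions you would replace with your own.

  # jdbc-oracle-source.properties: polls the source Oracle DB and writes rows to Kafka topics
  name=oracle-source
  connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
  tasks.max=1
  connection.url=jdbc:oracle:thin:@//source-host:1521/ORCLPDB1
  connection.user=src_user
  connection.password=src_password
  # capture new and updated rows using an incrementing id plus a last-modified timestamp
  mode=timestamp+incrementing
  incrementing.column.name=ID
  timestamp.column.name=LAST_MODIFIED
  table.whitelist=ORDERS
  topic.prefix=oracle-

  # jdbc-oracle-sink.properties: reads those topics and upserts into the target Oracle DB
  name=oracle-sink
  connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
  tasks.max=1
  topics=oracle-ORDERS
  connection.url=jdbc:oracle:thin:@//target-host:1521/ORCLPDB1
  connection.user=tgt_user
  connection.password=tgt_password
  insert.mode=upsert
  pk.mode=record_value
  pk.fields=ID
  auto.create=true

Both files can be run together with the standalone worker, e.g. bin/connect-standalone.sh config/connect-standalone.properties jdbc-oracle-source.properties jdbc-oracle-sink.properties. Keep in mind that query-based polling like this is not log-based CDC: it needs an incrementing or timestamp column on each table and will not capture deletes, which is the main trade-off Robin's post walks through.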