> Thanks,
> Jungtaek Lim (HeartSaVioR)
>
> On Thu, Mar 11, 2021 at 4:54 PM Kuttaiah Robin wrote:
>
>> Hello,
>>
>> I have a use case where I need to read events (non-correlated) from a
>> source kafka topic, then correlate and push forward to another target topic.
Hello,
I have a use case where I need to read events (non-correlated) from a source
kafka topic, then correlate and push forward to another target topic.
I use spark structured streaming with FlatMapGroupsWithStateFunction along
with GroupStateTimeout.ProcessingTimeTimeout(). After each timeout, I do
some co
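As a reference for this pattern, here is a stdlib-only sketch (no Spark dependency) of the state logic that a FlatMapGroupsWithStateFunction with ProcessingTimeTimeout expresses: buffer events per key, and when no new event arrives for a key within the timeout, emit the buffered (correlated) group and clear its state. The class and method names below are hypothetical illustrations, not Spark API.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal sketch of per-key correlation state with a processing-time timeout.
final class CorrelationState {
    private final long timeoutMs;
    private final Map<String, List<String>> buffers = new HashMap<>();
    private final Map<String, Long> lastUpdate = new HashMap<>();

    CorrelationState(long timeoutMs) { this.timeoutMs = timeoutMs; }

    // Called for each incoming event (analogous to the "state updated" branch).
    void onEvent(String key, String event, long nowMs) {
        buffers.computeIfAbsent(key, k -> new ArrayList<>()).add(event);
        lastUpdate.put(key, nowMs);
    }

    // Called on a timer tick (analogous to the hasTimedOut() branch):
    // emit and drop every group whose state has expired.
    Map<String, List<String>> onTimerTick(long nowMs) {
        Map<String, List<String>> emitted = new HashMap<>();
        lastUpdate.entrySet().removeIf(e -> {
            if (nowMs - e.getValue() >= timeoutMs) {
                emitted.put(e.getKey(), buffers.remove(e.getKey()));
                return true;
            }
            return false;
        });
        return emitted;
    }
}
```

In Spark itself the same two branches live inside the single `call(key, events, state)` method, distinguished by `state.hasTimedOut()`.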
Hello,
Is there a way a spark streaming application can get to know the start and
end of the data read from a dataset partition?
I want to create a partition-specific cache at the start and delete it once
the partition is read completely.
Thanks for your help in advance.
regards,
Robin Kut
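Spark does not expose a direct "partition started / partition finished" callback for this, but the usual workaround inside mapPartitions/foreachPartition is an iterator wrapper: build the cache lazily on the first element and tear it down when the iterator is exhausted. A stdlib-only sketch, with the cache represented by hypothetical open/close callbacks:

```java
import java.util.Iterator;

// Wraps a partition iterator so that openCache runs before the first element
// and closeCache runs once the partition has been fully consumed.
// Note: if the caller never drains the iterator, closeCache never fires.
final class PartitionScope<T> implements Iterator<T> {
    private final Iterator<T> inner;
    private final Runnable openCache;
    private final Runnable closeCache;
    private boolean opened = false;
    private boolean closed = false;

    PartitionScope(Iterator<T> inner, Runnable openCache, Runnable closeCache) {
        this.inner = inner;
        this.openCache = openCache;
        this.closeCache = closeCache;
    }

    @Override public boolean hasNext() {
        boolean more = inner.hasNext();
        if (!more && !closed) { closed = true; closeCache.run(); } // end of partition
        return more;
    }

    @Override public T next() {
        if (!opened) { opened = true; openCache.run(); } // start of partition
        return inner.next();
    }
}
```

Opening lazily in `next()` (rather than in the constructor) means an empty partition never allocates the cache at all.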
Hello all,
Am using spark-2.3.0 and hadoop-2.7.4.
I have a spark streaming application which listens to a kafka topic, does some
transformation and writes to an Oracle database using the JDBC client.
Step 1.
Read events from Kafka as shown below;
--
Dataset m_oKafkaEvents = getSparkSession().readStream().format("kaf
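On the write side of this kind of pipeline, the Oracle insert is typically done per partition with a `java.sql.PreparedStatement` and `executeBatch()`. Since that requires a live database, here is a stdlib-only sketch of just the batching logic; the table and column names are hypothetical, not taken from the thread.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the JDBC batching step: a parameterized INSERT template plus the
// grouping of rows into fixed-size batches (the unit handed to executeBatch()).
final class JdbcBatcher {
    static final String INSERT_SQL =
        "INSERT INTO EVENTS (EVENT_ID, PAYLOAD) VALUES (?, ?)"; // hypothetical table

    // Split rows into batches of at most batchSize elements.
    static <T> List<List<T>> toBatches(List<T> rows, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += batchSize) {
            batches.add(rows.subList(i, Math.min(i + batchSize, rows.size())));
        }
        return batches;
    }
}
```

In the real writer, each batch would be bound to the prepared statement with `addBatch()` and flushed with `executeBatch()` inside foreachPartition, so one connection serves a whole partition.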
rocess
> method as it will run in executors.
>
> Best Regards,
> Ryan
>
>
> On Fri, Oct 5, 2018 at 6:54 AM Kuttaiah Robin wrote:
>
>> Hello,
>>
>> I have a spark streaming application which reads from Kafka based on the
>> given schema.
>>
>>
Hello,
I have a spark streaming application which reads from Kafka based on the
given schema.
Dataset m_oKafkaEvents = getSparkSession().readStream().format("kafka")
    .option("kafka.bootstrap.servers", strKafkaAddress)
    .option("assign", strSubscription)
    .option
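For context on the snippet above: per the Spark Kafka integration guide, the `assign` option takes a JSON document mapping each topic to the partition ids to read, so `strSubscription` would be a string of the following shape. The topic name and partition list here are hypothetical.

```java
// Illustration of the JSON shape expected by the Kafka source's "assign"
// option, e.g. topic "insight-events", partitions 0..2 (both hypothetical).
final class AssignSpec {
    static String example() {
        return "{\"insight-events\":[0,1,2]}";
    }
}
```

Using `assign` pins the query to exactly those partitions; `subscribe` or `subscribePattern` would instead let Kafka assign all partitions of the matched topics.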
Hope it helps.
>
> Regards,
> Shahab
>
> On Wed, Sep 26, 2018 at 8:02 AM Kuttaiah Robin wrote:
>
>> Hello,
>>
>> Currently I have Oracle database table with description as shown below;
>>
>> Table INSIGHT_ID_FED_IDENTIFIERS
>> -
Hello,
Currently I have an Oracle database table with the description shown below;
Table INSIGHT_ID_FED_IDENTIFIERS
-
CURRENT_INSTANCE_ID   VARCHAR2(100)
PREVIOUS_INSTANCE_ID  VARCHAR2(100)
Sample values in the table are basically the output of select * from
IN
Hello,
Am using FlatMapGroupsWithStateFunction in my spark streaming application.
FlatMapGroupsWithStateFunction idstateUpdateFunction =
    new FlatMapGroupsWithStateFunction() {...}
The SessionUpdate class runs into trouble when the highlighted code is added,
which throws the below exception; The same att