1) This is not a use case but a technical solution, so nobody can tell you whether it makes sense or not. 2) Do an upsert in Cassandra. However, keep in mind that the application producing to the Kafka topic and the one consuming from it need to ensure that they process messages in the right order. This may not always be guaranteed, e.g. in case of errors, and they need to avoid overwriting new data with old data. This is not a Kafka setting; it has to be dealt with at the producer and consumer level.
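The ordering caveat above can be sketched as a last-write-wins guard on the consumer side. This is only an illustration, not the actual poster's code: the in-memory Map stands in for the Cassandra table, and `version` is assumed to be something monotonic per key (an event timestamp or a Kafka offset).

```scala
// Hypothetical sketch of a consumer-side guard so that a late-arriving
// (older) Kafka message never overwrites a newer row. The Map stands in
// for the Cassandra table; `version` is an assumed per-key monotonic field.
case class Record(id: String, version: Long, payload: String)

object UpsertGuard {
  def upsert(table: Map[String, Record], incoming: Record): Map[String, Record] =
    table.get(incoming.id) match {
      case Some(existing) if existing.version >= incoming.version =>
        table // stale message: keep the newer row we already have
      case _ =>
        table + (incoming.id -> incoming) // insert, or overwrite with newer data
    }
}
```

In Cassandra itself the same effect can be approximated by writing with an explicit timestamp (`USING TIMESTAMP`), since Cassandra resolves conflicting writes per cell by timestamp; the guard above just makes that rule explicit at the application level.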
> On 29.08.2019 at 13:21, Shyam P <shyamabigd...@gmail.com> wrote:
>
> Hi,
> I need to do a PoC for a business use case.
>
> Use case: need to update a record in a Cassandra table if it exists.
>
> Will Spark Streaming support comparing each record and updating the
> existing Cassandra record?
>
> For each record received from the Kafka topic, I want to check whether it
> is already in Cassandra or not; if yes, update the record, else insert a
> new record.
>
> How can this be done using Spark Structured Streaming and Cassandra? Any
> snippet or sample if you have one.
>
> Thank you,
>
> Shyam
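Since the quoted question asks for a snippet: a rough sketch of the streaming write is below. It requires a running Spark, Kafka, and Cassandra setup plus the spark-cassandra-connector package, so it is not runnable standalone; the keyspace, table, topic, server address, and checkpoint path are all placeholder assumptions. The key point is that a Cassandra INSERT on an existing primary key overwrites the row, so the "check if exists, then update or insert" logic collapses into a plain append.

```scala
// Sketch only: assumes Spark 2.4+, a Kafka broker, a Cassandra table keyed
// on `id`, and the spark-cassandra-connector on the classpath.
import org.apache.spark.sql.{DataFrame, SparkSession}

val spark = SparkSession.builder.appName("kafka-to-cassandra").getOrCreate()

val stream = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092") // placeholder broker
  .option("subscribe", "records")                      // placeholder topic
  .load()
  .selectExpr("CAST(key AS STRING) AS id", "CAST(value AS STRING) AS payload")

stream.writeStream
  .foreachBatch { (batch: DataFrame, _: Long) =>
    batch.write
      .format("org.apache.spark.sql.cassandra")
      .option("keyspace", "my_keyspace") // placeholder keyspace
      .option("table", "records")        // placeholder table
      .mode("append")                    // append acts as an upsert in Cassandra
      .save()
  }
  .option("checkpointLocation", "/tmp/checkpoints/records") // placeholder path
  .start()
```

As noted above, this alone does not protect against out-of-order messages overwriting newer rows; that guard has to be added at the application level.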