Spark Structured Streaming can write to any sink for which an appropriate API or JDBC connection exists.
I have not tried Kinesis, but have you thought about how you want to write it as a sink? Those quota limitations, much like quotas set by other vendors (say, Google on BigQuery writes), are defaults and can be negotiated with the vendor to have them increased. What facts have you established so far?

HTH

View my LinkedIn profile <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>

https://en.everybodywiki.com/Mich_Talebzadeh

*Disclaimer:* Use it at your own risk. Any and all responsibility for any loss, damage or destruction of data or any other property which may arise from relying on this email's technical content is explicitly disclaimed. The author will in no case be liable for any monetary damages arising from such loss, damage or destruction.

On Mon, 6 Mar 2023 at 04:20, hueiyuan su <hueiyua...@gmail.com> wrote:

> *Component*: Spark Structured Streaming
> *Level*: Advanced
> *Scenario*: How-to
>
> ------------------------
> *Problems Description*
> 1. I currently would like to use PySpark Structured Streaming to write
> data to Kinesis, but there does not seem to be a corresponding connector
> I can use. I would like to confirm whether there is another method besides
> this solution
> <https://repost.aws/questions/QUP_OJomilTO6oIgvK00VHEA/writing-data-to-kinesis-stream-from-py-spark>
> 2. Because AWS Kinesis has quota limitations (like 1 MB/s and 1,000
> records/s), if the Spark Structured Streaming micro-batch size is too
> large, how can we handle this?
>
> --
> Best Regards,
>
> Mars Su
> *Phone*: 0988-661-013
> *Email*: hueiyua...@gmail.com
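Since open-source Spark ships no built-in Kinesis sink, one common workaround for both questions is `foreachBatch` plus boto3's `put_records`, chunking each micro-batch so no single request exceeds the Kinesis PutRecords limits. This is only a sketch under stated assumptions: the stream name, region, and `id` partition-key column are placeholders, boto3 is assumed to be installed on the cluster, and `df.collect()` is only acceptable for small micro-batches (a production version would write from the executors via `foreachPartition` or a Kinesis Producer Library equivalent).

```python
import json


def chunk_records(records, max_records=500, max_bytes=1_000_000):
    """Yield lists of records that respect Kinesis PutRecords limits:
    at most 500 records and roughly 1 MB of payload per request
    (the per-shard ingest quota is 1 MB/s and 1,000 records/s)."""
    batch, size = [], 0
    for rec in records:
        rec_size = len(rec["Data"]) + len(rec["PartitionKey"].encode("utf-8"))
        if batch and (len(batch) >= max_records or size + rec_size > max_bytes):
            yield batch
            batch, size = [], 0
        batch.append(rec)
        size += rec_size
    if batch:
        yield batch


def write_batch_to_kinesis(df, batch_id):
    """foreachBatch sink: serialize each micro-batch row to JSON and
    send it to Kinesis with boto3, one chunked PutRecords call at a time.
    Runs on the driver in this simplified form."""
    import boto3  # assumption: boto3 is available on the cluster

    client = boto3.client("kinesis", region_name="us-east-1")  # placeholder region
    records = [
        {
            "Data": json.dumps(row.asDict()).encode("utf-8"),
            "PartitionKey": str(row["id"]),  # assumes an 'id' column exists
        }
        for row in df.collect()  # fine for small micro-batches only
    ]
    for batch in chunk_records(records):
        client.put_records(StreamName="my-stream", Records=batch)  # placeholder name


# Usage (sketch): a processing-time trigger spaces out micro-batches,
# which also helps stay under the per-shard throughput quota.
# query = (streaming_df.writeStream
#          .foreachBatch(write_batch_to_kinesis)
#          .trigger(processingTime="5 seconds")
#          .start())
```

Note that `put_records` can partially fail (`FailedRecordCount` in the response), so a robust sink would also inspect the response and retry throttled records with backoff.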