Github user emlaver commented on the issue:
https://github.com/apache/bahir/pull/45
WIP: While running tests against databases larger than 500 MB, a
`java.lang.OutOfMemoryError: Java heap space` error would occur (even when
setting `--conf spark.driver.memory=10g`). I believe this has to do with how
the HTTP request is set up and issued against the `_changes` API in
[JsonStoreDataAccess.scala](https://github.com/apache/bahir/pull/45/files#diff-ab440bd537d48f7cf58cd9cf0ea143b1).
The good news is that I've created a test that uses Spark Streaming (via
`CloudantReceiver.java`) to read all docs from a Cloudant database into a Spark
DataFrame; it should also work for a SQL temp table. I ran this test without any
Java heap errors against databases of 1 GB, 1.8 GB, and 14.2 GB.
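For reference, here is a minimal sketch of the kind of streaming test described above. The connection properties and the `CloudantReceiver` constructor arguments shown are assumptions for illustration, not the exact test code from this PR:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.bahir.cloudant.CloudantReceiver

object CloudantStreamingTest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("CloudantStreamingTest")
      .master("local[*]")
      .getOrCreate()

    // Micro-batch interval is arbitrary here.
    val ssc = new StreamingContext(spark.sparkContext, Seconds(10))

    // Hypothetical receiver configuration; the actual CloudantReceiver
    // constructor arguments may differ from this map.
    val changes = ssc.receiverStream(new CloudantReceiver(Map(
      "cloudant.host"     -> "ACCOUNT.cloudant.com",
      "cloudant.username" -> "USERNAME",
      "cloudant.password" -> "PASSWORD",
      "database"          -> "DB_NAME")))

    changes.foreachRDD { rdd =>
      if (!rdd.isEmpty()) {
        import spark.implicits._
        // Each received record is a JSON doc string; build a DataFrame
        // and expose it as a SQL temp table for queries.
        val df = spark.read.json(rdd.toDS())
        df.createOrReplaceTempView("cloudant_docs")
        spark.sql("SELECT COUNT(*) FROM cloudant_docs").show()
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Because the receiver pulls docs incrementally from the changes feed in batches rather than materializing the whole database response at once, the driver heap pressure stays bounded, which is consistent with the large-database runs above.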