Github user ckadner commented on the issue:
https://github.com/apache/bahir/pull/50
@emlaver
> `CloudantChangesDFSuite`:
> `- save dataframe to database using createDBOnSave=true option FAILED`
**Test failures** (the same `createDBOnSave=true` test fails in both DataFrame suites with a `file_exists` error):
```
14:10:05 [INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-sql-cloudant_2.11 ---
14:10:05 Discovery starting.
14:10:05
14:10:05 Sql-cloudant tests that require Cloudant databases have been enabled by
14:10:05 the environment variables CLOUDANT_USER and CLOUDANT_PASSWORD.
14:10:05
14:10:05 Discovery completed in 187 milliseconds.
14:10:05 Run starting. Expected test count is: 22
14:10:05 CloudantOptionSuite:
14:10:09 - invalid api receiver option throws an error message
14:10:09 - empty username option throws an error message
14:10:09 - empty password option throws an error message
14:10:10 - empty databaseName throws an error message
14:10:10 ClientSparkFunSuite:
14:10:10 CloudantChangesDFSuite:
14:10:34 - load and save data from Cloudant database
14:10:36 - load and count data from Cloudant search index
14:10:52 - load data and verify deleted doc is not in results
14:11:12 - load data and count rows in filtered dataframe
14:11:52 - save filtered dataframe to database
14:12:12 - save dataframe to database using createDBOnSave=true option *** FAILED ***
14:12:12 org.apache.bahir.cloudant.common.CloudantException: Database airportcodemapping_df create error: {"error":"file_exists","reason":"The database could not be created, the file already exists."}
14:12:12   at org.apache.bahir.cloudant.common.JsonStoreDataAccess.createDB(JsonStoreDataAccess.scala:143)
14:12:12   at org.apache.bahir.cloudant.CloudantReadWriteRelation.insert(DefaultSource.scala:72)
14:12:12   at org.apache.bahir.cloudant.DefaultSource.createRelation(DefaultSource.scala:172)
14:12:12   at org.apache.bahir.cloudant.DefaultSource.createRelation(DefaultSource.scala:86)
14:12:12   at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:472)
14:12:12   at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:48)
14:12:12   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
14:12:12   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
14:12:12   at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
14:12:12   at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
14:12:12 ...
14:12:13 - load and count data from view
14:12:13 - load data from view with MapReduce function
14:12:53 - load data and verify total count of selector, filter, and view option
14:12:53 CloudantSparkSQLSuite:
14:12:56 - verify results from temp view of database n_airportcodemapping
14:12:59 - verify results from temp view of index in n_flight
14:13:00 CloudantAllDocsDFSuite:
14:13:03 - load and save data from Cloudant database
14:13:04 - load and count data from Cloudant search index
14:13:04 - load data and count rows in filtered dataframe
14:13:06 - save filtered dataframe to database
14:13:07 - save dataframe to database using createDBOnSave=true option *** FAILED ***
14:13:07 org.apache.bahir.cloudant.common.CloudantException: Database airportcodemapping_df create error: {"error":"file_exists","reason":"The database could not be created, the file already exists."}
14:13:07   at org.apache.bahir.cloudant.common.JsonStoreDataAccess.createDB(JsonStoreDataAccess.scala:143)
14:13:07   at org.apache.bahir.cloudant.CloudantReadWriteRelation.insert(DefaultSource.scala:72)
14:13:07   at org.apache.bahir.cloudant.DefaultSource.createRelation(DefaultSource.scala:172)
14:13:07   at org.apache.bahir.cloudant.DefaultSource.createRelation(DefaultSource.scala:86)
14:13:07   at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:472)
14:13:07   at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:48)
14:13:07   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
14:13:07   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
14:13:07   at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
14:13:07   at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
14:13:07 ...
14:13:07 - load and count data from view
14:13:07 - load data from view with MapReduce function
14:13:08 Run completed in 3 minutes, 2 seconds.
14:13:08 Total number of tests run: 22
14:13:08 Suites: completed 6, aborted 0
14:13:08 Tests: succeeded 20, failed 2, canceled 0, ignored 0, pending 0
14:13:08 *** 2 TESTS FAILED ***
14:13:08 [INFO]
------------------------------------------------------------------------
...
14:13:08 [INFO] Apache Bahir - Spark SQL Cloudant DataSource ....... FAILURE [03:24 min]
...
```
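Both failures are the same symptom: `airportcodemapping_df` apparently survives from an earlier run (or the sibling suite), so the second create of that database gets CouchDB's `file_exists` error. One possible fix, sketched below with hypothetical helper names (this is not Bahir's actual code), would be to treat `file_exists` as idempotent success in the `createDBOnSave` path; deleting the database in the test's teardown would work equally well:

```scala
// Sketch: treat CouchDB/Cloudant's "file_exists" response to a database
// create as success instead of raising CloudantException.
// `ensureDatabase` and `createDbRaw` are hypothetical stand-ins for the
// HTTP PUT /{db} call, not Bahir's real API.
def ensureDatabase(name: String, createDbRaw: String => String): Unit = {
  val body = createDbRaw(name)
  val alreadyExists = body.contains("\"error\":\"file_exists\"")
  if (body.contains("\"error\"") && !alreadyExists) {
    throw new RuntimeException(s"Database $name create error: $body")
  }
  // {"ok":true} or file_exists: the database is usable, nothing more to do.
}

// Simulated responses matching the log above:
val firstPut  = (_: String) => """{"ok":true}"""
val secondPut = (_: String) =>
  """{"error":"file_exists","reason":"The database could not be created, the file already exists."}"""

ensureDatabase("airportcodemapping_df", firstPut)  // fresh database: fine
ensureDatabase("airportcodemapping_df", secondPut) // leftover database: now tolerated
```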
---