[ https://issues.apache.org/jira/browse/BAHIR-123?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16181612#comment-16181612 ]

ASF GitHub Bot commented on BAHIR-123:
--------------------------------------

Github user ckadner commented on the issue:

    https://github.com/apache/bahir/pull/50
  
    @emlaver
    
    > `CloudantChangesDFSuite`: 
    > `- save dataframe to database using createDBOnSave=true option FAILED`
    
    **Test failures:** 
    ```
    14:10:05 [INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-sql-cloudant_2.11 ---
    14:10:05 Discovery starting.
    14:10:05 
    14:10:05 Sql-cloudant tests that require Cloudant databases have been enabled by
    14:10:05 the environment variables CLOUDANT_USER and CLOUDANT_PASSWORD.
    14:10:05         
    14:10:05 Discovery completed in 187 milliseconds.
    14:10:05 Run starting. Expected test count is: 22
    14:10:05 CloudantOptionSuite:
    14:10:09 - invalid api receiver option throws an error message
    14:10:09 - empty username option throws an error message
    14:10:09 - empty password option throws an error message
    14:10:10 - empty databaseName throws an error message
    14:10:10 ClientSparkFunSuite:
    14:10:10 CloudantChangesDFSuite:
    14:10:34 - load and save data from Cloudant database
    14:10:36 - load and count data from Cloudant search index
    14:10:52 - load data and verify deleted doc is not in results
    14:11:12 - load data and count rows in filtered dataframe
    14:11:52 - save filtered dataframe to database
    14:12:12 - save dataframe to database using createDBOnSave=true option *** FAILED ***
    14:12:12   org.apache.bahir.cloudant.common.CloudantException: Database airportcodemapping_df create error: {"error":"file_exists","reason":"The database could not be created, the file already exists."}
    14:12:12   at org.apache.bahir.cloudant.common.JsonStoreDataAccess.createDB(JsonStoreDataAccess.scala:143)
    14:12:12   at org.apache.bahir.cloudant.CloudantReadWriteRelation.insert(DefaultSource.scala:72)
    14:12:12   at org.apache.bahir.cloudant.DefaultSource.createRelation(DefaultSource.scala:172)
    14:12:12   at org.apache.bahir.cloudant.DefaultSource.createRelation(DefaultSource.scala:86)
    14:12:12   at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:472)
    14:12:12   at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:48)
    14:12:12   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
    14:12:12   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
    14:12:12   at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
    14:12:12   at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
    14:12:12   ...
    14:12:13 - load and count data from view
    14:12:13 - load data from view with MapReduce function
    14:12:53 - load data and verify total count of selector, filter, and view option
    14:12:53 CloudantSparkSQLSuite:
    14:12:56 - verify results from temp view of database n_airportcodemapping
    14:12:59 - verify results from temp view of index in n_flight
    14:13:00 CloudantAllDocsDFSuite:
    14:13:03 - load and save data from Cloudant database
    14:13:04 - load and count data from Cloudant search index
    14:13:04 - load data and count rows in filtered dataframe
    14:13:06 - save filtered dataframe to database
    14:13:07 - save dataframe to database using createDBOnSave=true option *** FAILED ***
    14:13:07   org.apache.bahir.cloudant.common.CloudantException: Database airportcodemapping_df create error: {"error":"file_exists","reason":"The database could not be created, the file already exists."}
    14:13:07   at org.apache.bahir.cloudant.common.JsonStoreDataAccess.createDB(JsonStoreDataAccess.scala:143)
    14:13:07   at org.apache.bahir.cloudant.CloudantReadWriteRelation.insert(DefaultSource.scala:72)
    14:13:07   at org.apache.bahir.cloudant.DefaultSource.createRelation(DefaultSource.scala:172)
    14:13:07   at org.apache.bahir.cloudant.DefaultSource.createRelation(DefaultSource.scala:86)
    14:13:07   at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:472)
    14:13:07   at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:48)
    14:13:07   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
    14:13:07   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
    14:13:07   at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
    14:13:07   at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
    14:13:07   ...
    14:13:07 - load and count data from view
    14:13:07 - load data from view with MapReduce function
    14:13:08 Run completed in 3 minutes, 2 seconds.
    14:13:08 Total number of tests run: 22
    14:13:08 Suites: completed 6, aborted 0
    14:13:08 Tests: succeeded 20, failed 2, canceled 0, ignored 0, pending 0
    14:13:08 *** 2 TESTS FAILED ***
    14:13:08 [INFO] ------------------------------------------------------------------------
    ...
    14:13:08 [INFO] Apache Bahir - Spark SQL Cloudant DataSource ....... FAILURE [03:24 min]
    ...
    ```
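
    The `file_exists` error suggests the target database `airportcodemapping_df` was left over from an earlier run, so the `createDBOnSave=true` path fails when it tries to create it again. Below is a minimal sketch of deleting such a leftover database before re-running the suite; it assumes the usual `<account>.cloudant.com` host pattern and reuses the `CLOUDANT_USER`/`CLOUDANT_PASSWORD` variables the tests already rely on, so treat the exact host and database name as placeholders rather than as part of the existing test code.

    ```
    // Sketch only: remove a leftover test database via the Cloudant HTTP API
    // so that a later save with createDBOnSave=true can create it afresh.
    import java.net.{HttpURLConnection, URL}
    import java.util.Base64

    object CleanupLeftoverDb {
      def main(args: Array[String]): Unit = {
        val user = sys.env("CLOUDANT_USER")          // same env vars the tests use
        val password = sys.env("CLOUDANT_PASSWORD")
        val db = "airportcodemapping_df"             // database reported in the failure

        val url = new URL(s"https://$user.cloudant.com/$db")
        val conn = url.openConnection().asInstanceOf[HttpURLConnection]
        conn.setRequestMethod("DELETE")
        val auth = Base64.getEncoder.encodeToString(s"$user:$password".getBytes("UTF-8"))
        conn.setRequestProperty("Authorization", s"Basic $auth")

        // 200/202 = deleted, 404 = nothing to clean up; anything else is unexpected
        println(s"DELETE /$db -> ${conn.getResponseCode}")
        conn.disconnect()
      }
    }
    ```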


> Fix errors to support the latest version of Play JSON library for sql-cloudant
> ------------------------------------------------------------------------------
>
>                 Key: BAHIR-123
>                 URL: https://issues.apache.org/jira/browse/BAHIR-123
>             Project: Bahir
>          Issue Type: Task
>            Reporter: Esteban Laver
>            Assignee: Esteban Laver
>            Priority: Minor
>
> The latest play-json version is 2.6.2. The following errors occur during `mvn install -pl sql-cloudant` after updating play-json to 2.6.2 in sql-cloudant/pom.xml:
> [ERROR] /Users/estebanmlaver/emlaver-bahir/sql-cloudant/src/main/scala/org/apache/bahir/cloudant/common/JsonStoreConfigManager.scala:19: object typesafe is not a member of package com
> [ERROR] import com.typesafe.config.ConfigFactory
> [ERROR]            ^
> [ERROR] /Users/estebanmlaver/emlaver-bahir/sql-cloudant/src/main/scala/org/apache/bahir/cloudant/common/JsonStoreConfigManager.scala:52: not found: value ConfigFactory
> [ERROR]   private val configFactory = ConfigFactory.load()
> [ERROR]                               ^
> [ERROR] two errors found
> Maven compile dependencies that were present with play-json 2.5.9 but were removed in 2.6.2 need to be added to pom.xml explicitly.
> Additional info from Patrick Titzler on the differences between play-json versions 2.5.x and 2.6.x:
> Looks like the `JsArray` parameter data type has been changed from `Seq[JsValue]`
> (https://www.playframework.com/documentation/2.5.x/api/scala/index.html#play.api.libs.json.JsArray)
> to `IndexedSeq[JsValue]`
> (https://playframework.com/documentation/2.6.x/api/scala/index.html#play.api.libs.json.JsArray).
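
The first pair of errors goes away once the library that provides `com.typesafe.config.ConfigFactory` is declared directly in sql-cloudant/pom.xml, as the description notes, instead of arriving through play-json. For the `JsArray` change, here is a minimal sketch with hypothetical values that are not taken from the sql-cloudant sources: in play-json 2.6.x the constructor expects an `IndexedSeq[JsValue]`, so call sites that passed a plain `Seq` need an explicit conversion.

```
// Sketch of the play-json 2.5.x -> 2.6.x JsArray signature change.
// In 2.5.x, JsArray(value: Seq[JsValue]) accepted any Seq; in 2.6.x the
// parameter is IndexedSeq[JsValue], so a List needs .toIndexedSeq.
import play.api.libs.json.{JsArray, JsNumber, JsValue}

object JsArrayMigrationSketch {
  def main(args: Array[String]): Unit = {
    val values: Seq[JsValue] = List(JsNumber(1), JsNumber(2), JsNumber(3))

    // Compiled as-is on 2.5.x; on 2.6.x the explicit conversion is required.
    val arr = JsArray(values.toIndexedSeq)

    println(arr) // [1,2,3]
  }
}
```

Pattern matches such as `case JsArray(items) =>` keep compiling, since the single field was only narrowed from `Seq[JsValue]` to `IndexedSeq[JsValue]`.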



