GitHub user Fokko commented on the issue:

    https://github.com/apache/spark/pull/21596
  
    When looking at the history server, we have a similar issue. See the failing `stage_list_json` test in `HistoryServerSuite`:
    https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/92231/testReport/org.apache.spark.deploy.history/HistoryServerSuite/stage_list_json/
    
    I've formatted the JSON for readability. Expected:
    ```json
    [{
        "status": "COMPLETE",
        "stageId": 3,
        "attemptId": 0,
        "numTasks": 8,
        "numActiveTasks": 0,
        "numCompleteTasks": 8,
        "numFailedTasks": 0,
        "numKilledTasks": 0,
        "numCompletedIndices": 8,
        "executorRunTime": 162,
        "executorCpuTime": 0,
        "submissionTime": "2015-02-03T16:43:07.191GMT",
        "firstTaskLaunchedTime": "2015-02-03T16:43:07.191GMT",
        "completionTime": "2015-02-03T16:43:07.226GMT",
        "inputBytes": 160,
        "inputRecords": 0,
        "outputBytes": 0,
        "outputRecords": 0,
        "shuffleReadBytes": 0,
        "shuffleReadRecords": 0,
        "shuffleWriteBytes": 0,
        "shuffleWriteRecords": 0,
        "memoryBytesSpilled": 0,
        "diskBytesSpilled": 0,
        "name": "count at <console>:17",
        "details": 
"org.apache.spark.rdd.RDD.count(RDD.scala:910)\n$line19.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:17)\n$line19.$read$$iwC$$iwC$$iwC.<init>(<console>:22)\n$line19.$read$$iwC$$iwC.<init>(<console>:24)\n$line19.$read$$iwC.<init>(<console>:26)\n$line19.$read.<init>(<console>:28)\n$line19.$read$.<init>(<console>:32)\n$line19.$read$.<clinit>(<console>)\n$line19.$eval$.<init>(<console>:7)\n$line19.$eval$.<clinit>(<console>)\n$line19.$eval.$print(<console>)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native
 
Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:606)\norg.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)\norg.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)\norg.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)\norg.apache.spark.repl.Spar
 
kIMain.interpret(SparkIMain.scala:705)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)",
        "schedulingPool": "default",
        "rddIds": [6, 5],
        "accumulatorUpdates": [],
        "killedTasksSummary": {}
    }, ... ]
    ```
    Found:
    ```json
    [{
        "status": "COMPLETE",
        "stageId": 3,
        "attemptId": 0,
        "numTasks": 8,
        "numActiveTasks": 0,
        "numCompleteTasks": 8,
        "numFailedTasks": 0,
        "numKilledTasks": 0,
        "numCompletedIndices": 8,
        "executorRunTime": 162,
        "executorCpuTime": 0,
        "submissionTime": "2015-02-03T16:43:07.191GMT",
        "firstTaskLaunchedTime": "2015-02-03T16:43:07.191GMT",
        "completionTime": "2015-02-03T16:43:07.226GMT",
        "failureReason": null,
        "inputBytes": 160,
        "inputRecords": 0,
        "outputBytes": 0,
        "outputRecords": 0,
        "shuffleReadBytes": 0,
        "shuffleReadRecords": 0,
        "shuffleWriteBytes": 0,
        "shuffleWriteRecords": 0,
        "memoryBytesSpilled": 0,
        "diskBytesSpilled": 0,
        "name": "count at <console>:17",
        "description": null,
        "details": 
"org.apache.spark.rdd.RDD.count(RDD.scala:910)\n$line19.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:17)\n$line19.$read$$iwC$$iwC$$iwC.<init>(<console>:22)\n$line19.$read$$iwC$$iwC.<init>(<console>:24)\n$line19.$read$$iwC.<init>(<console>:26)\n$line19.$read.<init>(<console>:28)\n$line19.$read$.<init>(<console>:32)\n$line19.$read$.<clinit>(<console>)\n$line19.$eval$.<init>(<console>:7)\n$line19.$eval$.<clinit>(<console>)\n$line19.$eval.$print(<console>)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native
 
Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:606)\norg.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)\norg.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)\norg.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)\norg.apache.spark.repl.Spar
 
kIMain.interpret(SparkIMain.scala:705)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)",
        "schedulingPool": "default",
        "rddIds": [6, 5],
        "accumulatorUpdates": [],
        "tasks": null,
        "executorSummary": null,
        "killedTasksSummary": {}
    }, ... ]
    ```
    
    We observe that `"failureReason": null` is included in the output, while it should have been omitted (the same goes for the other `null` fields in the found output: `description`, `tasks`, and `executorSummary`).
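
    For context, this kind of omission is usually controlled by the Jackson `ObjectMapper`'s serialization inclusion setting. Below is a minimal, self-contained sketch of the behaviour the test expects; the `Stage` case class is a hypothetical stand-in for Spark's `StageData`, not the actual class:

    ```scala
    import com.fasterxml.jackson.annotation.JsonInclude
    import com.fasterxml.jackson.databind.ObjectMapper
    import com.fasterxml.jackson.module.scala.DefaultScalaModule

    // Hypothetical stand-in for the history server's API objects;
    // failureReason is only present for failed stages.
    case class Stage(stageId: Int, failureReason: Option[String])

    object NullOmissionSketch extends App {
      val mapper = new ObjectMapper()
      mapper.registerModule(DefaultScalaModule)

      // NON_ABSENT drops both nulls and empty Options, so an unset
      // failureReason is omitted rather than rendered as "failureReason": null.
      mapper.setSerializationInclusion(JsonInclude.Include.NON_ABSENT)

      println(mapper.writeValueAsString(Stage(3, None)))  // prints {"stageId":3}
    }
    ```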


