[ 
https://issues.apache.org/jira/browse/SPARK-38911?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-38911:
------------------------------------

    Assignee:     (was: Apache Spark)

> 'test 1 resource profile' throws exception when running it in IDEA separately
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-38911
>                 URL: https://issues.apache.org/jira/browse/SPARK-38911
>             Project: Spark
>          Issue Type: Test
>          Components: Tests
>    Affects Versions: 3.2.1
>            Reporter: Bobby Wang
>            Priority: Minor
>
> The test `test 1 resource profile` in DAGSchedulerSuite fails when it is run 
> separately in IDEA.
>     
> The root cause is that the ResourceProfile is initialized before the 
> SparkContext, so it takes `DEFAULT_RESOURCE_PROFILE_ID` as its resource 
> profile id, but the test asserts that the id is not equal to 
> DEFAULT_RESOURCE_PROFILE_ID.
>     
> {code:java}
> assert(expectedid.get != ResourceProfile.DEFAULT_RESOURCE_PROFILE_ID){code}
>     
> The exception is as follows:
>     
> {code:java}
>     0 equaled 0
>     ScalaTestFailureLocation: org.apache.spark.scheduler.DAGSchedulerSuite at 
> (DAGSchedulerSuite.scala:3269)
>     org.scalatest.exceptions.TestFailedException: 0 equaled 0
>             at 
> org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
>             at 
> org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
>             at 
> org.scalatest.Assertions$.newAssertionFailedException(Assertions.scala:1231)
>             at 
> org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:1295)
>             at 
> org.apache.spark.scheduler.DAGSchedulerSuite.$anonfun$new$191(DAGSchedulerSuite.scala:3269)
>             at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85){code}
> This issue does not occur when running the whole DAGSchedulerSuite, since 
> the SparkContext is initialized at the very beginning.
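> The collision can be illustrated with a minimal, hypothetical sketch of a 
> counter-based id scheme (this is not Spark's actual code; the class and 
> method names are invented for illustration): the first profile created in 
> the JVM receives id 0, which is exactly DEFAULT_RESOURCE_PROFILE_ID.
>     
> {code:java}
> import java.util.concurrent.atomic.AtomicInteger;
> 
> // Hypothetical sketch of a counter-based profile-id scheme.
> class ProfileIds {
>     static final int DEFAULT_RESOURCE_PROFILE_ID = 0;
>     private static final AtomicInteger nextId = new AtomicInteger(0);
>     static int getNextId() { return nextId.getAndIncrement(); }
> }
> 
> public class Demo {
>     public static void main(String[] args) {
>         // Running the test alone: nothing has consumed id 0 yet, so the
>         // first profile's id collides with the default id.
>         int firstId = ProfileIds.getNextId();
>         System.out.println(firstId == ProfileIds.DEFAULT_RESOURCE_PROFILE_ID);
>     }
> }{code}
> When the full suite runs first, the SparkContext has already claimed id 0 
> for the default profile, so later profiles get non-zero ids and the 
> assertion passes.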
>  
> I will submit a patch to fix it.



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
