[
https://issues.apache.org/jira/browse/SPARK-35392?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17343684#comment-17343684
]
zhengruifeng commented on SPARK-35392:
--------------------------------------
This GMM test is highly unstable: it tends to fail if we change the number of
partitions or merely change the way the sum of weights is computed.
I think we can just disable the {{summary.logLikelihood}} check for now,
and add a more robust test in the future.
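Part of why the partition count matters: floating-point addition is not associative, so the order in which per-partition partial sums are combined changes the final total in the last bits, and an exact-match doctest on {{summary.logLikelihood}} amplifies that. A minimal plain-Python sketch (not Spark code) of the effect:

```python
# Floating-point addition is not associative: summing the same values
# in a different order (as happens when the number of partitions changes)
# can yield a different result.
vals = [1e16, 1.0, -1e16]

# Left-to-right: the 1.0 is absorbed into 1e16 before the cancellation.
left_to_right = (vals[0] + vals[1]) + vals[2]

# Reordered (e.g. a different partitioning): cancellation happens first.
reordered = (vals[0] + vals[2]) + vals[1]

print(left_to_right)  # 0.0
print(reordered)      # 1.0
```

This is why a tolerance-based or invariant-based check (e.g. that the log-likelihood is finite and non-decreasing across iterations) would be more stable than comparing against a hard-coded value.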
As to this failure, is it related to
[https://github.com/apache/spark/pull/32415?] [~srowen]
> Flaky Test: spark/spark/python/pyspark/ml/clustering.py GaussianMixture
> -----------------------------------------------------------------------
>
> Key: SPARK-35392
> URL: https://issues.apache.org/jira/browse/SPARK-35392
> Project: Spark
> Issue Type: Bug
> Components: ML, PySpark
> Affects Versions: 3.2.0
> Reporter: Hyukjin Kwon
> Priority: Major
>
> https://github.com/apache/spark/runs/2568540411
> {code}
> **********************************************************************
> File "/__w/spark/spark/python/pyspark/ml/clustering.py", line 276, in
> __main__.GaussianMixture
> Failed example:
> summary.logLikelihood
> Expected:
> 65.02945...
> Got:
> 93.36008975083433
> **********************************************************************
> {code}
--
This message was sent by Atlassian Jira
(v8.3.4#803005)