[ 
https://issues.apache.org/jira/browse/SPARK-18548?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15691359#comment-15691359
 ] 

Xiaoye Sun commented on SPARK-18548:
------------------------------------

Hi Yuhao,

Sorry for the confusion earlier.

expElogbetaBc is the broadcast variable holding expElogbeta.
It is unpersisted between the line

val statsSum: BDM[Double] = stats.map(_._1).reduce(_ += _)

and 

val gammat: BDM[Double] =
  breeze.linalg.DenseMatrix.vertcat(
    stats.map(_._2).reduce(_ ++ _).map(_.toDenseMatrix): _*)

However, the latter line also uses expElogbetaBc, which causes the worker 
to fetch expElogbetaBc again after it has been unpersisted. 

I suggest moving the line 

expElogbetaBc.unpersist()

to after the line 

val gammat: BDM[Double] =
  breeze.linalg.DenseMatrix.vertcat(
    stats.map(_._2).reduce(_ ++ _).map(_.toDenseMatrix): _*)
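The proposed ordering can be sketched as follows. This is a minimal excerpt, not a complete patch: it assumes the surrounding submitMiniBatch() context in OnlineLDAOptimizer, where `stats` is the RDD of per-partition (BDM[Double], List[BDV[Double]]) results whose computation references expElogbetaBc.

```scala
// Sketch of the proposed reordering inside submitMiniBatch() (excerpt only;
// assumes Spark MLlib's OnlineLDAOptimizer context and Breeze imports).

// First action over `stats` - recomputes the partitions and fetches the
// expElogbetaBc broadcast on each executor.
val statsSum: BDM[Double] = stats.map(_._1).reduce(_ += _)

// Second action over `stats` - also depends on expElogbetaBc, so the
// broadcast must still be available here.
val gammat: BDM[Double] =
  breeze.linalg.DenseMatrix.vertcat(
    stats.map(_._2).reduce(_ ++ _).map(_.toDenseMatrix): _*)

// Unpersist only after the last action that references the broadcast,
// so executors do not have to re-fetch the large broadcast data.
expElogbetaBc.unpersist()
```

The key point is that both reduce() calls are actions that re-evaluate `stats`, so unpersisting between them forces each executor to re-download the broadcast for the second action.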




> OnlineLDAOptimizer reads the same broadcast data after deletion
> ---------------------------------------------------------------
>
>                 Key: SPARK-18548
>                 URL: https://issues.apache.org/jira/browse/SPARK-18548
>             Project: Spark
>          Issue Type: Improvement
>          Components: MLlib
>    Affects Versions: 1.6.1
>            Reporter: Xiaoye Sun
>            Priority: Trivial
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> In submitMiniBatch() of OnlineLDAOptimizer, the broadcast variable 
> expElogbeta is deleted before its second use, which causes the 
> executor to read the same large broadcast data twice. I suggest moving the 
> broadcast data deletion (expElogbetaBc.unpersist()) later. 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
