kaknikhil commented on a change in pull request #432: MADLIB-1351 : Added stopping criteria on perplexity to LDA
URL: https://github.com/apache/madlib/pull/432#discussion_r333279341
 
 

 ##########
 File path: src/ports/postgres/modules/lda/lda.sql_in
 ##########
 @@ -212,6 +220,10 @@ lda_train( data_table,
     <dt>beta</dt>
     <dd>DOUBLE PRECISION. Dirichlet prior for the per-topic
     word multinomial (e.g., 0.01 is a reasonable value to start with).</dd>
+    <dt>evaluate_every</dt>
+    <dd>INTEGER, optional (default=0). How often to evaluate perplexity,
+    measured in iterations. Set to 0 or a negative number to skip perplexity
+    evaluation during training entirely. Evaluating perplexity can help you
+    monitor convergence during training, but it also increases total training
+    time; evaluating it at every iteration can increase training time by up
+    to two-fold.</dd>
+    <dt>perplexity_tol</dt>
+    <dd>DOUBLE PRECISION, optional (default=0.1). Perplexity tolerance used
+    as the stopping criterion: iteration stops when the change in perplexity
+    between evaluations falls below this value. Only used when
+    evaluate_every is greater than 0.</dd>
 
 Review comment:
   maybe @fmcquillan99 can add a more verbose explanation here.
   
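To make the suggested explanation concrete, a hedged usage sketch could accompany the parameter docs. The table and column names below are illustrative (not from this PR), and the positional argument order is assumed to follow the `lda_train` signature documented earlier in `lda.sql_in`:

```sql
-- Hypothetical example: train LDA with perplexity-based early stopping.
-- Input/output table names are placeholders.
SELECT madlib.lda_train(
    'documents_tf',   -- data_table: term-frequency input table
    'lda_model',      -- model_table: output model table
    'lda_output',     -- output_data_table: per-document topic assignments
    10000,            -- voc_size: vocabulary size
    5,                -- topic_num: number of topics
    50,               -- iter_num: maximum number of iterations
    5.0,              -- alpha: Dirichlet prior for per-document topics
    0.01,             -- beta: Dirichlet prior for per-topic words
    5,                -- evaluate_every: compute perplexity every 5 iterations
    0.1               -- perplexity_tol: stop when perplexity change < 0.1
);
```

With these settings, training would end early (before `iter_num` iterations) once the perplexity change between two consecutive evaluations drops below the tolerance.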

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services