Gurvinder created TOREE-315:
-------------------------------

             Summary: SparkR interpreter hangs when running on Spark 1.6.1
                 Key: TOREE-315
                 URL: https://issues.apache.org/jira/browse/TOREE-315
             Project: TOREE
          Issue Type: Bug
    Affects Versions: 0.1.0
         Environment: Spark 1.6.1
Jupyter notebook 4.2.0
R 3.2.2
Toree 0.1.0.dev7
            Reporter: Gurvinder
            Priority: Minor


Create a notebook with the Scala interpreter and, once the notebook has started 
with a Spark context, run the SparkR magic as

%%SparkR sc

This causes the notebook to hang, with no further execution of any cells; the 
notebook has to be restarted. The issue seems to be caused by the 
patching/updating of the installed SparkR package in the Spark directory 
(/usr/local/spark/R/libs/SparkR, in my case). Here is the diff between the 
default installed SparkR and the updated one:

diff SparkR/DESCRIPTION SparkR.bak/DESCRIPTION
4c4
< Version: 1.6.1
---
> Version: 1.5.0
16,18c16,17
<         'functions.R' 'mllib.R' 'serialize.R' 'sparkR.R' 'stats.R'
<         'types.R' 'utils.R'
< Built: R 3.1.1; ; 2016-02-27 04:45:19 UTC; unix
---
>         'functions.R' 'mllib.R' 'serialize.R' 'sparkR.R' 'utils.R'
> Built: R 3.2.1; ; 2015-10-01 18:38:50 UTC; unix
Only in SparkR: INDEX
Common subdirectories: SparkR/Meta and SparkR.bak/Meta
diff SparkR/NAMESPACE SparkR.bak/NAMESPACE
12a13,22
> # Needed exports for runner
> export("sparkR.connect")
> export("isInstanceOf")
> export("callJMethod")
> export("callJStatic")
> export("newJObject")
> export("removeJObject")
> export("isRemoveMethod")
> export("invokeJava")
>
26,27d35
<               "as.data.frame",
<               "attach",
30,33d37
<               "colnames",
<               "colnames<-",
<               "coltypes",
<               "coltypes<-",
36,37d39
<               "cov",
<               "corr",
49d50
<               "freqItems",

Also, once the default version has been patched, running sparkR from the 
terminal, which worked earlier, now gives this error:

Launching java with spark-submit command /usr/local/spark/bin/spark-submit   
"sparkr-shell" /tmp/RtmpbfR7Ea/backend_port27b11cd7f3f 
16/05/17 18:26:02 ERROR RBackendHandler: createSparkContext on 
org.apache.spark.api.r.RRDD failed
Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
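
For anyone else hitting this, a quick way to see which SparkR build Spark will 
load is to read the Version and Built fields of its DESCRIPTION file (a 
diagnostic sketch, not part of the original report; the real path in my 
environment is /usr/local/spark/R/libs/SparkR/DESCRIPTION, and a sample file 
is created here only so the snippet runs anywhere):

```shell
# Diagnostic sketch: print the Version and Built fields of the SparkR
# package metadata, as seen in the diff above. SPARKR_DESC defaults to a
# sample file created below; point it at your real DESCRIPTION instead,
# e.g. /usr/local/spark/R/libs/SparkR/DESCRIPTION.
SPARKR_DESC=${SPARKR_DESC:-/tmp/SparkR-DESCRIPTION-sample}
printf 'Version: 1.5.0\nBuilt: R 3.2.1; ; 2015-10-01 18:38:50 UTC; unix\n' > "$SPARKR_DESC"
grep -E '^(Version|Built):' "$SPARKR_DESC"
```

A Version field that does not match the Spark release (here 1.5.0 against 
Spark 1.6.1) indicates the package has been replaced.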



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)