[ https://issues.apache.org/jira/browse/MAHOUT-1685?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14491871#comment-14491871 ]

Dmitriy Lyubimov commented on MAHOUT-1685:
------------------------------------------

Here's the commit that added the private visibility modifiers:

{panel}
commit d05c9ee6e8441e54732e40de45d1d2311307908f
Author: Chip Senkbeil <[email protected]>
Date:   Fri Jan 16 12:56:40 2015 -0800

    [SPARK-4923][REPL] Add Developer API to REPL to allow re-publishing the REPL jar

    As requested in [SPARK-4923](https://issues.apache.org/jira/browse/SPARK-4923), I've provided a rough DeveloperApi for the repl. I've only done this for Scala 2.10 because it does not appear that Scala 2.11 is implemented. The Scala 2.11 repl s

    This marks the majority of methods in `SparkIMain` as _private_ with a few special cases being _private[repl]_ as other classes within the same package access them. Any public method has been marked with `DeveloperApi` as suggested by pwendell

    As the Scala 2.11 REPL [conforms](https://github.com/scala/scala/pull/2206) to [JSR-223](http://docs.oracle.com/javase/8/docs/technotes/guides/scripting/), the [Spark Kernel](https://github.com/ibm-et/spark-kernel) uses the SparkIMain of Scal

    1. The ability to _get_ variables from the interpreter (and other information like class/symbol/type)
    2. The ability to _put_ variables into the interpreter
    3. The ability to _compile_ code
    4. The ability to _execute_ code
    5. The ability to get contextual information regarding the scripting environment

    Additional functionality that I marked as exposed included the following:

    1. The blocking initialization method (needed to actually start SparkIMain instance)
    2. The class server uri (needed to set the _spark.repl.class.uri_ property after initialization), reduced from the entire class server
    3. The class output directory (beneficial for tools like ours that need to inspect and use the directory where class files are served)
    4. Suppression (quiet/silence) mechanics for output
    5. Ability to add a jar to the compile/runtime classpath
    6. The reset/close functionality
    7. Metric information (last variable assignment, "needed" for extracting results from last execution, real variable name for better debugging)
    8. Execution wrapper (useful to have, but debatable)

    Aside from `SparkIMain`, I updated other classes/traits and their methods in the _repl_ package to be private/package protected where possible. A few odd cases (like the SparkHelper being in the scala.tools.nsc package to expose a private varia

    `SparkCommandLine` has proven useful to extract settings and `SparkJLineCompletion` has proven to be useful in implementing auto-completion in the [Spark Kernel](https://github.com/ibm-et/spark-kernel) project. Other than those - and `SparkIMai

    Tested via the following:
    
        $ export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
        $ mvn -Phadoop-2.3 -DskipTests clean package && mvn -Phadoop-2.3 test

    Also did a quick verification that I could start the shell and execute some code:
    
        $ ./bin/spark-shell
        ...
    
        scala> val x = 3
        x: Int = 3
    
        scala> sc.parallelize(1 to 10).reduce(_+_)
        ...
        res1: Int = 55
    
    Author: Chip Senkbeil <[email protected]>
    Author: Chip Senkbeil <[email protected]>
    
    Closes #4034 from rcsenkbeil/AddDeveloperApiToRepl and squashes the following commits:
    
    053ca75 [Chip Senkbeil] Fixed failed build by adding missing DeveloperApi import
    c1b88aa [Chip Senkbeil] Added DeveloperApi to public classes in repl
    6dc1ee2 [Chip Senkbeil] Added missing method to expose error reporting flag
    26fd286 [Chip Senkbeil] Refactored other Scala 2.10 classes and methods to be private/package protected where possible
    925c112 [Chip Senkbeil] Added DeveloperApi and Scaladocs to SparkIMain for Scala 2.10
{panel}
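
For anyone re-wiring the Mahout shell against that surface, here is a minimal sketch of the get/put/execute flow the commit message describes. It assumes the Spark 1.3, Scala 2.10 `SparkIMain`, and the method names (`initializeSynchronous`, `bind`, `interpret`, `valueOfTerm`, `close`) and the single-argument constructor reflect my reading of that DeveloperApi; treat them as assumptions to verify against the repl sources, not a definitive recipe.

{code:scala}
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.Results
import org.apache.spark.repl.SparkIMain

object ReplApiSketch {
  def main(args: Array[String]): Unit = {
    // Compiler settings for the embedded interpreter; reuse the JVM classpath.
    val settings = new Settings
    settings.usejavacp.value = true

    // Assumed single-argument constructor that writes output to the console.
    val intp = new SparkIMain(settings)
    // Blocking initialization (item 1 of the "additional functionality" list).
    intp.initializeSynchronous()

    // "put": bind a value into the interpreter under a given name and type.
    intp.bind("answer", "Int", 42)

    // "execute": run a line of code; the result is Success/Error/Incomplete.
    intp.interpret("val doubled = answer * 2") match {
      case Results.Success    => println("interpreted ok")
      case Results.Error      => println("compile or runtime error")
      case Results.Incomplete => println("more input expected")
    }

    // "get": read a variable back out of the interpreter.
    val doubled: Option[Any] = intp.valueOfTerm("doubled")
    println(s"doubled = $doubled") // expected: Some(84)

    // Item 6: reset/close when done.
    intp.close()
  }
}
{code}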
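
And a rough sketch of the embedding concerns from the second list: publishing the class server uri so executors can load interpreter-generated classes, suppressing interpreter output while wiring things up, and shutting down. Again, `classServerUri` and `beQuietDuring` are my reading of the 1.3 repl API, and `buildContext` is a hypothetical helper standing in for whatever setup an embedding shell actually does.

{code:scala}
import scala.tools.nsc.Settings
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.repl.SparkIMain

object EmbeddedShellSketch {
  // Hypothetical helper: create a SparkContext whose executors can fetch
  // classes compiled by the interpreter, via spark.repl.class.uri.
  def buildContext(intp: SparkIMain): SparkContext = {
    val conf = new SparkConf()
      .setAppName("embedded-shell-sketch")
      .setMaster("local[*]")
      // Item 2 of the list: publish the interpreter's class server uri so
      // executors can load classes generated for each interpreted line.
      .set("spark.repl.class.uri", intp.classServerUri)
    new SparkContext(conf)
  }

  def main(args: Array[String]): Unit = {
    val settings = new Settings
    settings.usejavacp.value = true

    val intp = new SparkIMain(settings)
    intp.initializeSynchronous()

    val sc = buildContext(intp)

    // Item 4: suppress interpreter chatter while binding the context.
    intp.beQuietDuring {
      intp.bind("sc", "org.apache.spark.SparkContext", sc)
    }

    // Interpreted code can now use the bound SparkContext.
    intp.interpret("val n = sc.parallelize(1 to 10).reduce(_ + _)")

    sc.stop()
    intp.close()
  }
}
{code}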


> Move Mahout shell to Spark 1.3+
> -------------------------------
>
>                 Key: MAHOUT-1685
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1685
>             Project: Mahout
>          Issue Type: Improvement
>          Components: Mahout spark shell
>            Reporter: Pat Ferrel
>            Assignee: Dmitriy Lyubimov
>            Priority: Critical
>             Fix For: 0.11.0
>
>         Attachments: mahout-shell-spark-1.3-errors.txt
>
>
> Building for Spark 1.3, we found that several important APIs used by the shell
> are now marked package private in Spark, making them inaccessible.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
