[ https://issues.apache.org/jira/browse/MAHOUT-1685?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14492692#comment-14492692 ]

Andrew Palumbo edited comment on MAHOUT-1685 at 4/13/15 5:40 PM:
-----------------------------------------------------------------

bq. 3) Petition them to support the API we use. This is by far the easiest and 
seems like it might be worth writing a Jira in Spark if only to get their 
response.

It seems that several other projects are doing this as well, so maybe it 
would be good to join them.  Agreed, this would be the best option by far.

Currently we are overriding {{SparkILoop}}.  We make direct calls to things 
like {{intp}}, {{echo(...)}}, and {{command(...)}}; these were all exposed 
pre-Spark 1.3.
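For reference, the kind of override in question looks roughly like the following. This is a hedged sketch only: the class name {{MahoutSparkILoop}} and the import lines are illustrative, and the member names ({{intp}}, {{echo}}, {{command}}) follow the pre-1.3 REPL, where they were still visible to subclasses.

```scala
// Hypothetical simplification of the SparkILoop subclassing pattern
// described above; assumes a pre-1.3 Spark REPL on the classpath.
import org.apache.spark.repl.SparkILoop

class MahoutSparkILoop extends SparkILoop {
  // Run Mahout's standard shell imports through the underlying interpreter.
  def runMahoutImports(): Unit = {
    echo("Loading Mahout distributed context...")          // direct echo(...) call
    intp.interpret("import org.apache.mahout.math._")      // direct use of intp
    command("import org.apache.mahout.sparkbindings._")    // direct command(...) call
  }
}
```

Against Spark 1.3 these members are package private, which is exactly where the compile errors in the attached log come from.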

{quote}
Any public method has been marked with `DeveloperApi` as suggested by pwendell 
...
4. The ability to execute code
...
Additional functionality that I marked as exposed included the following: 
...
5. Ability to add a jar to the compile/runtime classpath
{quote}

These (4, 5) refer to the {{SparkIMain}} developer API.

I believe the only things that we really need are the ability to execute the 
imports (4) and the ability to add the Mahout Spark jars to the classpath 
(5).
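If the {{DeveloperApi}} methods quoted above cover (4) and (5), the shell side could look something like this. A sketch under assumptions: the method names {{interpret}} and {{addUrlsToClassPath}} are taken from the DeveloperApi discussion quoted above and from Scala's own {{IMain}}, and have not been verified against the Spark 1.3 sources; {{prepareInterpreter}} and {{mahoutJars}} are hypothetical names.

```scala
// Sketch: drive (4) imports and (5) classpath additions through the
// SparkIMain DeveloperApi instead of SparkILoop internals.
import java.io.File
import org.apache.spark.repl.SparkIMain

def prepareInterpreter(imain: SparkIMain, mahoutJars: Seq[File]): Unit = {
  // (5) add the Mahout Spark jars to the compile/runtime classpath
  mahoutJars.foreach(jar => imain.addUrlsToClassPath(jar.toURI.toURL))
  // (4) execute the standard Mahout shell imports
  imain.interpret("import org.apache.mahout.math._")
  imain.interpret("import org.apache.mahout.sparkbindings._")
}
```

If those two entry points are stable, we would no longer need to touch {{intp}}, {{echo(...)}}, or {{command(...)}} at all.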

I do remember trying to extend {{SparkIMain}} and hitting some problems there, 
but I did not look further at it after we decided to ship 0.10.0 with Spark 1.1.1.

> Move Mahout shell to Spark 1.3+
> -------------------------------
>
>                 Key: MAHOUT-1685
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1685
>             Project: Mahout
>          Issue Type: Improvement
>          Components: Mahout spark shell
>            Reporter: Pat Ferrel
>            Assignee: Dmitriy Lyubimov
>            Priority: Critical
>             Fix For: 0.11.0
>
>         Attachments: mahout-shell-spark-1.3-errors.txt
>
>
> Building for Spark 1.3 we found several important APIs used by the shell are 
> now marked package private in Spark, making them inaccessible.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)