Github user velvia commented on the pull request:

    https://github.com/apache/incubator-spark/pull/576#issuecomment-35039082
  
    Uri,
    
    What you can do in Scala is define an implicit conversion to your own
    class, effectively extending SparkContext yourself.  We do this for our
    own private input formats.  For example:
    
    implicit class MySparkContext(sc: SparkContext) {
      def parquetJsonFile(path: String): RDD[JsValue] = ???  // reading logic elided
    }
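
    Spelled out as a compile-ready sketch of the same pattern (JsValue is
    assumed here to be spray-json's AST, the wrapper object name is made up,
    and the Parquet-reading body is left as a stub):

    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.RDD
    import spray.json.JsValue  // assumed JSON AST; any JsValue type works

    object SparkContextExtensions {
      // Importing SparkContextExtensions._ adds parquetJsonFile to SparkContext
      implicit class MySparkContext(sc: SparkContext) {
        def parquetJsonFile(path: String): RDD[JsValue] =
          ???  // format-specific reading logic would go here
      }
    }

    // Usage (sc is an existing SparkContext; the path is hypothetical):
    // import SparkContextExtensions._
    // val rows = sc.parquetJsonFile("hdfs:///path/to/data.parquet")

    With the implicit class in scope, sc.parquetJsonFile(...) reads as if it
    were a built-in SparkContext method, without adding any dependency to
    Spark itself.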
    
    
    
    On Thu, Feb 13, 2014 at 2:44 PM, Uri Laserson <notificati...@github.com> wrote:
    
    > Yes, I have since thought about it more and agree that this would actually
    > be a bad idea. No need to add additional dependencies on other specific
    > file formats. I'm closing this PR.
    >
    > --
    > Reply to this email directly or view it on GitHub<https://github.com/apache/incubator-spark/pull/576#issuecomment-35035595>
    > .
    >
    
    
    
    -- 
    The fruit of silence is prayer;
    the fruit of prayer is faith;
    the fruit of faith is love;
    the fruit of love is service;
    the fruit of service is peace.  -- Mother Teresa