Github user Wenpei commented on the pull request:

    https://github.com/apache/spark/pull/11000#issuecomment-178381846
  
    @yanboliang Sorry about the last PR; I didn't check the Scala side.
    
    For regression, only three algorithms support MLRead/MLWrite:
    LinearRegression
    IsotonicRegression
    AFTSurvivalRegression
    
    I have added the export/import API and doctests for them.
    
    However, there is one issue: a doctest fails with the exception below. The 
cause is that no default value is set for "weightCol" (IsotonicRegression) or 
"quantilesCol" (AFTSurvivalRegression) on the Scala side. I set the value in 
the constructor to make the doctest pass, but I think we should file a JIRA 
for this. What do you think?
    
    Exception detail:
        ir2 = IsotonicRegression.load(ir_path)
    Exception raised:
        Traceback (most recent call last):
          File "C:\Python27\lib\doctest.py", line 1289, in __run
            compileflags, 1) in test.globs
          File "<doctest __main__.IsotonicRegression[11]>", line 1, in <module>
            ir2 = IsotonicRegression.load(ir_path)
          File "C:\aWorkFolder\spark\spark-1.6.0-bin-hadoop2.6\spark-1.6.0-bin-hadoop2.6\python\lib\pyspark.zip\pyspark\ml\util.py", line 194, in load
            return cls.read().load(path)
          File "C:\aWorkFolder\spark\spark-1.6.0-bin-hadoop2.6\spark-1.6.0-bin-hadoop2.6\python\lib\pyspark.zip\pyspark\ml\util.py", line 148, in load
            instance._transfer_params_from_java()
          File "C:\aWorkFolder\spark\spark-1.6.0-bin-hadoop2.6\spark-1.6.0-bin-hadoop2.6\python\lib\pyspark.zip\pyspark\ml\wrapper.py", line 82, in _transfer_params_from_java
            value = _java2py(sc, self._java_obj.getOrDefault(java_param))
          File "C:\aWorkFolder\spark\spark-1.6.0-bin-hadoop2.6\spark-1.6.0-bin-hadoop2.6\python\lib\py4j-0.9-src.zip\py4j\java_gateway.py", line 813, in __call__
            answer, self.gateway_client, self.target_id, self.name)
          File "C:\aWorkFolder\spark\spark-1.6.0-bin-hadoop2.6\spark-1.6.0-bin-hadoop2.6\python\lib\pyspark.zip\pyspark\sql\utils.py", line 45, in deco
            return f(*a, **kw)
          File "C:\aWorkFolder\spark\spark-1.6.0-bin-hadoop2.6\spark-1.6.0-bin-hadoop2.6\python\lib\py4j-0.9-src.zip\py4j\protocol.py", line 308, in get_return_value
            format(target_id, ".", name), value)
        Py4JJavaError: An error occurred while calling o351.getOrDefault.
        : java.util.NoSuchElementException: Failed to find a default value for weightCol
            at org.apache.spark.ml.param.Params$$anonfun$getOrDefault$2.apply(params.scala:647)
            at org.apache.spark.ml.param.Params$$anonfun$getOrDefault$2.apply(params.scala:647)
            at scala.Option.getOrElse(Option.scala:120)
            at org.apache.spark.ml.param.Params$class.getOrDefault(params.scala:646)
            at org.apache.spark.ml.PipelineStage.getOrDefault(Pipeline.scala:43)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:483)
            at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
            at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
            at py4j.Gateway.invoke(Gateway.java:259)
            at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
            at py4j.commands.CallCommand.execute(CallCommand.java:79)
            at py4j.GatewayConnection.run(GatewayConnection.java:209)
            at java.lang.Thread.run(Thread.java:745)
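    To make the failure mode concrete, here is a minimal pure-Python sketch of 
the lookup that throws above. This is not the real pyspark/Scala `Params` 
implementation; the class names, maps, and the empty-string default are 
simplified stand-ins. It shows why `getOrDefault` raises when a param has 
neither an explicit value nor a default, and how setting the default in the 
constructor (the workaround used here) avoids it:

    ```python
    class Params:
        """Toy stand-in for the param/default lookup in Spark ML Params."""

        def __init__(self):
            self._param_map = {}          # explicitly set values
            self._default_param_map = {}  # defaults registered by the estimator

        def _set_default(self, **kwargs):
            self._default_param_map.update(kwargs)

        def get_or_default(self, param):
            if param in self._param_map:
                return self._param_map[param]
            if param in self._default_param_map:
                return self._default_param_map[param]
            # mirrors java.util.NoSuchElementException in the traceback
            raise LookupError("Failed to find a default value for %s" % param)


    class IsotonicRegressionNoDefault(Params):
        # "weightCol" never gets a default, as on the current Scala side
        pass


    class IsotonicRegressionWithDefault(Params):
        def __init__(self):
            super().__init__()
            # workaround: register the default in the constructor
            self._set_default(weightCol="")


    broken = IsotonicRegressionNoDefault()
    try:
        broken.get_or_default("weightCol")
    except LookupError as e:
        print(e)  # Failed to find a default value for weightCol

    fixed = IsotonicRegressionWithDefault()
    print(fixed.get_or_default("weightCol") == "")  # True
    ```

    With this model, load/transfer code that unconditionally calls 
`get_or_default` on every param (as `_transfer_params_from_java` does) only 
works if every param has either a set value or a registered default, which is 
why the missing Scala-side default surfaces on load rather than on fit.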

