Can you provide some code or data to reproduce the problem?

On Fri, Feb 9, 2018 at 9:42 AM, nhamwey <nicholas.ham...@thehartford.com> wrote:
> I am using Spark 2.2.0 through Python.
>
> I am repeatedly getting a "sum of weights cannot be zero" error when trying
> to run a model. This happens even when I do not specify a defined
> weightCol = "variable".
>
> Py4JJavaError: An error occurred while calling o1295.fit.
> : java.lang.AssertionError: assertion failed: Sum of weights cannot be zero.
>         at scala.Predef$.assert(Predef.scala:170)
>         at org.apache.spark.ml.optim.WeightedLeastSquares$Aggregator.validate(WeightedLeastSquares.scala:418)
>         at org.apache.spark.ml.optim.WeightedLeastSquares.fit(WeightedLeastSquares.scala:101)
>         at org.apache.spark.ml.optim.IterativelyReweightedLeastSquares.fit(IterativelyReweightedLeastSquares.scala:86)
>         at org.apache.spark.ml.regression.GeneralizedLinearRegression.train(GeneralizedLinearRegression.scala:369)
>         at org.apache.spark.ml.regression.GeneralizedLinearRegression.train(GeneralizedLinearRegression.scala:203)
>         at org.apache.spark.ml.Predictor.fit(Predictor.scala:118)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
>         at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>         at py4j.Gateway.invoke(Gateway.java:280)
>         at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
>         at py4j.commands.CallCommand.execute(CallCommand.java:79)
>         at py4j.GatewayConnection.run(GatewayConnection.java:214)
>         at java.lang.Thread.run(Thread.java:745)
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
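
One possible cause worth ruling out while preparing a repro (this is an assumption, since the data isn't shown): the weight column ends up all zero or null after filtering, so Spark's WeightedLeastSquares aggregator sees a total weight of zero and fails the assertion. The same check can be run on the data up front; below is a minimal plain-Python sketch of that validation, where `validate_weight_sum`, the row dicts, and the column name `"weight"` are all hypothetical illustration names, not Spark API:

```python
def validate_weight_sum(rows, weight_col="weight"):
    """Mimic the spirit of Spark's 'Sum of weights cannot be zero'
    assertion: the total instance weight must be strictly positive.
    A null weight is treated as zero here, which is one common way
    the assertion gets triggered in practice."""
    total = sum(row.get(weight_col) or 0.0 for row in rows)
    if total <= 0.0:
        raise ValueError(
            "Sum of weights cannot be zero; check for nulls or "
            f"all-zero values in column '{weight_col}'."
        )
    return total

# A weight column containing only nulls and zeros fails the check.
rows = [{"label": 1.0, "weight": None}, {"label": 0.0, "weight": 0.0}]
try:
    validate_weight_sum(rows)
except ValueError as exc:
    print(exc)
```

In PySpark the equivalent pre-check would be an aggregation over the weight column (e.g. summing it and verifying the result is positive) before calling `fit`, which would also show whether the problem is in the data or in the model setup.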