Github user WeichenXu123 commented on the issue:

    https://github.com/apache/spark/pull/19018
  
    cc @felixcheung 
    I hit this R test failure again, even after the seed was added.
    
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/81350/console
    error:
    ```
    Failed -------------------------------------------------------------------------
    1. Error: spark.decisionTree (@test_mllib_tree.R#355) --------------------------
    java.lang.IllegalArgumentException: requirement failed: The input column stridx_629e10e5fd28 should have at least two distinct values.
        at scala.Predef$.require(Predef.scala:224)
        at org.apache.spark.ml.feature.OneHotEncoder$$anonfun$5.apply(OneHotEncoder.scala:113)
        at org.apache.spark.ml.feature.OneHotEncoder$$anonfun$5.apply(OneHotEncoder.scala:111)
        at scala.Option.map(Option.scala:146)
        at org.apache.spark.ml.feature.OneHotEncoder.transformSchema(OneHotEncoder.scala:111)
        at org.apache.spark.ml.feature.OneHotEncoder.transform(OneHotEncoder.scala:141)
        at org.apache.spark.ml.Pipeline$$anonfun$fit$2.apply(Pipeline.scala:161)
        at org.apache.spark.ml.Pipeline$$anonfun$fit$2.apply(Pipeline.scala:149)
        at scala.collection.Iterator$class.foreach(Iterator.scala:893)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
        at scala.collection.IterableViewLike$Transformed$class.foreach(IterableViewLike.scala:44)
        at scala.collection.SeqViewLike$AbstractTransformed.foreach(SeqViewLike.scala:37)
        at org.apache.spark.ml.Pipeline.fit(Pipeline.scala:149)
        at org.apache.spark.ml.feature.RFormula.fit(RFormula.scala:270)
        at org.apache.spark.ml.r.DecisionTreeClassifierWrapper$.fit(DecisionTreeClassificationWrapper.scala:84)
        at org.apache.spark.ml.r.DecisionTreeClassifierWrapper.fit(DecisionTreeClassificationWrapper.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:167)
        at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:108)
        at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:40)
        at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
        at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
        at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.lang.Thread.run(Thread.java:745)
    1: spark.decisionTree(traindf, clicked ~ ., type = "classification", maxDepth = 5, maxBins = 16,
           seed = 46) at /home/jenkins/workspace/SparkPullRequestBuilder/R/pkg/tests/fulltests/test_mllib_tree.R:355
    2: spark.decisionTree(traindf, clicked ~ ., type = "classification", maxDepth = 5, maxBins = 16,
           seed = 46)
    3: .local(data, formula, ...)
    4: callJStatic("org.apache.spark.ml.r.DecisionTreeClassifierWrapper", "fit", data@sdf,
           formula, as.integer(maxDepth), as.integer(maxBins), impurity, as.integer(minInstancesPerNode),
           as.numeric(minInfoGain), as.integer(checkpointInterval), seed, as.integer(maxMemoryInMB),
           as.logical(cacheNodeIds), handleInvalid)
    5: invokeJava(isStatic = TRUE, className, methodName, ...)
    6: handleErrors(returnStatus, conn)
    7: stop(readString(conn))

    DONE ===========================================================================
    Error: Test failures
    Execution halted
    ```
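    
    For context, RFormula indexes each string feature with StringIndexer and then one-hot encodes it, and OneHotEncoder requires the indexed column to carry at least two distinct values. A minimal SparkR sketch of how that requirement can fail when a categorical column ends up with a single level (the data and column names below are made up for illustration, not the actual test fixture):
    ```r
    library(SparkR)
    sparkR.session()
    
    # Hypothetical tiny dataset: 'browser' has only one distinct level,
    # which is what a small random split of small test data can produce.
    df <- createDataFrame(data.frame(
      clicked = c(0, 0, 1, 1),
      browser = c("chrome", "chrome", "chrome", "chrome")
    ))
    
    # RFormula expands 'browser' via StringIndexer + OneHotEncoder; the encoder
    # then fails with "should have at least two distinct values", as in the log above.
    model <- spark.decisionTree(df, clicked ~ ., type = "classification",
                                maxDepth = 5, maxBins = 16, seed = 46)
    ```
    So even with a fixed seed, the split of the test data apparently still produces a degenerate categorical column.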
    
    Would you like to take a look?

