Hi,

Just hit this error this morning using a fresh build of Spark
2.2.0-SNAPSHOT (with a few local modifications):

scala> Seq(0 to 8).toDF
scala.MatchError: scala.collection.immutable.Range.Inclusive (of class scala.reflect.internal.Types$ClassNoArgsTypeRef)
  at org.apache.spark.sql.catalyst.ScalaReflection$.org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor(ScalaReflection.scala:520)
  at org.apache.spark.sql.catalyst.ScalaReflection$.serializerFor(ScalaReflection.scala:463)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:71)
  at org.apache.spark.sql.SQLImplicits.newIntSequenceEncoder(SQLImplicits.scala:168)
  ... 48 elided
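
For what it's worth, the failure appears to come from the encoder
derivation choking on the Range.Inclusive element type rather than from
toDF itself, so converting the Range to a plain type first ought to
sidestep it. A minimal, untested sketch (the column names are just
illustrative):

scala> (0 to 8).toDF("n")                 // column of Ints via the Int encoder
scala> Seq((0 to 8).toArray).toDF("xs")   // one row holding an array, via the Array[Int] encoder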

Is this something I've introduced, a known issue, or a bug?

./bin/spark-shell --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0-SNAPSHOT
      /_/

Using Scala version 2.11.8, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_112
Branch master
Compiled by user jacek on 2017-01-09T05:01:47Z
Revision 19d9d4c855eab8f647a5ec66b079172de81221d0
Url https://github.com/apache/spark.git
Type --help for more information.

Regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
