Rakesh Chalasani created SPARK-13253:
----------------------------------------

             Summary: Error aliasing array columns.
                 Key: SPARK-13253
                 URL: https://issues.apache.org/jira/browse/SPARK-13253
             Project: Spark
          Issue Type: Bug
          Components: SQL
            Reporter: Rakesh Chalasani


Aliasing an array column throws an "UnsupportedOperationException".

The problem appears to be in "toString" on Column: the "CreateArray"
expression's dataType checks the nullability of its children, while aliasing
creates a "PrettyAttribute" that does not implement "nullable", so the check
throws.
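
For clarity, below is a minimal, self-contained sketch of the failing pattern.
"PrettyAttr" and "MakeArray" are hypothetical stand-ins for Catalyst's
PrettyAttribute and CreateArray (namedExpressions.scala and
complexTypeCreator.scala in the trace below); the real classes differ, but the
interaction is the same: a display-only attribute leaves "nullable"
unimplemented, and the array expression's dataType calls it on every child.
{code}
// Hypothetical stand-ins -- not the actual Spark source.
trait Expr {
  def nullable: Boolean
}

// Analogue of PrettyAttribute: built only for pretty-printing,
// so nullable is deliberately left unimplemented.
case class PrettyAttr(name: String) extends Expr {
  def nullable: Boolean = throw new UnsupportedOperationException
}

// Analogue of CreateArray: dataType needs containsNull, which is computed
// as children.exists(_.nullable) and therefore calls PrettyAttr.nullable.
case class MakeArray(children: Seq[Expr]) extends Expr {
  def nullable: Boolean = false
  def containsNull: Boolean = children.exists(_.nullable) // throws here
}

object Repro extends App {
  MakeArray(Seq(PrettyAttr("a"), PrettyAttr("b"))).containsNull
}
{code}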

Code to reproduce the error:
{code}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.functions

// "sc" is the SparkContext provided by spark-shell
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

case class Test(a: Int, b: Int)
val data = sc.parallelize(Array.range(0, 10).map(x => Test(x, x + 1)))
val df = data.toDF()

// The REPL's auto-print of the resulting Column triggers the exception
val arrayCol = functions.array(df("a"), df("b")).as("arrayCol")
{code}
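
Per the stack trace, constructing the alias itself succeeds; the exception
originates in Column.toString, which the REPL invokes automatically when
echoing the result. A hedged sketch of triggering it explicitly outside the
shell:
{code}
// The aliased Column is built lazily without error; only rendering it as a
// string fails (Column.toString -> Alias.toString -> CreateArray.dataType ->
// PrettyAttribute.nullable, per the trace below).
val arrayCol = functions.array(df("a"), df("b")).as("arrayCol")
arrayCol.toString  // java.lang.UnsupportedOperationException
{code}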

Error message:
{code}
java.lang.UnsupportedOperationException
        at org.apache.spark.sql.catalyst.expressions.PrettyAttribute.nullable(namedExpressions.scala:289)
        at org.apache.spark.sql.catalyst.expressions.CreateArray$$anonfun$dataType$3.apply(complexTypeCreator.scala:40)
        at org.apache.spark.sql.catalyst.expressions.CreateArray$$anonfun$dataType$3.apply(complexTypeCreator.scala:40)
        at scala.collection.IndexedSeqOptimized$$anonfun$exists$1.apply(IndexedSeqOptimized.scala:40)
        at scala.collection.IndexedSeqOptimized$$anonfun$exists$1.apply(IndexedSeqOptimized.scala:40)
        at scala.collection.IndexedSeqOptimized$class.segmentLength(IndexedSeqOptimized.scala:189)
        at scala.collection.mutable.ArrayBuffer.segmentLength(ArrayBuffer.scala:47)
        at scala.collection.GenSeqLike$class.prefixLength(GenSeqLike.scala:92)
        at scala.collection.AbstractSeq.prefixLength(Seq.scala:40)
        at scala.collection.IndexedSeqOptimized$class.exists(IndexedSeqOptimized.scala:40)
        at scala.collection.mutable.ArrayBuffer.exists(ArrayBuffer.scala:47)
        at org.apache.spark.sql.catalyst.expressions.CreateArray.dataType(complexTypeCreator.scala:40)
        at org.apache.spark.sql.catalyst.expressions.Alias.dataType(namedExpressions.scala:136)
        at org.apache.spark.sql.catalyst.expressions.NamedExpression$class.typeSuffix(namedExpressions.scala:84)
        at org.apache.spark.sql.catalyst.expressions.Alias.typeSuffix(namedExpressions.scala:120)
        at org.apache.spark.sql.catalyst.expressions.Alias.toString(namedExpressions.scala:155)
        at org.apache.spark.sql.catalyst.expressions.Expression.prettyString(Expression.scala:207)
        at org.apache.spark.sql.Column.toString(Column.scala:138)
        at java.lang.String.valueOf(String.java:2994)
        at scala.runtime.ScalaRunTime$.stringOf(ScalaRunTime.scala:331)
        at scala.runtime.ScalaRunTime$.replStringOf(ScalaRunTime.scala:337)
        at .<init>(<console>:20)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
{code}