milq created SPARK-4208:
---------------------------

             Summary: Stack overflow error when using sqlContext.sql
                 Key: SPARK-4208
                 URL: https://issues.apache.org/jira/browse/SPARK-4208
             Project: Spark
          Issue Type: Bug
          Components: Spark Core, SQL
    Affects Versions: 1.1.0
         Environment: Windows 7, prebuilt spark-1.1.0-bin-hadoop2.3
            Reporter: milq


A java.lang.StackOverflowError is thrown when running a query through sqlContext.sql:

14/11/03 18:54:43 INFO BlockManager: Removing block broadcast_1
14/11/03 18:54:43 INFO MemoryStore: Block broadcast_1 of size 2976 dropped from memory (free 28010260
14/11/03 18:54:43 INFO ContextCleaner: Cleaned broadcast 1
root
 |--  firstName : string (nullable = true)
 |-- lastNameX: string (nullable = true)

Exception in thread "main" java.lang.StackOverflowError
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
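
The failing query itself is not shown in the report. For reference, here is a minimal sketch against the Spark 1.1.0 API of the kind of program that produces this class of failure; the table name, data, and the long OR chain are assumptions, chosen because a very long predicate list is one known way to push the combinator-based SqlParser (the Parsers frames above) past the default JVM thread stack:

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql._

object SqlStackOverflowRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("SPARK-4208-repro").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    // Hypothetical two-column table matching the schema printed above.
    val schema = StructType(Seq(
      StructField("firstName", StringType, nullable = true),
      StructField("lastNameX", StringType, nullable = true)))
    val rows = sc.parallelize(Seq(Row("John", "Doe")))
    val people = sqlContext.applySchema(rows, schema)
    people.registerTempTable("people")
    people.printSchema()

    // A long chain of OR predicates forces deep recursion in the
    // parser-combinator based SqlParser at parse time, before any
    // data is touched; 2000 terms is an arbitrary large number.
    val predicate = (1 to 2000).map(i => s"firstName = 'n$i'").mkString(" OR ")
    sqlContext
      .sql(s"SELECT firstName, lastNameX FROM people WHERE $predicate")
      .collect()
      .foreach(println)
  }
}
{code}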

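One possible stopgap, again an assumption rather than anything verified in this report, is to run the offending sqlContext.sql call on a thread created with an explicitly larger stack, since the JVM's Thread constructor accepts a stackSize argument; raising the whole driver stack with the JVM's -Xss option is the blunter equivalent. Reusing the names from the sketch above:

{code:scala}
// Possible stopgap, reusing sqlContext and predicate from the sketch above:
// a per-thread stack size gives the parser more recursion headroom than the
// JVM default (typically well under 1 MB per thread).
val worker = new Thread(
  null, // default thread group
  new Runnable {
    override def run(): Unit = {
      sqlContext
        .sql(s"SELECT firstName, lastNameX FROM people WHERE $predicate")
        .collect()
        .foreach(println)
    }
  },
  "sql-with-big-stack",
  64L * 1024 * 1024) // 64 MB stack; a generous, arbitrary guess
worker.start()
worker.join()
{code}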

