[
https://issues.apache.org/jira/browse/SPARK-15221?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15344519#comment-15344519
]
Rick Bross commented on SPARK-15221:
------------------------------------
Also note that this download was 1.6.1 with prebuilt support for Hadoop 2.6.
I'm trying it again with the download for Hadoop 1.x.
It's curious to me that spark-ec2 has an argument for specifying the Hadoop
version; is this required? It would seem the Hadoop version is already fixed
by the prebuilt package you download.
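For reference, the spark-ec2 argument in question is `--hadoop-major-version`. A launch invocation might look like the sketch below; the key pair, identity file, region, and cluster name are placeholders, not values from this thread:

```shell
# Hypothetical spark-ec2 launch. --hadoop-major-version tells the script
# which Hadoop build to install on the cluster nodes, independent of the
# Spark tarball downloaded locally (hence the apparent duplication).
./spark-ec2 \
  --key-pair=my-keypair \
  --identity-file=my-keypair.pem \
  --region=us-east-1 \
  --hadoop-major-version=2 \
  launch my-spark-cluster
```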
> error: not found: value sqlContext when starting Spark 1.6.1
> ------------------------------------------------------------
>
> Key: SPARK-15221
> URL: https://issues.apache.org/jira/browse/SPARK-15221
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.6.1
> Environment: Ubuntu 14.0.4, 8 GB RAM, 1 Processor
> Reporter: Vijay Parmar
> Priority: Blocker
> Labels: build, newbie
>
> When I start Spark (version 1.6.1), at the very end I am getting the
> following error message:
> <console>:16: error: not found: value sqlContext
>          import sqlContext.implicits._
>                 ^
> <console>:16: error: not found: value sqlContext
>          import sqlContext.sql
>                 ^
> I have gone through some content on the web about editing the ~/.bashrc file
> and including "SPARK_LOCAL_IP=127.0.0.1" under the SPARK variables.
> I also tried editing the /etc/hosts file with:
> $ sudo vi /etc/hosts
> ...
> 127.0.0.1 <HOSTNAME>
> ...
> but the issue still persists. Is it an issue with the build or something
> else?
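A workaround often suggested for this class of startup failure (not confirmed anywhere in this thread, and assuming Spark 1.6.x APIs): if the shell banner shows the sqlContext error but the SparkContext `sc` was created successfully, the SQLContext can be constructed by hand inside the shell:

```scala
// Inside spark-shell, after the failed startup banner.
// SQLContext is the Spark 1.6.x entry point for DataFrame/SQL work;
// constructing it manually recovers the imports the shell could not set up.
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._
import sqlContext.sql
```

If `sc` itself is missing, the underlying cause (often hostname resolution, hence the SPARK_LOCAL_IP and /etc/hosts suggestions above) still needs to be fixed first.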
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]