Eugene,
The example I gave you was in Python. I used it on my end and it works fine.
Sorry, I don't know Scala.
Thanks
On Tuesday, December 29, 2015 5:24 AM, Eugene Morozov
wrote:
Annabel,
That might work in Scala, but I use Java. Three quotes just don't compile =)
If your example is in Scala, then, I believe, semicolon is not required.
--
Be well!
Jean Morozov
On Mon, Dec 28, 2015 at 8:49 PM, Annabel Melongo
wrote:
Jean,
Try this:

df.select("""select * from tmptable where x1 = '3.0'""").show();

Note: you have to use 3 double quotes as marked.
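Annabel's Python example itself isn't shown above, but the quoting trick is easy to demonstrate with stdlib sqlite3 standing in for Spark SQL (the table name and data here are invented for illustration): a triple-quoted string lets the single quotes around '3.0' pass through without any escaping.

```python
import sqlite3

# Throwaway in-memory table; names and values are made up for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tmptable (x1 TEXT)")
conn.executemany("INSERT INTO tmptable VALUES (?)", [("3.0",), ("4.0",)])

# Triple quotes mean the embedded single quotes need no escaping.
query = """select * from tmptable where x1 = '3.0'"""
rows = conn.execute(query).fetchall()
print(rows)  # [('3.0',)]
```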
On Friday, December 25, 2015 11:30 AM, Eugene Morozov
wrote:
Thanks for the comments, although the issue is not in limit()
Shouldn't df.select just take the column names, and sqlC.sql the select statement?
Therefore perhaps we could use df.select("COLUMN1, COLUMN2") and
sqlC.sql("select COLUMN1, COLUMN2 from tablename").
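That split can be sketched outside Spark with stdlib sqlite3 standing in for the SQL engine (table and column names invented): a projection is just a list of columns, while the sql() entry point takes a complete statement.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tablename (COLUMN1 INTEGER, COLUMN2 TEXT, COLUMN3 TEXT)")
conn.execute("INSERT INTO tablename VALUES (1, 'a', 'extra')")

# Analogue of df.select("COLUMN1", "COLUMN2"): the caller supplies only a
# projection, and the engine assembles the full statement around it.
projection = ["COLUMN1", "COLUMN2"]
rows = conn.execute("SELECT " + ", ".join(projection) + " FROM tablename").fetchall()

# Analogue of sqlC.sql(...): the caller supplies the entire statement.
same_rows = conn.execute("select COLUMN1, COLUMN2 from tablename").fetchall()
print(rows == same_rows)  # True
```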
Why would someone want to do a select on a dataframe after registering it
as a table?
Chris, thanks. That'd be great to try =)
--
Be well!
Jean Morozov
On Fri, Dec 25, 2015 at 10:50 PM, Chris Fregly wrote:
> oh, and it's worth noting that - starting with Spark 1.6 - you'll be able
> to just do the following:
>
> SELECT * FROM json.`/path/to/json/file`
>
>
Hello, I'm basically stuck as I have no idea where to look.
The following simple code, given that my Datasource is working, gives me an
exception.
DataFrame df = sqlc.load(filename, "com.epam.parso.spark.ds.DefaultSource");
df.cache();
df.printSchema(); <-- prints the schema perfectly fine!
sqlContext.sql("select * from table limit 5").show(); (not sure if limit 5 is supported)
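For what it's worth, LIMIT is ordinary SQL and Spark SQL does accept it. A minimal stdlib sqlite3 sketch of the same clause (table name invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(10)])

# LIMIT caps the number of returned rows, like df.limit(5) caps a DataFrame.
rows = conn.execute("select * from t limit 5").fetchall()
print(len(rows))  # 5
```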
or use Dmitriy's solution. select() defines your projection, but you've
specified an entire query.
On 25 December 2015 at 15:42, Василец Дмитрий
wrote:
> hello
> you can try to use
Ted, Igor,
Oh my... thanks a lot to both of you!
Igor was absolutely right, but I missed that I have to use sqlContext =(
Everything's perfect.
Thank you.
--
Be well!
Jean Morozov
On Fri, Dec 25, 2015 at 8:31 PM, Ted Yu wrote:
> DataFrame uses different syntax from SQL
I assume by "The same code perfectly works through Zeppelin 0.5.5" that
you're using the %sql interpreter with your regular SQL SELECT statement,
correct?
If so, the Zeppelin interpreter is converting the SQL statement that
follows
%sql
into
sqlContext.sql()
per the following code:
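The Zeppelin code being referred to was not included above, but the transformation is simple to picture. Here is a toy Python sketch, emphatically not Zeppelin's real implementation, of rewriting a %sql paragraph into a sqlContext.sql() call string:

```python
def rewrite_sql_paragraph(paragraph: str) -> str:
    """Toy model of the %sql interpreter: wrap the SQL body in sqlContext.sql()."""
    assert paragraph.startswith("%sql")
    body = paragraph[len("%sql"):].strip()
    return 'sqlContext.sql("{}")'.format(body)

call = rewrite_sql_paragraph("%sql\nselect * from table limit 5")
print(call)  # sqlContext.sql("select * from table limit 5")
```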
Thanks for the comments, although the issue is not in the limit() predicate.
It's something with Spark being unable to resolve the expression.
I can do something like this, and it works as it's supposed to:
df.select(df.col("*")).where(df.col("x1").equalTo(3.0)).show(5);
But I think old fashioned sql style
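The "old fashioned" SQL equivalent of that working call is just a WHERE clause. A stdlib sqlite3 sketch with an invented table shows the same filter:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE df (x1 REAL)")
conn.executemany("INSERT INTO df VALUES (?)", [(3.0,), (4.0,)])

# Plain-SQL equivalent of .where(df.col("x1").equalTo(3.0)).
rows = conn.execute("select * from df where x1 = 3.0").fetchall()
print(rows)  # [(3.0,)]
```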
hello
you can try to use df.limit(5).show()
just trick :)
On Fri, Dec 25, 2015 at 2:34 PM, Eugene Morozov
wrote:
> Hello, I'm basically stuck as I have no idea where to look;
>
> Following simple code, given that my Datasource is working gives me an
> exception.
>
>
DataFrame uses different syntax from SQL query.
I searched unit tests but didn't find any in the form of df.select("select
...")
Looks like you should use sqlContext as other people suggested.
On Fri, Dec 25, 2015 at 8:29 AM, Eugene Morozov
wrote:
> Thanks for the
oh, and it's worth noting that - starting with Spark 1.6 - you'll be able
to just do the following:
SELECT * FROM json.`/path/to/json/file`
(note the back ticks)
instead of calling registerTempTable() for the sole purpose of using SQL.
https://issues.apache.org/jira/browse/SPARK-11197