Re: dataframe average error: Float does not take parameters

2015-10-21 Thread Carol McDonald
version 1.3.1

scala> auction.printSchema
root
 |-- auctionid: string (nullable = true)
 |-- bid: float (nullable = false)
 |-- bidtime: float (nullable = false)
 |-- bidder: string (nullable = true)
 |-- bidderrate: integer (nullable = true)
 |-- openbid: float (nullable = false)
 |-- pric...
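For reference, the printed tree corresponds to a StructType along these lines. This is a minimal sketch, not code from the thread; the column truncated at "pric" is left out because its full definition isn't shown:

import org.apache.spark.sql.types._

object AuctionSchemaSketch {
  // schema reconstructed from the printSchema output above; the truncated
  // last column is omitted
  val auctionSchema = StructType(Seq(
    StructField("auctionid",  StringType,  nullable = true),
    StructField("bid",        FloatType,   nullable = false),
    StructField("bidtime",    FloatType,   nullable = false),
    StructField("bidder",     StringType,  nullable = true),
    StructField("bidderrate", IntegerType, nullable = true),
    StructField("openbid",    FloatType,   nullable = false)
  ))

  def main(args: Array[String]): Unit =
    println(auctionSchema.treeString)   // prints the same "root |-- ..." tree
}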

Re: dataframe average error: Float does not take parameters

2015-10-21 Thread Ali Tajeldin EDU
Which version of Spark are you using? I just tried the example below on 1.5.1 and it seems to work as expected:

scala> val res = df.groupBy("key").count.agg(min("count"), avg("count"))
res: org.apache.spark.sql.DataFrame = [min(count): bigint, avg(count): double]

scala> res.show
+--+---...
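For anyone who wants to run that snippet outside the shell, here is a minimal self-contained sketch (the contents of df are made up). Note the explicit import of min and avg from org.apache.spark.sql.functions, which the shell session above presumably had in scope:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.functions.{avg, min}

object GroupByCountAggSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("groupby-count-agg").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // made-up stand-in for the df in the shell session above
    val df = sc.parallelize(Seq(("a", 1), ("a", 2), ("b", 3))).toDF("key", "value")

    // count rows per key, then take the min and average of those per-key counts
    val res = df.groupBy("key").count.agg(min("count"), avg("count"))
    res.show()   // expected columns: min(count): bigint, avg(count): double
  }
}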

dataframe average error: Float does not take parameters

2015-10-21 Thread Carol McDonald
This used to work:

// What's the min number of bids per item? what's the average? what's the max?
auction.groupBy("item", "auctionid").count.agg(min("count"), avg("count"), max("count")).show
// MIN(count)  AVG(count)          MAX(count)
// 1           16.992025518341308  75

but this now gives an error:

val...
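The snippet above is cut off before the failing line, so the exact trigger isn't shown. One plausible way to get this particular Scala compiler message is for a name like min, avg, max, or count to have been rebound to a plain Float value earlier in the shell session, so that an expression like avg("count") becomes an application on a Float. A standalone sketch of the working form, with made-up auction data and the aggregate functions imported explicitly:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
// explicit import of the aggregate functions; if these names are shadowed by a
// plain value (e.g. a Float) in a shell session, avg("count") turns into an
// application on that value and fails with "Float does not take parameters"
import org.apache.spark.sql.functions.{avg, max, min}

object AuctionBidStatsSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("auction-bid-stats").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // made-up stand-in for the real auction DataFrame
    val auction = sc.parallelize(Seq(
      ("xbox",    "a1", "alice"),
      ("xbox",    "a1", "bob"),
      ("cartier", "a2", "carol"),
      ("cartier", "a2", "dave"),
      ("cartier", "a2", "erin")
    )).toDF("item", "auctionid", "bidder")

    // min / average / max number of bids per (item, auctionid)
    auction.groupBy("item", "auctionid").count
      .agg(min("count"), avg("count"), max("count"))
      .show()
  }
}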