hour becomes very simple:

select 3600*floor(timestamp/3600) as timestamp, count(error) as errors
from logs
group by 3600*floor(timestamp/3600)
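For reference, a minimal sketch of running that query from spark-shell; the SQLContext setup, the already-registered "logs" table, and the epoch-seconds timestamp column are assumptions, not details confirmed in this thread:

// Sketch only: assumes spark-shell provides `sc`, and that the log data has
// already been registered as a "logs" table with a numeric `timestamp`
// column (epoch seconds) and an `error` column.
// (If the plain SQL parser rejects floor, a HiveContext can be used instead.)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val hourly = sqlContext.sql("""
  select 3600*floor(timestamp/3600) as timestamp, count(error) as errors
  from logs
  group by 3600*floor(timestamp/3600)
""")
hourly.collect().foreach(println)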
Hope this helps./Sim
Hi,
You need to import Sum and Count, like:

import org.apache.spark.sql.catalyst.expressions.{Sum, Count}  // or import the whole package with a wildcard _

Or, if you are using a current master branch build, you can use sum('colB)
instead of Sum('colB).
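For example, a minimal sketch assuming a Spark 1.0.x spark-shell (the table name "Table" and the colA/colB columns are taken from the snippet quoted below):

// Sketch only: assumes `sc` from spark-shell; sql() and the 'symbol DSL
// conversions come into scope via import sqlContext._
import org.apache.spark.sql.catalyst.expressions.{Sum, Count}
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext._

val queryResult = sql("select * from Table")
// With the catalyst import in scope, Sum resolves and the aggregation compiles:
val totals = queryResult.groupBy('colA)('colA, Sum('colB) as 'totB)
totals.collect().foreach(println)
// On a current master build, the lowercase DSL form works instead:
// val totals = queryResult.groupBy('colA)('colA, sum('colB) as 'totB)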
Thanks.
2014-07-03 16:09 GMT+09:00 Subacini B :
Hi,
Can someone provide me pointers for this issue?
Thanks
Subacini
On Wed, Jul 2, 2014 at 3:34 PM, Subacini B wrote:
Hi,
The code below throws a compilation error, "not found: value Sum". Can
someone help me with this? Do I need to add any jars or imports? Even for
Count, the same error is thrown.
val queryResult = sql("select * from Table")
queryResult.groupBy('colA)('colA, Sum('colB) as 'totB).aggregate(Sum