No, you’re right, that’s exactly what I’m doing right now. The choice would have 
been *either* Parquet *or* a database.

What’s unfortunate is that apparently this only works with Play Framework 2.2, 
not 2.3, because of the incompatible Akka versions.
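For reference, the approach Yana suggests below — embedding a Spark context in the REST service and using SparkSQL to query the Parquet files directly — could look roughly like this. This is only a sketch against the Spark 1.x-era API (`parquetFile` was later deprecated in favor of `read.parquet`); the path, table name, and object name are illustrative, not from any real setup:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical helper the REST layer could call; names are illustrative.
object DashboardQueries {
  // One long-lived context for the whole service — SparkContexts are
  // heavyweight and only one should exist per JVM.
  lazy val sc = new SparkContext(
    new SparkConf().setAppName("dashboard").setMaster("local[*]"))
  lazy val sqlContext = new SQLContext(sc)

  // Build the SQL for a simple "top N rows" dashboard query.
  def topQuery(table: String, limit: Int): String =
    s"SELECT * FROM $table LIMIT $limit"

  // Read the Parquet output that is already on disk, register it as a
  // temp table, and run SQL over it — no separate database involved.
  def topEntries(parquetPath: String, limit: Int) = {
    val data = sqlContext.parquetFile(parquetPath)
    data.registerTempTable("events")
    sqlContext.sql(topQuery("events", limit)).collect()
  }
}
```

A Play controller would then call something like `DashboardQueries.topEntries(...)`, serialize the resulting rows to JSON, and return them to the AJAX dashboard.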

On 16.09.2014, at 16:37, Yana Kadiyska <yana.kadiy...@gmail.com> wrote:

> If your dashboard is doing AJAX/pull requests against, say, a REST API, you can 
> always create a Spark context in your REST service and use SparkSQL to query 
> over the Parquet files. The Parquet files are already on disk, so it seems 
> silly to write both to Parquet and to a DB... unless I'm missing something in 
> your setup.
> 
> On Tue, Sep 16, 2014 at 4:18 AM, Marius Soutier <mps....@gmail.com> wrote:
> Writing to Parquet and querying the result via SparkSQL works great (except 
> for some strange SQL parser errors). However, the problem remains: how do I 
> get that data back to a dashboard? So I guess I’ll have to use a database 
> after all.
> 
> 
> You can batch up data and store it into Parquet partitions as well, and query 
> it using another SparkSQL shell; the JDBC driver in SparkSQL is part of 1.1, I believe. 
> 
