On Mon, Jan 2, 2017 at 5:23 AM, Job wrote:
> Hello guys, and a very good new year to everybody!
>
> We are now running some queries and statistics on a very big table (about
> 180 million records).
> The table is partitioned by day (about ~3 GB of data for every
> partition/day).
Hi,

For BI queries on large datasets it is better to use data warehouse software
with columnar storage (ClickHouse, Greenplum, etc.), but offloading the data
can be a complicated task. It is also possible to tune Postgres and the
environment it runs in so that queries such as yours run faster.
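As a rough illustration of the kind of tuning meant here, a few postgresql.conf settings that often matter for large scans on 9.6 (the values below are assumptions for a machine with ~32 GB of RAM, not recommendations):

```
shared_buffers = 8GB                 # ~25% of RAM is a common starting point
work_mem = 256MB                     # per sort/hash node; lower it if many queries run concurrently
effective_cache_size = 24GB          # planner hint about the OS page cache; changes plans only
max_parallel_workers_per_gather = 4  # 9.6 parallel sequential scans and aggregates
constraint_exclusion = partition     # prune inheritance partitions at plan time
```

Any change like this should be validated with EXPLAIN (ANALYZE, BUFFERS) on the actual queries before and after.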
Hi,

I am trying materialized views in PostgreSQL 9.6.1.

Regarding Andy Calson's suggestion:
>> I do a very similar thing, log all my webstats to PG, but querying millions of
>> rows is always going to be slow. I use a summary table.

They seem to work fine.
One question: the materialized view is
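For reference, a minimal sketch of the summary-table idea done with a materialized view; the table and column names (`weblog`, `ts`) are assumptions, not from the thread:

```sql
-- Daily summary of a hypothetical weblog table.
CREATE MATERIALIZED VIEW weblog_daily_summary AS
SELECT date_trunc('day', ts) AS day,
       count(*)              AS hits
FROM weblog
GROUP BY 1;

-- A unique index is required so the view can be refreshed CONCURRENTLY,
-- i.e. without blocking readers (available since PostgreSQL 9.4).
CREATE UNIQUE INDEX ON weblog_daily_summary (day);

REFRESH MATERIALIZED VIEW CONCURRENTLY weblog_daily_summary;
```

The refresh has to be scheduled (e.g. from cron), since a materialized view does not update itself when the base table changes.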
On 01/02/2017 05:23 AM, Job wrote:
> Hello guys, and a very good new year to everybody!
>
> We are now running some queries and statistics on a very big table (about
> 180 million records).
> The table is partitioned by day (about ~3 GB of data for every partition/day).
> We use PostgreSQL 9.6.1.
> I am experiencing quite important
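Worth noting for this setup: PostgreSQL 9.6 predates declarative partitioning (added in version 10), so a table "partitioned by day" is typically built with inheritance. A minimal sketch, with table names assumed for illustration:

```sql
-- Parent table; queries go through it.
CREATE TABLE weblog (
    ts  timestamptz NOT NULL,
    url text
);

-- One child table per day; the CHECK constraint lets the planner skip
-- the partition entirely when constraint_exclusion is enabled.
CREATE TABLE weblog_2017_01_02 (
    CHECK (ts >= '2017-01-02' AND ts < '2017-01-03')
) INHERITS (weblog);

-- With constraint_exclusion = partition, a query such as
--   SELECT count(*) FROM weblog
--   WHERE ts >= '2017-01-02' AND ts < '2017-01-03';
-- only scans weblog_2017_01_02.
```

For pruning to work, the WHERE clause must match the CHECK constraints with constant expressions the planner can evaluate.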