One way to do this is to have a regular job that uses a SQL Server
Integration Services (SSIS) package to extract data from the source system,
transform and cleanse it, and then load it into a destination database (ETL).
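As a minimal sketch of the extract/transform/load step (using Python's
built-in sqlite3 in place of SSIS and SQL Server purely for brevity; the
table and column names are invented for illustration):

```python
import sqlite3

# Hypothetical source system; sqlite3 stands in for SQL Server here.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, "  Acme ", 10.0), (2, "Beta", None)])

# Hypothetical destination (reporting) database.
dest = sqlite3.connect(":memory:")
dest.execute("CREATE TABLE orders_clean (id INTEGER, customer TEXT, amount REAL)")

# Extract
rows = src.execute("SELECT id, customer, amount FROM orders").fetchall()

# Transform/cleanse: trim customer names, default missing amounts to 0
clean = [(i, c.strip(), a if a is not None else 0.0) for i, c, a in rows]

# Load
dest.executemany("INSERT INTO orders_clean VALUES (?, ?, ?)", clean)
dest.commit()
```

In a real SSIS package the extract, transform, and load would be data-flow
components rather than hand-written code, but the shape of the job is the same.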

Often, but not always, the destination database has a star schema with fact
and dimension tables that a data warehouse can be built upon. Tools like
SQL Server Analysis Services or Excel/Power BI can then understand and slice
the data for the user interface.
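To make the star-schema shape concrete, here is a tiny illustrative example
(again using sqlite3 rather than SQL Server, with made-up table names): the
fact table holds the measures plus foreign keys, and the dimensions hold
descriptive attributes you slice by.

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Dimension tables: descriptive attributes, one row per member
con.execute("CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT)")
con.execute("CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER)")

# Fact table: numeric measures plus keys pointing at the dimensions
con.execute("CREATE TABLE fact_sales (customer_key INTEGER, date_key INTEGER, amount REAL)")

con.execute("INSERT INTO dim_customer VALUES (1, 'Acme', 'North')")
con.execute("INSERT INTO dim_date VALUES (20190101, 2019, 1)")
con.execute("INSERT INTO fact_sales VALUES (1, 20190101, 99.5)")

# A typical report query: join facts to dimensions and aggregate
report = con.execute("""
    SELECT c.region, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date d     ON d.date_key = f.date_key
    GROUP BY c.region, d.year
""").fetchall()
```

Because the joins are simple key lookups and the fact table is narrow, this
kind of query stays fast even at large row counts, which is exactly what the
normalised schema struggles with.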

For large volumes of data there's usually an initial load; subsequent runs
then capture any changed data from the source system and load it
incrementally.
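One common way to capture changes (not the only one; SQL Server also offers
Change Tracking and Change Data Capture) is a high-watermark on a
last-modified column. A rough sketch, with invented table names and sqlite3
standing in for SQL Server:

```python
import sqlite3

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, modified INTEGER)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 10.0, 100), (2, 20.0, 150), (3, 30.0, 200)])

dest = sqlite3.connect(":memory:")
dest.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, modified INTEGER)")

def incremental_load(watermark):
    """Copy only rows changed since the last run; return the new watermark."""
    changed = src.execute(
        "SELECT id, amount, modified FROM orders WHERE modified > ?",
        (watermark,)).fetchall()
    # Upsert so re-runs and updated rows are handled idempotently
    dest.executemany(
        "INSERT INTO orders VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET amount = excluded.amount, "
        "modified = excluded.modified",
        changed)
    dest.commit()
    return max((m for _, _, m in changed), default=watermark)

wm = incremental_load(0)    # initial load: everything
src.execute("UPDATE orders SET amount = 25.0, modified = 250 WHERE id = 2")
wm = incremental_load(wm)   # subsequent run picks up only the changed row
```

The watermark itself would normally be persisted in a control table between
runs rather than held in a variable.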


On Fri, 1 Feb 2019, 08:27 Tom P <[email protected]> wrote:

> Hi Folks
>
> I have a normalised database for an application that is working well but
> is not performant for reporting. I’ve tried and tried to optimise my
> reporting queries but they are just too slow as the data volume is large in
> the database. What techniques are there to tackle this?
>
> I was thinking of creating denormalised tables for the reports which would
> work but how to automate this so that whenever the main normalised tables
> are updated then the changes “flow into” the denormalised reporting tables
> and stay in synch?
>
> I’m sure this is not a new problem so surely some of the people here have
> been in this situation. Any advice would be appreciated.
>
> Cheers
> Tom
> --
> Thanks
> Tom
>
