On Monday, 22 February 2016 at 03:39:57 UTC, Stefan Koch wrote:
On Monday, 22 February 2016 at 03:33:27 UTC, Chris Wright wrote:
On Sun, 21 Feb 2016 21:15:01 +0000, Stefan Koch wrote:
On Sunday, 21 February 2016 at 19:55:38 UTC, Any wrote:
On Sunday, 21 February 2016 at 19:21:31 UTC, Stefan Koch wrote:
where n is the number of rows.
That means you're doing a full table scan. When the table gets
large enough, this gets problematic.
When the table gets large enough, you probably don't worry
about CTFE'ing it anymore.
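(For anyone unfamiliar with the point being made here, a minimal sketch of what a compile-time full table scan could look like in D. This is not Stefan's actual design; the Row type, the data, and findNameById are invented for illustration. The table is a plain array, the lookup is a linear scan over every row, hence O(n), and CTFE is forced by assigning the result to an enum.)

struct Row
{
    int id;
    string name;
}

// Hypothetical table literal; schema and data are illustrative only.
enum Row[] table = [Row(1, "alpha"), Row(2, "beta"), Row(3, "gamma")];

// Linear scan: visits rows one by one, so cost grows with the row count.
string findNameById(const Row[] rows, int id)
{
    foreach (row; rows)
        if (row.id == id)
            return row.name;
    return null;
}

// Assigning to an enum forces the call to run at compile time (CTFE).
enum result = findNameById(table, 2);
static assert(result == "beta");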
So you intend this to work *only* at compile time? Or would
you supply a different API for querying at runtime?
I intend to have a D-style, SQLite-compatible database solution.
However, I will not reinvent MySQL or Postgres. Performance is
important to me. Scaling to bigger data, however, is secondary at
this point.
Great project, Stefan. Any idea what kind of maximum database
size will be feasible? I realise it is early days so far and
not your main focus.
Laeeth