On 2016/03/06 1:39 PM, Paul Sanderson wrote:
> I understand this - but, there's always a but, I still would like to do
> something. Applying the limit anyway and then telling them the query
> has been limited might be a solution.
>
> Time is usually not an issue, but as the results are loaded into a grid
> for display, memory can be the issue.

Yes, you are quite right.

I don't really have a "best way" solution for you. I can tell you how 
I solved it, or perhaps I should say, how I made it work, in SQLitespeed 
(the jury's out on whether it is a solution at all). I add an automatic 
(and user-configurable) LIMIT clause to every standard table query, 
purely for speed, to avoid loading tables with millions of rows, but I 
display this limit to the user and they can adjust it. Once it is a 
user-defined query, I don't add any LIMIT clause; instead I cap the 
result set inside the GUI at 16 megabytes of output text, which shows 
however many rows fill up that many megabytes, usually quite a few more 
than can be read in any reasonable time-frame. If more results are 
needed than that, the suggestion is to write the query output to a file, 
since the GUI will get sluggish past that point. (Hovering over the 
query buttons highlights this too.)
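
In case it helps to see the shape of the first idea, here is a rough 
Python sketch (SQLitespeed itself isn't written in Python, and the 
function name is mine, not anything from the tool): append a LIMIT to 
the query, fetch one row more than the limit, and use that extra row to 
tell the user the result was truncated - essentially your "apply the 
limit anyway and then tell them" suggestion.

  import sqlite3

  def run_with_limit(conn, sql, params=(), limit=1000):
      # Naively appends a LIMIT clause; fine for simple SELECTs, but a
      # real tool would need to parse the SQL first (the statement may
      # already carry its own LIMIT, or be a compound query).
      cur = conn.execute(sql + " LIMIT ?", tuple(params) + (limit + 1,))
      rows = cur.fetchall()
      truncated = len(rows) > limit   # the extra row is just a probe
      return rows[:limit], truncated

  conn = sqlite3.connect(":memory:")
  conn.execute("CREATE TABLE t(x)")
  conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(50)])
  rows, truncated = run_with_limit(conn, "SELECT x FROM t", limit=10)
  if truncated:
      print("Showing first %d rows only - the query was limited." % len(rows))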

You can download SQLitespeed[1] for Windows easily and test these 
assumptions; play with it for a few minutes to see whether the idea 
works for you, or perhaps work out a better one (which I won't mind 
hearing about) :)

As a note, the vast majority of user database queries seem to produce 
less than 16MB of query output for any query a user is interested in 
inspecting (i.e. might want to look at in a GUI). Larger result sets 
usually only get used inside some system as a dataset - and 16MB is 
really easy to manage, even for a very basic GUI.
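
For what it's worth, the byte cap itself can be as simple as counting 
rendered text while fetching. A minimal sketch in the same vein (the 
function name and the tab-separated rendering are my assumptions here, 
not how SQLitespeed actually measures output):

  def fetch_up_to_bytes(cursor, max_bytes=16 * 1024 * 1024):
      # Accumulate rows until the rendered text would exceed max_bytes.
      rows, size = [], 0
      for row in cursor:
          line = "\t".join(str(v) for v in row) + "\n"
          size += len(line.encode("utf-8"))
          if size > max_bytes:
              return rows, True    # True: output clipped at the byte cap
          rows.append(row)
      return rows, False

The nice property of a byte cap over a row cap is that it behaves the 
same whether the rows are narrow or very wide.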

[1] http://www.sqlc.rifin.co.za/


>
> I'll give it some more thought, but thank you for your answers; they have helped.
> Paul

Best of luck!
Ryan
