Hi there,

> I am having a problem with a query which may well have over 200,000
> records. I am building a website using PHP, and PHP is timing out after
> 30 secs due (I think) to the size of the call_data table. Is there any
> way I can improve the following query so that it is faster? I think I am
> using SQLite 2.8.14 (not sure).
>
> SELECT call_id, C.extn_no AS extn_no, dest, dest_name, call_time,
>        duration, cost, U.firstname AS firstname, U.surname AS surname
> FROM call_data AS C
> LEFT JOIN users AS U ON C.extn_no = U.extn_no
> WHERE 1 = '1'
>   AND julianday(call_time) >= julianday('2004-10-16 09:00:00')
>   AND julianday(call_time) <= julianday('2004-11-16 17:29:59')
>   AND direction = 'Out'
> ORDER BY cost DESC
> LIMIT 0,16;
Two things:

1) Why do you have the "1 = '1'" expression? It is just wasting processor
time.

2) You can do the Julian-date conversion yourself: convert expressions like
'2004-11-16 17:29:59' into whatever format call_time is stored in, and
compare the column against those values directly. That will allow an index
on call_time to be used (see the sketch below). Be aware, however, that
maintaining an index can be time-consuming if you have many records going
in and out of the table.

Right now my guess is that every record in the table has to be read just so
julianday() can be applied to it, so no index is used and the whole table
is scanned. This is just an uneducated guess, though -- I haven't checked
the EXPLAIN output.
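Here is a minimal sketch of point 2. It assumes call_time is stored as
ISO-8601 text ('YYYY-MM-DD HH:MM:SS'), which sorts the same way the dates
do, so the range test can compare the column as-is; the index name is only
illustrative.

  -- Index the column used in the range test (name is illustrative).
  CREATE INDEX idx_call_data_call_time ON call_data(call_time);

  -- Same query, but comparing call_time as stored, with no julianday()
  -- wrapped around the column, so the index can narrow the scan.
  SELECT call_id, C.extn_no AS extn_no, dest, dest_name, call_time,
         duration, cost, U.firstname AS firstname, U.surname AS surname
  FROM call_data AS C
  LEFT JOIN users AS U ON C.extn_no = U.extn_no
  WHERE call_time >= '2004-10-16 09:00:00'
    AND call_time <= '2004-11-16 17:29:59'
    AND direction = 'Out'
  ORDER BY cost DESC
  LIMIT 0,16;

Ulrik P.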