For things like this, I find that if I can't filter the data down to the
100 rows via criteria, I instead use a stored procedure to get my
primary keys.  Then, if I am doing a prev/next kind of thing, I just save my
results and pick whichever set of 100 I want, and use Torque to get
that data.  This way I can do all my joins etc. in a nice fast stored proc,
and then just have Torque load based on primary key, as in the sketch below.
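Roughly, the approach looks like this (a sketch only: the stored procedure
find_matching_ids and the generated peer class MyTablePeer are made-up names
for illustration; retrieveByPKs() is the method Torque generates on peers for
loading objects by key):

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.util.ArrayList;
    import java.util.List;

    import org.apache.torque.Torque;
    import org.apache.torque.om.NumberKey;

    public class PkPager
    {
        // Step 1: let a fast stored proc do the joins and hand back
        // only the primary keys of the matching rows.
        public static List fetchMatchingPks() throws Exception
        {
            List pks = new ArrayList();
            Connection con = Torque.getConnection();
            try
            {
                CallableStatement stmt =
                    con.prepareCall("{call find_matching_ids()}");
                ResultSet rs = stmt.executeQuery();
                while (rs.next())
                {
                    pks.add(new NumberKey(rs.getString(1)));
                }
                rs.close();
                stmt.close();
            }
            finally
            {
                Torque.closeConnection(con);
            }
            return pks;
        }

        // Step 2: keep the key list around (e.g. in the session) and
        // only turn the 100 keys of the requested page into objects.
        public static List loadPage(List pks, int page, int pageSize)
            throws Exception
        {
            int from = page * pageSize;
            int to = Math.min(from + pageSize, pks.size());
            // MyTablePeer is the Torque-generated peer for your table
            // (placeholder name here).
            return MyTablePeer.retrieveByPKs(pks.subList(from, to));
        }
    }

The key list is cheap to keep around, since it is just keys rather than full
objects, so paging back and forth never re-runs the expensive joins.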

Eric

-----Original Message-----
From: Rooms, Christoph [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, June 26, 2002 10:01 AM
To: 'Turbine Torque Users List'
Subject: Large list of data (Best Practice)


Hi,

What would be the best way to handle a table with, for example, 35000 rows?

I noticed you can set a limit on the rows you want to get back, but this is
handled at the level of Torque ... so Torque gets all 35000 rows and then
only turns 100 into objects ...
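(For reference, setting that limit looks roughly like this; MyTablePeer
stands in for the generated peer of the 35000-row table, and whether the
limit actually makes it into the generated SQL depends on the database
adapter:)

    import java.util.List;
    import org.apache.torque.util.Criteria;

    Criteria crit = new Criteria();
    crit.setLimit(100);   // at most 100 rows
    crit.setOffset(0);    // starting from the first row
    List page = MyTablePeer.doSelect(crit);  // placeholder peer name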

This still takes something like 30 MB of my memory ... which is a lot :(

Any best practices?

Thanks, Christoph
