Kevin Grittner wrote:
> nair rajiv wrote:
>> I found there is a table which will approximately have 5 crore
>> entries after data harvesting.
>> Is it advisable to keep so much data in one table ?
> That's 50,000,000 rows, right?

You should remember that words like lac and crore are not English words, …
nair rajiv wrote:
> I found there is a table which will approximately have 5 crore
> entries after data harvesting.
> Is it advisable to keep so much data in one table ?

That's 50,000,000 rows, right? At this site, you're looking at a
non-partitioned table with more than seven times that if y…
Hello,

I am working on a project that will take out structured content from
wikipedia and put it in our database. Before putting the data into the
database I wrote a script to find out the number of rows every table
would be having after the data is in, and I found there is a table
which…
Scott Carey:
> > (2) The tests:
> >
> > Note: The standard speed was about 800MB/40s, so 20MB/s.
> >
> > a)
> > What I changed: fsync=off
> > Result: 35s, so 5s faster.
> >
> > b) like a) but:
> > checkpoint_segments=128 (was 3)
> > autovacuum=off
> >
> > Result: 35s (no change...?!)
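For reference, the knobs being toggled in the tests above all live in postgresql.conf. A minimal sketch of a throwaway bulk-load profile, using the values mentioned in the tests (everything not shown is assumed to stay at its default; fsync=off is only safe for data you can reload from scratch):

```ini
# Illustrative bulk-load settings, matching the tests quoted above.
# fsync=off trades crash safety for speed -- throwaway loads only.
fsync = off
# Fewer, larger checkpoints during the load (the default was 3 segments).
checkpoint_segments = 128
# Skip autovacuum while loading; run VACUUM ANALYZE manually afterwards.
autovacuum = off
```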
On Mon, 25 Jan 2010, A. Kretschmer wrote:
> In response to ramasubramanian :
>
> Please, create a new mail for a new topic and don't hijack other
> threads.

Even more so - this probably isn't the right mailing list for generic sql
help questions.

> > select ENAME,ORIG_SALARY from employee where (ename…
On Mon, 25 Jan 2010, Richard Huxton wrote:
> OK - so the first query processes 19,799 rows in 31,219 ms (about 1.6ms
> per row)
>
> The second processes 2,606 rows in 3,813 ms (about 1.5ms per row).

Agreed. One query is faster than the other because it has to do an eighth
the amount of work.

Matthew
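The per-row figures and the "an eighth the amount of work" claim can be checked with plain arithmetic on the numbers quoted above:

```python
# Timings quoted in the thread: (rows processed, total ms).
slow = (19_799, 31_219)
fast = (2_606, 3_813)

per_row_slow = slow[1] / slow[0]   # ms per row for the first query
per_row_fast = fast[1] / fast[0]   # ms per row for the second query
work_ratio = slow[0] / fast[0]     # how many times more rows the slow one touches

print(round(per_row_slow, 2), round(per_row_fast, 2), round(work_ratio, 1))
# -> 1.58 1.46 7.6, i.e. roughly the same per-row cost, about an eighth the rows
```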
On 22/01/10 19:06, Tory M Blue wrote:
> Here is the explain plan for the query. Actual rows that the query
> returns is 6369

Actually, it processes 19,799 rows (see the actual rows= below).

SLOW
"  ->  Bitmap Heap Scan on userstats  (cost=797.69..118850.46
rows=13399 width=8) (actual time=…
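The distinction being made above, the planner's estimate (rows=) versus what actually happened (actual ... rows=), can be pulled out of a plan line mechanically. A small sketch; the sample line is modeled on the one quoted above, with the truncated actual-time portion filled in by invented numbers purely for illustration:

```python
import re

# Hypothetical plan line modeled on the one quoted above; the
# "actual time=... rows=..." figures here are invented for illustration.
line = ("Bitmap Heap Scan on userstats  (cost=797.69..118850.46 "
        "rows=13399 width=8) (actual time=12.3..4567.8 rows=19799 loops=1)")

# Planner's row-count estimate (the rows= before width=).
estimated = int(re.search(r"\brows=(\d+) width=", line).group(1))
# Rows the node actually produced (the rows= inside "actual ...").
actual = int(re.search(r"actual time=[\d.]+\.\.[\d.]+ rows=(\d+)", line).group(1))

print(estimated, actual)   # planner guess vs. rows actually produced
```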
In response to ramasubramanian :

Please, create a new mail for a new topic and don't hijack other
threads.

> Hi all,
>    I have a table emp. Using a where condition, can I get the result
> prioritized? Take the example below.
>
> select ENAME,ORIG_SALARY from employee where (ename='Tom' and
> …
Hi all,

I have a table emp. Using a where condition, can I get the result
prioritized? Take the example below.

select ENAME, ORIG_SALARY from employee where (ename='Tom' and
orig_salary=2413) or (orig_salary=1234)

If the first condition (ename='Tom' and orig_salary=2413) is satisfied, then 10
row…
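One common way to get results "prioritized" in plain SQL is to sort on a CASE expression that ranks each WHERE branch. A minimal sketch, using Python's sqlite3 as a stand-in for the real database; the table name and values come from the question above, but the ranking scheme and sample rows are assumptions about what "prioritized" should mean:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (ename TEXT, orig_salary INTEGER)")
# Sample rows invented for illustration; "Tom"/2413 and 1234 are from the question.
conn.executemany("INSERT INTO employee VALUES (?, ?)",
                 [("Tom", 2413), ("Ann", 1234), ("Bob", 1234), ("Tom", 999)])

# Rows matching the first condition sort before rows matching the second;
# the trailing ORDER BY ename keeps output within each group deterministic.
rows = conn.execute("""
    SELECT ename, orig_salary
    FROM employee
    WHERE (ename = 'Tom' AND orig_salary = 2413) OR orig_salary = 1234
    ORDER BY CASE WHEN ename = 'Tom' AND orig_salary = 2413 THEN 0 ELSE 1 END,
             ename
""").fetchall()
print(rows)
# -> [('Tom', 2413), ('Ann', 1234), ('Bob', 1234)]
```

The same ORDER BY CASE trick works unchanged in PostgreSQL, which is presumably where the table actually lives.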