[PERFORM] Prepared statements vs. Stored Procedures

2005-06-21 Thread Oliver Crosby
I'm hoping someone can offer some advice here. I have a large Perl script that uses prepared statements for all its queries, and I'm looking at stored procedures to improve its performance. Would replacing each prepared statement with a stored procedure be worthwhile?
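To make the comparison concrete, here is a minimal sketch of the two approaches; the `users` table and `get_user_name` function are hypothetical, and the quoted function body is the pre-8.0 syntax appropriate to this era:

```sql
-- Prepared statement: parsed and planned once per session,
-- then executed repeatedly with different parameters.
PREPARE get_user (integer) AS
    SELECT name FROM users WHERE id = $1;
EXECUTE get_user(42);

-- The rough stored-procedure equivalent (an SQL function in PostgreSQL):
CREATE FUNCTION get_user_name(integer) RETURNS text AS '
    SELECT name FROM users WHERE id = $1;
' LANGUAGE sql;
SELECT get_user_name(42);
```

For a single simple SELECT the two do essentially the same work, so a wholesale conversion may not pay off; the gain from functions tends to come from bundling several statements into one server-side call.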

[PERFORM] Looking for tips

2005-07-19 Thread Oliver Crosby
Hi, I'm running Postgres 7.4.6 on a dedicated server with about 1.5 GB of RAM. Running the scripts locally, Postgres takes about 1.5x longer than MySQL, and the load on the server is only about 21%. I upped sort_mem to 8192 (kB), and shared_buffers and effective_cache_size to 65536 (512 MB), but neither
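As a postgresql.conf fragment, the settings described above would look like this (units are as PostgreSQL 7.4 interprets them):

```
# postgresql.conf (7.4-era units)
sort_mem = 8192                # per-sort working memory, in kB
shared_buffers = 65536         # 8 kB pages, i.e. 512 MB
effective_cache_size = 65536   # 8 kB pages; a planner hint, allocates nothing
```

Note that effective_cache_size only informs the planner's cost estimates; raising it consumes no memory by itself.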

Re: [PERFORM] Looking for tips

2005-07-19 Thread Oliver Crosby
Joshua D. Drake [EMAIL PROTECTED] wrote: Oliver Crosby wrote: Hi, I'm running Postgres 7.4.6 on a dedicated server with about 1.5 GB of RAM. Running the scripts locally, Postgres takes about 1.5x longer than MySQL, and the load on the server is only about 21%. What queries? What is your structure

Re: [PERFORM] Looking for tips

2005-07-19 Thread Oliver Crosby
Identify what the problem is first of all. Some things to consider: - Are there particular queries giving you trouble? - Is your load mostly reads or mostly writes? - Do you have one user or 100? - Are you block-loading data efficiently where necessary? - Have you indexed both sides of

Re: [PERFORM] Looking for tips

2005-07-19 Thread Oliver Crosby
What programming language are these scripts written in? Perl, using the DBD::Pg interface instead of command-lining it through psql

Re: [PERFORM] Looking for tips

2005-07-19 Thread Oliver Crosby
If you're running only a single query at a time (no multiple clients), then this is pretty much the definition of a MySQL-friendly workload; I'd have to say we are doing really well if we are only 50% slower. Postgres doesn't have any performance advantages until you get into complex queries

Re: [PERFORM] Looking for tips

2005-07-19 Thread Oliver Crosby
We had low resource utilization and poor throughput on inserts of thousands of rows within a single database transaction. There were a lot of configuration parameters we changed, but the one which helped the most was wal_buffers -- we wound up setting it to 1000. This may be higher than it
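The wal_buffers change mentioned above would be a one-line edit in postgresql.conf (in 7.4 the value is counted in 8 kB pages, and the default was 8):

```
# postgresql.conf: more WAL buffering for bulk-insert workloads
wal_buffers = 1000    # 8 kB pages; 7.4 default was 8
```

As the poster notes, 1000 may be higher than strictly necessary; the right value depends on how much WAL a single transaction generates.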

Re: [PERFORM] Looking for tips

2005-07-19 Thread Oliver Crosby
If it is possible try: 1) wrapping many inserts into one transaction (BEGIN;INSERT;INSERT;...INSERT;COMMIT;). As PostgreSQL will need to handle fewer transactions per second (each of your inserts is its own transaction), it may work faster. Aye, that's what I have it doing right now. The transactions
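Spelled out, the batching suggested above looks like this; the `events` table and its columns are made up for illustration:

```sql
-- One transaction for many rows: the WAL is flushed (fsync'd) once
-- at COMMIT instead of once per INSERT, which is where the win comes from.
BEGIN;
INSERT INTO events (id, payload) VALUES (1, 'first');
INSERT INTO events (id, payload) VALUES (2, 'second');
-- ... many more rows ...
COMMIT;
```

Without the explicit BEGIN/COMMIT, each INSERT runs in its own implicit transaction and pays the commit cost individually.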

Re: [PERFORM] Looking for tips

2005-07-19 Thread Oliver Crosby
Since triggers work with COPY, you could probably write a trigger that looks for this condition and does the ID processing you need; you could thereby enjoy the enormous speed gain resulting from COPY and maintain your data continuity. So... (bear with me here.. trying to make sense of
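A hedged sketch of the COPY-plus-trigger idea: triggers do fire row by row during COPY, so per-row bookkeeping can ride along with a bulk load. The table, sequence, and column names here are hypothetical, and the quoted function body is the pre-8.0 style:

```sql
-- Fill in a group id for rows that arrive without one.
CREATE FUNCTION assign_group_id() RETURNS trigger AS '
BEGIN
    IF NEW.group_id IS NULL THEN
        NEW.group_id := nextval(''event_group_seq'');
    END IF;
    RETURN NEW;
END;
' LANGUAGE plpgsql;

CREATE TRIGGER events_group_id
    BEFORE INSERT ON events
    FOR EACH ROW EXECUTE PROCEDURE assign_group_id();

-- Bulk-load only the raw column; the trigger handles group_id.
COPY events (payload) FROM '/tmp/events.dat';
```

The trigger adds per-row overhead, but COPY still avoids the per-statement parse/plan cost of individual INSERTs, so the net is usually a large win.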

Re: [PERFORM] Looking for tips

2005-07-19 Thread Oliver Crosby
Sorry for the lack of specifics... We have a file generated as a list of events, one per line. Suppose lines 1, 2, 3, 5, 7, 11, etc. were related; then the last one would specify that it's the last event. Gradually this gets assembled by a perl script and when the last event is encountered, it gets