Hi,

The number of inserts into the database would be a minimum of 3000 records in 
one operation. We do not have any stringent requirement on write speed, so we 
could make do with a slower write speed as long as the CPU usage is not 
heavy. :)
We will try reducing the priority and check once.
Our database file is located on a Class 2 SD card, so it is understandable if 
there is a lot of I/O activity and the speed is low.
But we are stumped by the amount of CPU Postgres is eating up.
Are there any configuration settings we could check? Given our hardware 
configuration, are the following settings OK?
Shared Buffers: 24MB
Effective Cache Size: 128MB
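For clarity, this is how those two values look in our postgresql.conf (other 
lines omitted; the comments are just our understanding of what each setting 
does):

```ini
# postgresql.conf -- relevant lines only
shared_buffers = 24MB          # memory for PostgreSQL's own data page cache
effective_cache_size = 128MB   # planner's estimate of memory available for
                               # caching (PostgreSQL + OS file cache); this is
                               # a hint, not an allocation
```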

We are not experienced with database stuff. So some expert suggestions would be 
helpful :)

Thanks and Regards
Jayashankar

-----Original Message-----
From: pgsql-performance-ow...@postgresql.org 
[mailto:pgsql-performance-ow...@postgresql.org] On Behalf Of Heikki Linnakangas
Sent: Saturday, January 28, 2012 1:27 AM
To: Jayashankar K B
Cc: Andy Colson; pgsql-performance@postgresql.org
Subject: Re: [PERFORM] Postgress is taking lot of CPU on our embedded hardware.

On 27.01.2012 20:30, Jayashankar K B wrote:
> Hi Heikki Linnakangas: We are using series of Insert statements to insert the 
> records into database.
> Sending data in binary is not an option as the module that writes into DB has 
> been finalized.
> We do not have control over that.

That certainly limits your options.

> Please let me know how we can proceed. On the net I couldn't get hold of any 
> good example where Postgres has been used on limited Hardware system.

I don't think there's anything in particular about Postgres that would make it 
a poor choice on a small system, as far as CPU usage is concerned anyway. But 
inserting rows into a database is certainly slower than, say, writing them to a 
flat file.
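If the module's statements can at least be wrapped in a transaction, batching 
many INSERTs into one transaction avoids a commit (and thus an fsync to that SD 
card) per row. A sketch, with a hypothetical table and columns:

```sql
-- Hypothetical table; the point is one COMMIT for the whole batch
-- instead of one implicit commit per statement.
BEGIN;
INSERT INTO readings (ts, value) VALUES (now(), 1);
INSERT INTO readings (ts, value) VALUES (now(), 2);
-- ... remaining rows of the ~3000-row batch ...
COMMIT;
```

Multi-row VALUES lists or COPY would be faster still, but those require 
changing the statements themselves, which you said is not an option.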

At what rate are you doing the INSERTs, and how fast would they need to be? 
Remember that while the INSERTs are running, it's normal for Postgres to use 
all the CPU it can to process them as fast as possible. So the question is at 
what rate they need to be processed to meet your target. Lowering the process 
priority with 'nice' might help too, by giving the other important processes 
priority over Postgres.
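For example, something like this (the data directory path is an assumption; 
the last line merely demonstrates that a child process inherits the lower 
priority):

```shell
# Renice an already-running server; the postmaster PID is the first
# line of postmaster.pid (path is an example, adjust for your system):
# renice 10 -p "$(head -n 1 /var/lib/postgresql/data/postmaster.pid)"

# Or start the server at a lower priority to begin with:
# nice -n 10 pg_ctl -D /var/lib/postgresql/data start

# Demonstration: children inherit the niceness of their parent.
nice -n 10 sh -c 'echo "running at nice level $(ps -o ni= -p $$)"'
```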

The easiest way to track down where the time is spent would be to run a 
profiler, if that's possible on your platform.
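On embedded Linux, 'perf' is one option if it has been built for your kernel; 
the backend PID below is illustrative and would come from pg_stat_activity or 
ps:

```shell
# Check whether perf is available on the target at all:
command -v perf >/dev/null 2>&1 && echo "perf: available" || echo "perf: not found"

# Then sample the backend doing the inserts (PID is illustrative):
# perf record -p <backend_pid> -- sleep 30   # 30-second CPU sample
# perf report                                # see where the cycles went
```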

--
   Heikki Linnakangas
   EnterpriseDB   http://www.enterprisedb.com

--
Sent via pgsql-performance mailing list (pgsql-performance@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-performance


Larsen & Toubro Limited

www.larsentoubro.com

