Hello,

I am designing a database for a web product with a large number of data records.

- Few tables, but the number of records is in the tens to hundreds of thousands.
- Fewer than 100 queries per second.

The application basically has tens of thousands of (user) accounts,
and each account has hundreds of associated items.

My initial thought is to design it like this:

Table: account
---------------------
account_id BIGSERIAL


Table: item
---------------------
account_id BIGINT
item_id INT
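
In concrete DDL I imagine roughly this (a minimal sketch; the foreign key
and NOT NULL constraints are just my assumptions, and non-key columns are
omitted):

CREATE TABLE account (
    account_id  BIGSERIAL PRIMARY KEY
    -- other account columns ...
);

CREATE TABLE item (
    account_id  BIGINT NOT NULL REFERENCES account (account_id),
    item_id     INT NOT NULL
    -- no primary key chosen yet; that is what the questions below are about
);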

Questions:

Should the account table use a BIGSERIAL key, or, given that it will
have a six-digit number of records, should some other method be used?

Should I use a compound key for the item table (account_id + item_id),
or should item_id be a BIGSERIAL drawn from a global sequence, with
item_id alone as the key?
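
To make the two options concrete (again just a sketch; the foreign key
is my assumption):

-- (a) compound key: item_id is only unique within one account
CREATE TABLE item (
    account_id  BIGINT NOT NULL REFERENCES account (account_id),
    item_id     INT NOT NULL,
    PRIMARY KEY (account_id, item_id)
);

-- (b) global sequence: item_id is unique across all accounts
CREATE TABLE item (
    item_id     BIGSERIAL PRIMARY KEY,
    account_id  BIGINT NOT NULL REFERENCES account (account_id)
);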

How well will this design generally hold up against this amount of data?

Thanks.
