Hello Tim and Richard,
Sorry for the late reply, but I was offline for the last few days.
I asked the same question of the Brazilian Django users and got the same
answer: 'focus on your DBMS, Django is not going to be your problem.' So I
decided to take a closer look at Postgres and start to do some
> I'm developing a Django project that's going to handle
> big sets of data, and I would like your advice. I have 10 internal
> bureaus, and each of them has a database of 1.5 million records
> that really looks like it will keep growing in size. I
> intend to use Postgres.
>
> The question:
Denormalization can help with that, as can splitting the models up by
bureau, although that kind of makes me feel a little dirty. Obviously not
knowing the details of your requirements, I would consider building one
app and running multiple instances of it as needed.
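A minimal sketch of the split-by-bureau idea above, assuming each bureau lives in its own Django app backed by its own Postgres database. The `BureauRouter` class follows the shape of Django's database-router interface but is plain Python here, and the `BUREAU_DBS` mapping and app labels are hypothetical names, not anything from the thread:

```python
# Hypothetical mapping from a bureau's app label to its database alias,
# one Postgres database per bureau (10 bureaus in this example).
BUREAU_DBS = {f"bureau_{i}": f"bureau_{i}_db" for i in range(10)}


class BureauRouter:
    """Route each bureau's models to that bureau's own database.

    Sketch of a Django-style database router: Django calls these
    methods to pick a database alias for reads, writes, and relations.
    """

    def db_for_read(self, model, **hints):
        # Look up the database alias by the model's app label.
        return BUREAU_DBS.get(model._meta.app_label)

    def db_for_write(self, model, **hints):
        return BUREAU_DBS.get(model._meta.app_label)

    def allow_relation(self, obj1, obj2, **hints):
        # Only allow relations between objects in the same bureau,
        # since cross-database joins are not possible.
        return obj1._meta.app_label == obj2._meta.app_label
```

In an actual project the router would be listed in the `DATABASE_ROUTERS` setting and each alias defined in `DATABASES`; the sketch only shows the routing decision itself.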
Hi Richard,
I'm more concerned with query response time...
Thanks for your fast reply = )
On 9/25/07, Richard Dahl <[EMAIL PROTECTED]> wrote:
>
> Bruno,
> It is difficult to advise based on the information provided. Not sure
> exactly what you are concerned with: Postgres database size?
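Since query response time is the stated worry, a useful first step is simply measuring it. A minimal sketch in plain Python, nothing Django-specific; the `timed` helper is illustrative, and the `sum` call merely stands in for evaluating a real query:

```python
import time


def timed(fn, *args, **kwargs):
    """Run any callable and return (result, elapsed_seconds).

    With Django you could pass a callable that forces a QuerySet,
    e.g. timed(list, qs); here we use a plain computation instead.
    """
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start


# Hypothetical stand-in for an expensive query over 1.5 million rows:
total, elapsed = timed(sum, range(1_500_000))
```

On the database side, Postgres's `EXPLAIN ANALYZE` gives the corresponding per-query timing and the plan the optimizer chose, which is usually where slow queries against millions of rows get diagnosed.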
Bruno,
It is difficult to advise based on the information provided. Not sure
exactly what you are concerned with: Postgres database size? Query response
time? Network transfer time? All of the above? Each of these can be
dealt with differently. Perhaps if you provided some detail on
Hi fellows,
I'm developing a Django project that's going to handle big sets of data,
and I would like your advice. I have 10 internal bureaus, and each of them
has a database of 1.5 million records that really looks like it will keep
growing in size. I intend to use Postgres.
The question: