best way to write large data-streams quickly?

2018-04-09 Thread Mark Moellering
first to have this problem and there are common solutions, but I can't find any. Does anyone know of some sort of method, third-party program, etc., that can accept data from a number of different sources and push it into Postgres as fast as possible? Thanks in advance, Mark Moellering
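The usual answer to this question is PostgreSQL's COPY protocol: batch incoming records and stream each batch through a single COPY rather than issuing row-by-row INSERTs. A minimal sketch, assuming psycopg2 and a hypothetical readings(sensor_id, value) table; the serialization helper is self-contained, and the actual COPY call is shown commented out since it needs a live server:

```python
import io

def rows_to_copy_buffer(rows):
    """Serialize rows into the tab-separated text format that
    COPY ... FROM STDIN expects. \\N marks SQL NULL."""
    buf = io.StringIO()
    for row in rows:
        fields = []
        for value in row:
            if value is None:
                fields.append(r"\N")
            else:
                # Escape the characters that are special in COPY text format.
                text = str(value)
                text = (text.replace("\\", "\\\\")
                            .replace("\t", "\\t")
                            .replace("\n", "\\n"))
                fields.append(text)
        buf.write("\t".join(fields) + "\n")
    buf.seek(0)
    return buf

# With a live connection (psycopg2 assumed; table name hypothetical),
# one round trip per batch instead of one per row:
#
# import psycopg2
# conn = psycopg2.connect("dbname=streams")
# with conn, conn.cursor() as cur:
#     cur.copy_expert("COPY readings (sensor_id, value) FROM STDIN",
#                     rows_to_copy_buffer(batch))
```

Batch size is a tuning knob: a few thousand rows per COPY usually amortizes the round-trip cost without holding data in memory for long.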

Re: best way to write large data-streams quickly?

2018-04-10 Thread Mark Moellering
On Mon, Apr 9, 2018 at 12:01 PM, Steve Atkins wrote: > > > On Apr 9, 2018, at 8:49 AM, Mark Moellering wrote: > > > > Everyone, > > > > We are trying to architect a new system, which will have to take several large datastreams (total of ~200,000 pa

db-connections (application architecture)

2018-11-15 Thread Mark Moellering
So, I am working on some system designs for a web application, and I wonder if there is any definitive answer on how best to connect to a Postgres database. I could have it so that each time a query (or set of queries) for a particular request needs to be run, a new connection is opened, queries
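The wheel the reply below alludes to is connection pooling: opening a fresh connection per request is expensive, so a fixed set of connections is opened once and handed out on demand. A toy sketch of the idea (real deployments would use PgBouncer or psycopg2's built-in pool instead; the factory argument and pool size are hypothetical):

```python
import queue

class ConnectionPool:
    """Minimal fixed-size pool: open N connections up front,
    hand them out per request, take them back afterwards."""

    def __init__(self, factory, size=5):
        # factory is any zero-argument callable returning a connection,
        # e.g. lambda: psycopg2.connect("dbname=app").
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(factory())

    def acquire(self, timeout=None):
        # Blocks until a connection is free (bounds server load).
        return self._pool.get(timeout=timeout)

    def release(self, conn):
        self._pool.put(conn)
```

A request handler would wrap its work in acquire()/release(); because the queue blocks when empty, the pool also caps the number of simultaneous server connections.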

Re: db-connections (application architecture)

2018-11-15 Thread Mark Moellering
Oh, excellent. I knew I was about to reinvent the wheel. Sometimes, there are just too many new things to keep up on. Thank you so much! On Thu, Nov 15, 2018 at 10:16 AM Adrian Klaver wrote: > On 11/15/18 7:09 AM, Mark Moellering wrote: > > So, I am working on some system designs

Re: Where **not** to use PostgreSQL?

2019-02-28 Thread Mark Moellering
I wish more people would ask this question; to me, it is the true mark of experience. In general, I think of PostgreSQL as the leading relational database. The farther you get from relational data and relational queries, the more I would say you should look for other products or solutions.