On Mon, 29 Sep 2008 15:00:41 -0400
Mark Stosberg <[EMAIL PROTECTED]> wrote:

> This question isn't so much a mod_perl question, as it is a question
> about building high performance websites with Perl. 
> 
> We have a large, busy, database application that relates to millions
> of photos, which we also need to store and display. We've been
> keeping the meta data about the photos in PostgreSQL, and the files
> on the file system. (Implemented much like CGI::Uploader ).
> 
> This has worked great in terms of performance, but with so much data
> to manage, over time we have run into data inconsistency issues
> between the file system and the database.
> 
> So, I'm asking if anyone has had experience successfully storing
> photos (or other files) directly in the database? That would solve
> the consistency issue, but may create a performance issue. Perhaps
> the right kind of caching layer could solve that.

Actually you're already doing it correctly.  Andre already mentioned
many of the pitfalls of trying to store large binary data in a database,
so I won't rehash them.

The only issue you seem to be having is the inconsistency.  That issue
is going to be much easier to solve than trying to scale by putting the
photos in the database. 

Usually people just make sure inserts/updates to the photo table are
done in a transaction, and depending on whether that transaction
succeeds or fails, do the appropriate write/delete on the file system.

But since you're using PostgreSQL ( my favorite database and a large
part of my consulting practice ) you could even go so far as to write a
few pl/perl stored procedures to handle keeping the file system in sync 
with the database. 
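As a rough sketch of that idea (note this needs the untrusted plperlu
language, since trusted plperl can't touch the file system; the
function, trigger, and column names here are illustrative):

```sql
-- Requires plperlu: trusted plperl is sandboxed away from the
-- file system, so file cleanup must run in the untrusted variant.
CREATE OR REPLACE FUNCTION delete_photo_file() RETURNS trigger AS $$
    my $path = $_TD->{old}{path};   # old row values via $_TD
    unlink $path
        or elog(WARNING, "could not unlink $path: $!");
    return;
$$ LANGUAGE plperlu;

CREATE TRIGGER photos_delete_file
    AFTER DELETE ON photos
    FOR EACH ROW EXECUTE PROCEDURE delete_photo_file();
```

That keeps the cleanup logic in one place next to the data, though you
still want the application-side transaction handling for inserts, since
a trigger can't easily undo a file write if the surrounding transaction
later aborts.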

 -------------------------------------------------------
   Frank Wiles, Revolution Systems, LLC. 
     Personal : [EMAIL PROTECTED]  http://www.wiles.org
     Work     : [EMAIL PROTECTED] http://www.revsys.com 
