Alright, so it looks like I've tracked down the problem, but I'm still not
sure how to resolve it.

Part of my application deals with a pretty huge file, it's basically a giant
list of all 23,000+ gems available on Gemcutter. Compressed, this is a
little less than 2 megs (with Gem.deflate, which uses Zlib) and
uncompressed, it comes out to ~14 megs if you write the file out. On my own
machine this turned out to be slow but usable...on Heroku it's causing an
internal server error when deserializing it with Marshal.load. To recap,
the process looks something like this when pushing a new gem:

1) Download zipped index from S3
2) Unzip
3) Deserialize (boom, internal server error)
4) Add new gem into the index
5) Serialize
6) Zip
7) Push back to S3

I'm seeing a few options here. One is to try out different serialization
formats to see if that helps. Another is to stick it in Memcached, but my
main problem with that is that I can't afford to lose the data...so perhaps
I could back it up hourly to S3. I could even store it as a giant binary
blob in Postgres, but I doubt that would be ideal. If you've got other
suggestions on how to deal with a pretty massive file like this, I'd be
glad to hear them.
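For the first option, swapping Marshal for something like JSON is a pretty small change, since the index is just a hash of strings. A sketch of what that would look like (again with a made-up index, and using Zlib directly instead of the Gem helpers):

```ruby
require 'json'
require 'zlib'

# Hypothetical index, same shape as before: gem name => latest version
index = { "rails" => "2.3.3", "rack" => "1.0.0" }

# Serialize to JSON instead of Marshal, then compress as before
blob = Zlib::Deflate.deflate(JSON.generate(index))

# Decompress and parse; JSON keeps string keys, so the hash comes back as-is
restored = JSON.parse(Zlib::Inflate.inflate(blob))
```

No idea yet whether JSON.parse behaves any better than Marshal.load on a ~14 meg payload, but it would at least rule Marshal in or out as the culprit.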

Thanks,
-Nick

On Fri, Jul 31, 2009 at 12:18 AM, Trevor Turk <[email protected]> wrote:

>
> On Jul 30, 11:11 pm, Nick Quaranto <[email protected]> wrote:
> > Alright, I think I've tracked down in my codebase where the problem is.
> > Sorry for the red herring here, folks. Still though, it would be nice to
> see
> > the logs.
>
> Please do let us know if/when you figure it out!
>
> - Trevor
