-Original Message-
From: talk-boun...@lists.nyphp.org [mailto:talk-boun...@lists.nyphp.org]
On Behalf Of Peter Sawczynec
Sent: Wednesday, June 03, 2009 10:39 PM
To: 'NYPHP Talk'
Subject: Re: [nyphp-talk] Periodic Browsercam.com group availability
Now, me I don't guarantee anything (especia
You know, I'm not even on the frontend list. I missed that.
CSS/PHP MENU
This URL: http://www.giba.us
...is a simple but elegant PHP website I designed and built that has an
example of a very simple CSS one-level menu that also uses PHP to grab
the page name out of the URL request and changes th
On Wed, 3 Jun 2009, Artur Marnik wrote:
> Recently I found nice software to do QA on a web application.
> I find it very useful and it is free to use:
> http://seleniumhq.org/
http://www.opensourcetesting.org/functional.php
--
Aj.
___
New York PHP User Group Community Talk Mailing List
http://lists.nyphp.org/mailman/listinfo/talk
Heya,
So, I have this script that does the following:
1. Requests jpeg from origin CDN via cURL
2. If file doesn't exist... log error, continue.
3. Write jpeg to temp file
4. Resize original image (GD lib) to temp file. FTP to directory on
new CDN. Create directory structure if not present
Do you mind sharing the script?
- Original Message -
From: "Rahmin Pavlovic"
To: "NYPHP Talk"
Sent: Thursday, June 4, 2009 11:48:40 AM
Subject: [nyphp-talk] memory problems
Heya,
So, I have this script that does the following:
1. Requests jpeg from origin CDN via cURL
2. If file doesn't exist... log error, continue.
GD needs to operate on raw, uncompressed data, so even if the JPEGs are
smaller than your 500 MB limit, they can blow past it once GD expands them.
A couple of ideas: exec ImageMagick's convert instead of using GD for the
resize.
On Jun 4, 2009, at 10:48 AM, Rahmin Pavlovic
wrote:
Heya,
So, I have this script
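The exec-ImageMagick idea above might be sketched like this; the paths, the 300x300 geometry, and the buildConvertCommand() helper are illustrative assumptions, not code from the thread:

```php
<?php
// Build an ImageMagick command so the resize happens in ImageMagick's
// process, not in PHP's memory space. 'convert' must be on the PATH.
function buildConvertCommand(string $src, string $dst, string $geometry): string
{
    return sprintf(
        'convert %s -resize %s %s',
        escapeshellarg($src),
        escapeshellarg($geometry),
        escapeshellarg($dst)
    );
}

$cmd = buildConvertCommand('/tmp/original.jpg', '/tmp/resized.jpg', '300x300');
exec($cmd, $out, $status);
if ($status !== 0) {
    error_log("convert failed (exit $status)");
}
```

escapeshellarg() matters here because the filenames come from remote URLs.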
Sorry guys. iPhone prematurely sent that.
The other idea is to buffer the download of the file. You can use
fopen/fread/fclose to make sure you only keep, say, 1 MB of data in PHP's
memory while you download the file. Usually URLs can be treated like
files via PHP's URL file wrappers unless
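A sketch of that buffered approach; the downloadToFile() name and the 1 MB default chunk size are assumptions:

```php
<?php
// Stream a remote file to disk in fixed-size chunks so PHP never holds
// the whole image in memory at once.
function downloadToFile(string $url, string $tmpPath, int $chunk = 1048576): bool
{
    $in = fopen($url, 'rb');   // URL wrappers let http:// act like a file
    if ($in === false) {
        return false;
    }
    $out = fopen($tmpPath, 'wb');
    if ($out === false) {
        fclose($in);
        return false;
    }
    while (!feof($in)) {
        $data = fread($in, $chunk); // at most $chunk bytes in memory at once
        if ($data === false) {
            break;
        }
        fwrite($out, $data);
    }
    fclose($in);
    fclose($out);
    return true;
}
```

fopen() also accepts local paths, which makes the helper easy to test offline.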
You are not releasing memory somewhere. Using sleep() is not going to
help clear memory. Monitor your memory usage (memory_get_usage())
and see if it keeps climbing or if it's just one big
image that is causing the problem.
Alternatively, fork your script for each image if you are r
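The forking idea might look like the sketch below; it needs the pcntl extension (CLI only), so it is guarded, and processImage() is a stand-in for the real download/resize/FTP work:

```php
<?php
// Fork a child per image so each child's memory is returned to the OS
// when it exits, instead of accumulating in one long-lived process.
$logFile = tempnam(sys_get_temp_dir(), 'imglog');

function processImage(string $file, string $logFile): void
{
    // Real work would go here; we just record that this child ran.
    file_put_contents($logFile, $file . "\n", FILE_APPEND);
}

function processAllForked(array $files, string $logFile): void
{
    foreach ($files as $file) {
        $pid = pcntl_fork();
        if ($pid === -1) {
            fwrite(STDERR, "fork failed\n");
            return;
        }
        if ($pid === 0) {               // child
            processImage($file, $logFile);
            exit(0);                    // child exits; OS reclaims its memory
        }
        pcntl_waitpid($pid, $status);   // parent waits before the next fork
    }
}

if (function_exists('pcntl_fork')) {
    processAllForked(['a.jpg', 'b.jpg', 'c.jpg'], $logFile);
}
```

Waiting on each child serializes the work; dropping the waitpid() into a separate loop would let children run in parallel.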
Just noticed you said the images are only 15k. Weird. Well, even
without seeing the code, I think the suggestions of switching to
ImageMagick or splitting up the script will solve it.
On Jun 4, 2009, at 11:25 AM, Rob Marscher
wrote:
Sorry guys. iPhone prematurely sent that.
The other
yeah i think unset would work just fine.
~rob
-Original Message-
From: Eddie Drapkin
To: NYPHP Talk
Sent: Wed, 3 Jun 2009 9:26 pm
Subject: Re: [nyphp-talk] delete one element from array
Nope, because that'll unset the eighth element!
You want unset($array[6]) :P
And
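For what it's worth, the unset($array[6]) fix takes a few lines to check; the array_values() reindexing step is my addition, not from the thread:

```php
<?php
// Delete the seventh element (index 6) from a zero-indexed array.
$array = range(1, 10);         // values 1..10 at keys 0..9
unset($array[6]);              // removes the value 7; keys now skip 6
$array = array_values($array); // reindex 0..8 if contiguous keys matter
```

Without array_values(), the gap left at key 6 survives, which can surprise code that assumes sequential keys.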
wow, awesome brent, i like the forking idea.
question:
how are you getting $files2process? i'm assuming you're reading a directory if
it were local, but can you get this number remotely via cURL?
thanks
~rob
-Original Message-
From: Brent Baisley
To: NYPHP Talk
Sent: Thu,
On Thu, Jun 4, 2009 at 12:39 PM, wrote:
> yeah i think unset would work just fine.
>
So much for my serialize() / preg_replace() / unserialize() hack.
@chris, you forgot to json_encode, mail() to a remote service, and sleep()
while you wait for the results
On Thu, Jun 4, 2009 at 1:34 PM, Chris Snyder wrote:
> On Thu, Jun 4, 2009 at 12:39 PM, wrote:
> > yeah i think unset would work just fine.
> >
>
> So much for my serialize() / preg_replace() / unserialize() hack.
On Thu, Jun 4, 2009 at 11:48 AM, Rahmin Pavlovic wrote:
> Heya,
>
> So, I have this script that does the following:
>
> 1. Requests jpeg from origin CDN via cURL
> 2. If file doesn't exist... log error, continue.
> 3. Write jpeg to temp file
> 4. Resize original image (GD lib) to temp file. FTP
On Jun 4, 2009, at 2:14 PM, John Campbell wrote:
My guess is that you are not calling imagedestroy() after you are done
with the image handle. You have to manually free the memory with GD.
We're destroying each handle after we're done with it, so I'm not
exactly sure where the leak is, but
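For reference, John's imagedestroy() point as a self-contained sketch; the function name and the halving factor are illustrative:

```php
<?php
// Halve a JPEG's dimensions with GD, freeing each handle when done.
// GD decompresses to raw bitmaps, so imagedestroy() matters for memory.
function resizeJpegHalf(string $srcPath, string $dstPath): bool
{
    $src = imagecreatefromjpeg($srcPath);
    if ($src === false) {
        return false;
    }
    $w = imagesx($src);
    $h = imagesy($src);
    $newW = max(1, intdiv($w, 2));
    $newH = max(1, intdiv($h, 2));
    $dst = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
    $ok = imagejpeg($dst, $dstPath);
    imagedestroy($src); // free the raw source bitmap immediately
    imagedestroy($dst); // free the resized bitmap too
    return $ok;
}
```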
I had a similar problem, and I added some old-fashioned debug code
using echo and memory_get_usage() to see where the leaks were.
It really helped, and it's very educational (and fun...).
HTH
glenn
On Jun 4, 2009, at 5:23 PM, Rahmin Pavlovic wrote:
On Jun 4, 2009, at 2:14 PM, John Campbell wrote:
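Glenn's echo-and-memory_get_usage() technique might look like this; the memCheckpoint() name and the simulated 2 MB buffer are mine:

```php
<?php
// Print a labeled memory checkpoint; sprinkle these through the loop
// to see exactly where usage climbs.
function memCheckpoint(string $label): void
{
    echo sprintf(
        "%s: %.2f MB (peak %.2f MB)\n",
        $label,
        memory_get_usage(true) / 1048576,
        memory_get_peak_usage(true) / 1048576
    );
}

memCheckpoint('start');
$buf = str_repeat('x', 2 * 1048576); // simulate loading an image
memCheckpoint('after load');
unset($buf);
memCheckpoint('after unset');
```

If "after unset" stays as high as "after load", something is still holding a reference.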
Chris Snyder wrote:
On Thu, Jun 4, 2009 at 12:39 PM, wrote:
yeah i think unset would work just fine.
So much for my serialize() / preg_replace() / unserialize() hack.
Elijah, you forgot to use Amazon Queues.
Sheesh. Amateurs.
-- Mitch, trying to keep a straight face, and failing
On Thu, Jun 4, 2009 at 1:55 PM, Elijah Insua wrote:
> @chris, you forgot to json_encode, mail() to a remote service, and sleep()
> while you wait for the results
>
> On Thu, Jun 4, 2
That's one of the "holes" in the script. You can make it an array, a
function, a counter, whatever. Some way of processing your list of
files. Change it from a while to a for loop if needed. You just need some
way to go through the list of files, assigning the URL of the next one
to the $processFile variable.
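One concrete way to fill that hole, assuming a newline-separated manifest file (a local path here; for a remote manifest, PHP's URL wrappers would accept an http:// URL in place of the path, minus the is_readable() guard):

```php
<?php
// Turn the file list into an array so the loop has something to walk.
function listFiles(string $source): array
{
    if (!is_readable($source)) {
        return [];
    }
    $lines = file($source, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    return $lines === false ? [] : array_map('trim', $lines);
}

foreach (listFiles('/tmp/manifest.txt') as $processFile) {
    echo "processing $processFile\n"; // hand off to the resize step here
}
```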