Sam Smith wrote:
> I need a script that will crawl a list of websites and download all .jpg,
> .gif, .png files.
> 
> I can think of some ways to start, like fopen() or maybe cURL. And it
> dawned on me that I'd need to handle files overwriting each other when they
> have the same name. And it would be cool to save the full URL to the file in
> a database along with the path on the local server where I'm saving them.
> 
> I was hoping someone might say, "Dude, that's simple, just do this..."
> before I spent hours guessing.
> 
> Anyone?
> 
> Thanks
> 
> 

Dude, that's simple, just do this...

- crawl each page and extract the src of every image (regex or DOM)
- fopen() the image URL and read it
- fwrite() it to a local file; if the name already exists, append an incremental number
- save the URL and the local dir/filename in the db (see the sketch below)
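Something along these lines should get you going. Untested sketch only -- the
DSN, the crawled_images table, and the save dir are placeholders you'd swap
for your own setup:

<?php
// Pages to crawl, where to save images, and the db connection
// (all placeholders -- adjust for your environment).
$pages   = array('http://example.com/');
$saveDir = __DIR__ . '/images';
$pdo     = new PDO('mysql:host=localhost;dbname=crawler', 'user', 'pass');

if (!is_dir($saveDir)) {
    mkdir($saveDir, 0755, true);
}

foreach ($pages as $page) {
    $html = @file_get_contents($page);
    if ($html === false) {
        continue;
    }

    // Pull the src out of every <img> tag with DOM instead of a regex.
    $doc = new DOMDocument();
    @$doc->loadHTML($html);

    foreach ($doc->getElementsByTagName('img') as $img) {
        $src = $img->getAttribute('src');

        // Only grab .jpg/.gif/.png
        if (!preg_match('/\.(jpe?g|gif|png)$/i', $src)) {
            continue;
        }

        // Naive relative-URL handling: prepend the page URL if there's no scheme.
        $url = parse_url($src, PHP_URL_SCHEME)
            ? $src
            : rtrim($page, '/') . '/' . ltrim($src, '/');

        $data = @file_get_contents($url);
        if ($data === false) {
            continue;
        }

        // Build the local filename; append an incremental number on collisions.
        $name = basename(parse_url($url, PHP_URL_PATH));
        $path = $saveDir . '/' . $name;
        $i = 1;
        while (file_exists($path)) {
            $info = pathinfo($name);
            $path = $saveDir . '/' . $info['filename'] . '_' . $i++ . '.' . $info['extension'];
        }

        file_put_contents($path, $data);

        // Record the source URL and the local path.
        $stmt = $pdo->prepare('INSERT INTO crawled_images (url, local_path) VALUES (?, ?)');
        $stmt->execute(array($url, $path));
    }
}

You could swap file_get_contents() for the cURL extension if you need
timeouts, redirects, or headers, but the flow stays the same.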

-- 
Thanks!
-Shawn
http://www.spidean.com
