http://www.catavitch.com

The script I have written actually does that for predefined websites.
The content is pulled live, directly from the website listed.
The most important thing to remember: GET PERMISSION to scrape, as you call it.

I can store the data if I want, or dish it up on the fly as in the example. I can
even display the entire website.

I do not use fopen().
Some may suggest shelling out to wget; I disagree strongly.
Why waste the space on stored files?

Try cURL instead.
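Here is a rough sketch of what I mean (assuming the cURL extension is enabled;
the URL is the one from your post, and the Accept header is only a guess at why
your browser gets plain XML while fopen() gets the XHTML page, i.e. server-side
content negotiation):

<?php
// Minimal sketch: pull the page live with cURL instead of fopen()/wget.
// Nothing is ever written to disk.
$url = 'http://www.someserver.com/user-info.xml?user=myusername';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow any redirects
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Accept: application/xml, text/xml'));

$body = curl_exec($ch);
if ($body === false) {
    die('cURL error: ' . curl_error($ch));
}
curl_close($ch);

// Parse, store, or display the data however you like.
$xml = simplexml_load_string($body);
var_dump($xml);
?>

From there you can echo the raw body to display the whole page, or walk the
SimpleXML object to pull out just the fields you need.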




-----Original Message-----
From: David Calkins [mailto:[EMAIL PROTECTED] 
Sent: Monday, November 12, 2007 9:40 AM
To: php-general@lists.php.net
Subject: [PHP] web page download question

I'm attempting to "scrape" a web page to pull out some pertinent info.
The URL looks similar to the one below.

http://www.someserver.com/user-info.xml?user=myusername

If I paste the above into my web browser, the page comes up and
displays the information.  If I try "view source", I get an XML
document.  Not an XHTML document, but a plain XML document with just
the data fields (no formatting info).

However, if I use PHP fopen() to read this same URL, I get the XHTML
file with all the formatting info, etc.

So, somehow the web browser (this happens in Firefox and IE7) is
showing something other than what I get with the plain fopen().

I'd like to get at the plain XML file with just the data fields as is
shown in the browser.

Any ideas how to do this?

Thanks!

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
