Re: [PHP] Using Curl to replicate a site

2009-12-11 Thread Ashley Sheridan
On Thu, 2009-12-10 at 16:25 +, Ashley Sheridan wrote:

 On Thu, 2009-12-10 at 11:25 -0500, Robert Cummings wrote:
 
  Joseph Thayne wrote:
   If the site can be a few minutes behind, (say 15-30 minutes), then what 
   I recommend is to create a caching script that will update the necessary 
   files if the md5 checksum has changed at all (or a specified time period 
    has passed).  Then store those files locally, and run local copies of the 
   files.  Your performance will be much better than if you have to request 
   the page from another server every time.  You could run this script 
   every 15-30 minutes depending on your needs via a cron job.
  
  Use URL rewriting or capture 404 errors to handle the proxy request. No 
  need to download and cache the entire site if everyone is just 
  requesting the homepage.
  
  Cheers,
  Rob.
  -- 
  http://www.interjinn.com
  Application and Templating Framework for PHP
  
 
 
 Yeah, I was going to use the page request to trigger the caching
 mechanism, as it's unlikely that all pages will be equally popular. I'll
 let you all know how it goes!
 
 Thanks,
 Ash
 http://www.ashleysheridan.co.uk
 
 


Well, I got it working just great in the end. Aside from the odd issue
with relative URLs used to reference images and JavaScript files, which I had
to sort out, everything seems to be working fine and is live. I've got
it on a 12-hour refresh, as the site will probably not be changing very
often at all. Thanks for all the pointers!

Thanks,
Ash
http://www.ashleysheridan.co.uk




[PHP] Using Curl to replicate a site

2009-12-10 Thread Ashley Sheridan
Hi,

I need to replicate a site on another domain, and in this case an
iframe won't really do, as I need to remove some of the graphics, etc.
around the content. The owner of the site I need to copy has asked for
it to be duplicated, and unfortunately, because of the CMS he's used
(which is owned by his hosting company), I need a way to replicate the
site on an existing domain as a microsite, in a way that keeps it always
up to date.

I'm fine using cURL to grab the site, and even altering the content that is
returned, but I was thinking about a caching mechanism. Has anyone any
suggestions on this?

Thanks,
Ash
http://www.ashleysheridan.co.uk




Re: [PHP] Using Curl to replicate a site

2009-12-10 Thread Robert Cummings

Ashley Sheridan wrote:

Hi,

I need to replicate a site on another domain, and in this case, an
iframe won't really do, as I need to remove some of the graphics, etc
around the content. The owner of the site I'm needing to copy has asked
for the site to be duplicated, and unfortunately in this case, because
of the CMS he's used (which is owned by the hosting he uses) I need a
way to have the site replicated on an already existing domain as a
microsite, but in a way that it is always up-to-date.

I'm fine using Curl to grab the site, and even alter the content that is
returned, but I was thinking about a caching mechanism. Has anyone any
suggestions on this?


Sounds like you're creating a proxy with post processing/caching on the 
forwarded content. It should be fairly straightforward to direct page 
requests to your proxy app, then make the remote request, and 
post-process, cache, then send to the browser. The only gotcha will be 
for forms if you do caching.


Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Using Curl to replicate a site

2009-12-10 Thread Ashley Sheridan
On Thu, 2009-12-10 at 11:10 -0500, Robert Cummings wrote:

 Ashley Sheridan wrote:
  Hi,
  
  I need to replicate a site on another domain, and in this case, an
  iframe won't really do, as I need to remove some of the graphics, etc
  around the content. The owner of the site I'm needing to copy has asked
  for the site to be duplicated, and unfortunately in this case, because
  of the CMS he's used (which is owned by the hosting he uses) I need a
  way to have the site replicated on an already existing domain as a
  microsite, but in a way that it is always up-to-date.
  
  I'm fine using Curl to grab the site, and even alter the content that is
  returned, but I was thinking about a caching mechanism. Has anyone any
  suggestions on this?
 
 Sounds like you're creating a proxy with post processing/caching on the 
 forwarded content. It should be fairly straightforward to direct page 
 requests to your proxy app, then make the remote request, and 
 post-process, cache, then send to the browser. The only gotcha will be 
 for forms if you do caching.
 
 Cheers,
 Rob.
 -- 
 http://www.interjinn.com
 Application and Templating Framework for PHP
 


The only forms are processed on another site, so there's nothing I can
really do about that, as they return to the original site.

How would I go about doing what you suggested, though? I'd assumed I'd use
cURL, but your email suggests otherwise?

Thanks,
Ash
http://www.ashleysheridan.co.uk




Re: [PHP] Using Curl to replicate a site

2009-12-10 Thread Joseph Thayne
If the site can be a few minutes behind, (say 15-30 minutes), then what 
I recommend is to create a caching script that will update the necessary 
files if the md5 checksum has changed at all (or a specified time period 
has passed).  Then store those files locally, and run local copies of the 
files.  Your performance will be much better than if you have to request 
the page from another server every time.  You could run this script 
every 15-30 minutes depending on your needs via a cron job.
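
A minimal sketch of that kind of cron-driven refresh, assuming a hypothetical
page list and cache directory (only the md5 comparison and the 15-30 minute
interval come from the description above; everything else is illustrative):

<?php
// Hypothetical list of remote pages to mirror locally. Run from cron,
// e.g.:  */15 * * * * php /path/to/refresh_cache.php
$pages = array(
    'index.html' => 'http://www.example.com/',
    'about.html' => 'http://www.example.com/about',
);
$cacheDir = '/var/www/microsite/cache/';

foreach ($pages as $local => $remote) {
    $ch = curl_init($remote);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    $html = curl_exec($ch);
    curl_close($ch);

    if ($html === false) {
        continue; // keep the existing copy if the fetch fails
    }

    $target = $cacheDir . $local;

    // Only rewrite the local copy if the remote content actually changed
    if (!file_exists($target) || md5($html) !== md5_file($target)) {
        file_put_contents($target, $html);
    }
}
?>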


Joseph

Ashley Sheridan wrote:

Hi,

I need to replicate a site on another domain, and in this case, an
iframe won't really do, as I need to remove some of the graphics, etc
around the content. The owner of the site I'm needing to copy has asked
for the site to be duplicated, and unfortunately in this case, because
of the CMS he's used (which is owned by the hosting he uses) I need a
way to have the site replicated on an already existing domain as a
microsite, but in a way that it is always up-to-date.

I'm fine using Curl to grab the site, and even alter the content that is
returned, but I was thinking about a caching mechanism. Has anyone any
suggestions on this?

Thanks,
Ash
http://www.ashleysheridan.co.uk



  


--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Using Curl to replicate a site

2009-12-10 Thread Robert Cummings

Ashley Sheridan wrote:

On Thu, 2009-12-10 at 11:10 -0500, Robert Cummings wrote:

Ashley Sheridan wrote:
 Hi,
 
 I need to replicate a site on another domain, and in this case, an

 iframe won't really do, as I need to remove some of the graphics, etc
 around the content. The owner of the site I'm needing to copy has asked
 for the site to be duplicated, and unfortunately in this case, because
 of the CMS he's used (which is owned by the hosting he uses) I need a
 way to have the site replicated on an already existing domain as a
 microsite, but in a way that it is always up-to-date.
 
 I'm fine using Curl to grab the site, and even alter the content that is

 returned, but I was thinking about a caching mechanism. Has anyone any
 suggestions on this?

Sounds like you're creating a proxy with post processing/caching on the 
forwarded content. It should be fairly straightforward to direct page 
requests to your proxy app, then make the remote request, and 
post-process, cache, then send to the browser. The only gotcha will be 
for forms if you do caching.


Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP



The only forms are processed on another site, so there's nothing I can 
really do about that, as they return to the original site.


How would I go about doing what you suggested though? I'd assumed to use 
Curl, but your email suggests not to?


Nope, I wasn't suggesting not to. You can use many techniques, but cURL is 
probably the most robust. The best way to handle this, IMHO, is to have a 
rewrite rule that directs all traffic for the proxy site to your 
application. Then rewrite the REQUEST_URI to point to the page on the real 
domain. Check your cache for the content; if it's empty, use cURL to 
retrieve the content, apply your post-processing (to strip out what you 
don't want, apply a new page layout, or whatever), cache the content if it 
isn't already cached (this can be a simple database table with the request 
URI and a timestamp), and finally output the content.
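
A rough sketch of that flow, assuming the rewrite rule shown in the comment
and a simple file-based cache instead of the database table mentioned above
(domain, paths and cache lifetime are all illustrative):

<?php
// Assumes an .htaccess rule along these lines sends every request here:
//   RewriteEngine On
//   RewriteRule ^(.*)$ proxy.php [L,QSA]

$remoteBase = 'http://www.example.com';      // the real site (illustrative)
$uri        = $_SERVER['REQUEST_URI'];
$cacheFile  = '/tmp/proxycache_' . md5($uri);
$maxAge     = 12 * 3600;                     // cache lifetime in seconds

if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
    readfile($cacheFile);                    // serve the cached copy
    exit;
}

// Cache miss: fetch the page from the real domain
$ch = curl_init($remoteBase . $uri);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
$html = curl_exec($ch);
curl_close($ch);

if ($html !== false) {
    // Post-processing would go here: strip unwanted markup, apply the new
    // layout, fix up relative URLs, and so on.
    file_put_contents($cacheFile, $html);
    echo $html;
}
?>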


Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Using Curl to replicate a site

2009-12-10 Thread Robert Cummings

Joseph Thayne wrote:
If the site can be a few minutes behind, (say 15-30 minutes), then what 
I recommend is to create a caching script that will update the necessary 
files if the md5 checksum has changed at all (or a specified time period 
has passed).  Then store those files locally, and run local copies of the 
files.  Your performance will be much better than if you have to request 
the page from another server every time.  You could run this script 
every 15-30 minutes depending on your needs via a cron job.


Use URL rewriting or capture 404 errors to handle the proxy request. No 
need to download and cache the entire site if everyone is just 
requesting the homepage.
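
One way to read the 404-capture idea, sketched with an assumed ErrorDocument
handler (the file name and domain are illustrative):

<?php
// Assumes .htaccess contains:  ErrorDocument 404 /proxy404.php
// Apache then runs this script for any path that doesn't exist locally, so
// only the pages visitors actually ask for ever get fetched from the real site.

$remoteBase = 'http://www.example.com';   // the real site (illustrative)
$uri = $_SERVER['REDIRECT_URL'];          // path that triggered the 404

$ch = curl_init($remoteBase . $uri);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$html = curl_exec($ch);
curl_close($ch);

header('HTTP/1.0 200 OK');                // replace the 404 status Apache set
echo $html;
?>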


Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Using Curl to replicate a site

2009-12-10 Thread Ashley Sheridan
On Thu, 2009-12-10 at 11:25 -0500, Robert Cummings wrote:

 Joseph Thayne wrote:
  If the site can be a few minutes behind, (say 15-30 minutes), then what 
  I recommend is to create a caching script that will update the necessary 
  files if the md5 checksum has changed at all (or a specified time period 
  has passed).  Then store those files locally, and run local copies of the 
  files.  Your performance will be much better than if you have to request 
  the page from another server every time.  You could run this script 
  every 15-30 minutes depending on your needs via a cron job.
 
 Use URL rewriting or capture 404 errors to handle the proxy request. No 
 need to download and cache the entire site if everyone is just 
 requesting the homepage.
 
 Cheers,
 Rob.
 -- 
 http://www.interjinn.com
 Application and Templating Framework for PHP
 


Yeah, I was going to use the page request to trigger the caching
mechanism, as it's unlikely that all pages will be equally popular. I'll
let you all know how it goes!

Thanks,
Ash
http://www.ashleysheridan.co.uk




[PHP] using CURL or something similar to click on a link

2007-06-25 Thread Siavash Miri

Hi All,


I just recently started writing a PHP script that would get some search
results from some other sites and display them.

I believe the only PHP way to do this is using cURL. Am I correct? Or is
there a more efficient way to do this?



Anyway, I got it working fine with cURL, except for one site where I need
to click on a link before doing the search. I can't find a way to click on
a link using cURL. Is this possible?


Thanks in advance,
Siavash


Re: [PHP] using CURL or something similar to click on a link

2007-06-25 Thread Robert Cummings
On Mon, 2007-06-25 at 19:01 -0700, Siavash Miri wrote:
 Hi All,
 
 
 I just recently started writing a php script that would get some search
 results from some other sites and display them.
 
 I believe the only php way to do this is using CURL. am I correct?? or is
 there a more efficient way to do this?

cURL is one of the best ways to do it since it provides a fairly high
level interface for performing page retrieval via GET or POST. You can
also do it using sockets but that's lower level and probably less
efficient due to the way cURL encapsulates low level communication
within the C layer.

 Anyways, i got it working fine with curl, except one site where I need to
 click on a link before doing the search. I can't find a way to click on a
 link using CURL. is this possible?

You probably need cookies. Read about the following cURL options and see
if they change things for you:

CURLOPT_COOKIEJAR
CURLOPT_COOKIEFILE
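
A small sketch of what that usually looks like, assuming the link in question
just sets a session cookie (URLs and the cookie-file path are illustrative):

<?php
$cookieFile = '/tmp/curl_cookies.txt';   // must be writable by the web server

// First request: "click" the link so the site can set its cookies
$ch = curl_init('http://www.example.com/start-search');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookieFile);    // where cookies get saved
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookieFile);   // cookies sent back out
curl_exec($ch);

// Second request on the same handle: do the actual search with those cookies
curl_setopt($ch, CURLOPT_URL, 'http://www.example.com/search?q=php');
$results = curl_exec($ch);
curl_close($ch);

echo $results;
?>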

Alternatively, there may be other issues at play. Perhaps an onclick()
event in JavaScript, or something else.

Cheers,
Rob.
-- 
..
| InterJinn Application Framework - http://www.interjinn.com |
::
| An application and templating framework for PHP. Boasting  |
| a powerful, scalable system for accessing system services  |
| such as forms, properties, sessions, and caches. InterJinn |
| also provides an extremely flexible architecture for   |
| creating re-usable components quickly and easily.  |
`'

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



[PHP] using curl to peform post

2005-12-04 Thread Brent Clark

Hi all

I need to simulate the role of a web browser by submitting the following
form.

<form name="form1" action="propsearch.pl" method="post">
<textarea rows="12" cols="70" name="msg">GREENc</textarea>
<input type="submit" value="Submit">
</form>

Here's my PHP code, which I put together using the cURL functions.

<?php

$url = "www.example.com";
$page = "/cgi-bin/ecco/scripts/h2h/glo/propsearch.pl";

$post_string = "msg=GREE01";

$header  = "POST ".$page." HTTP/1.0 ";
$header .= "MIME-Version: 1.0 ";
$header .= "Content-type: application/PTI26 ";
$header .= "Content-length: ".strlen($post_string)." ";
$header .= "Content-transfer-encoding: text ";
$header .= "Request-number: 1 ";
$header .= "Document-type: Request ";
$header .= "Interface-Version: Test 1.4 ";
$header .= "Connection: close ";
$header .= $post_string;

$ch = curl_init();

curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, 4);
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, $header);

$data = curl_exec($ch);
if (curl_errno($ch)) {
    print curl_error($ch);
} else {
    curl_close($ch);
}

print $data;
?>
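
For comparison, a minimal sketch of the same POST using CURLOPT_POSTFIELDS and
letting cURL build the request itself (host, path and field value are taken
from the code above; whether the server actually needs the custom Content-type
header is an assumption):

<?php
$url = "http://www.example.com/cgi-bin/ecco/scripts/h2h/glo/propsearch.pl";
$post_string = "msg=GREE01";

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $post_string);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, 4);
// Only needed if the server insists on this content type (assumption)
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Content-type: application/PTI26"));

$data = curl_exec($ch);
if (curl_errno($ch)) {
    print curl_error($ch);
}
curl_close($ch);

print $data;
?>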

Any assistance would be gratefully appreciated.

Kind Regards
Brent Clark

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] using curl to get part of the html

2002-10-25 Thread Marek Kilimajer
You have the page in $result, so you need $result = explode("\n", $result),
and then echo $result[0] through $result[99].
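
Roughly like this, assuming curl_exec() does return something (the timeout
behaviour described below is a separate problem):

<?php
$ch = curl_init("http://www.example.com/cgi-bin/large_output.cgi");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch);
curl_close($ch);

// Split the fetched source into lines and keep only the first 100
$lines = explode("\n", $result);
echo implode("\n", array_slice($lines, 0, 100));
?>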

Patrick Hsieh wrote:

Hello list,

I am writing a PHP script to fetch an HTML page and verify its content, which 
is generated by a remote CGI program. This particular CGI program generates endless 
content to the HTTP client. Therefore, I need to figure out a way for 
cURL to fetch only part of the HTML source code (in fact, I only need the first 100 
lines of the HTML source). I tried CURLOPT_TIMEOUT, but when curl_exec() 
times out, it will not return the part of the HTML source it has already 
fetched--actually it returns nothing at all.

Is there any way to work around this?


#!/usr/bin/php4 -q
<?php

$url = "http://www.example.com/cgi-bin/large_output.cgi";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, 3);
$result = curl_exec($ch);
curl_close($ch);
echo $result;
?>
 



--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php




[PHP] using curl to get part of the html

2002-10-24 Thread Patrick Hsieh
Hello list,

I am writing a PHP script to fetch an HTML page and verify its content, which 
is generated by a remote CGI program. This particular CGI program generates endless 
content to the HTTP client. Therefore, I need to figure out a way for 
cURL to fetch only part of the HTML source code (in fact, I only need the first 100 
lines of the HTML source). I tried CURLOPT_TIMEOUT, but when curl_exec() 
times out, it will not return the part of the HTML source it has already 
fetched--actually it returns nothing at all.

Is there any way to work around this?


#!/usr/bin/php4 -q
<?php

$url = "http://www.example.com/cgi-bin/large_output.cgi";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, 3);
$result = curl_exec($ch);
curl_close($ch);
echo $result;
?>
-- 
Patrick Hsieh [EMAIL PROTECTED]
GnuPG Public Key at http://www.ezplay.tv/~pahud/pahudatezplay.pubkey
MD5 checksum: b948362c94655b74b33e859d58b8de91

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php




[PHP] using CURL to post data to a search engine

2002-09-27 Thread rdkurth

 I am trying to use cURL to post data to a search engine and then
 return the results in a variable so I can manipulate it.
 I have tried this, but it does not seem to work properly.
 This is just an example of how I am trying to do this; I don't have
 to use Google for the search engine, I just thought that if I can get it
 working with this I can get it to work with others.
  
$ch = curl_init ("http://www.google.com/");
curl_setopt ($ch, CURLOPT_POST, 1);
$word = "php";
$args = "action=/search&hl=en&ie=UTF-8&oe=UTF-8&q=$word&btnG=Google Search";
curl_setopt ($ch, CURLOPT_POSTFIELDS, $args);
curl_setopt ($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec ($ch);
curl_close ($ch);
echo $data;

There does not seem to be much info or many examples on using cURL with PHP.
Is there anyplace that you know of that has more info than the manual?
 


  

-- 
Best regards,
 rdkurth  mailto:[EMAIL PROTECTED]


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php




Re: [PHP] using CURL to post data to a search engine

2002-09-27 Thread Paul Nicholson


Hey,
I don't think you'll get PHP to work with Google unless you use the Google 
API - I heard that Google has filtered out the PHP (and other) user agents.
If you need more info on the Google API, just let me know.
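
If the block really is on the default user-agent string, cURL can send a
different one; a small sketch (the URL and UA string are arbitrary, and
scraping may still be against a site's terms of service):

<?php
$ch = curl_init("http://www.example.com/search?q=php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// Send an explicit user-agent string instead of the default (arbitrary value)
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (compatible; MyScript/1.0)");
$data = curl_exec($ch);
curl_close($ch);
echo $data;
?>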
~Pauly

On Friday 27 September 2002 07:25 pm, [EMAIL PROTECTED] wrote:
  I am trying to use curl to post data to a search engine and then
  return the results in a variable so I can manipulate it.
   I have tried this, but it does not seem to work properly.
   This is just an example of how I am trying to do this I don't have
   to use google for the search engine just thought if I can get it
   working with this I can get it to work with others

 $ch = curl_init ("http://www.google.com/");
 curl_setopt ($ch, CURLOPT_POST, 1);
 $word = "php";
 $args = "action=/search&hl=en&ie=UTF-8&oe=UTF-8&q=$word&btnG=Google Search";
 curl_setopt ($ch, CURLOPT_POSTFIELDS, $args);
 curl_setopt ($ch, CURLOPT_RETURNTRANSFER, 1);
 $data = curl_exec ($ch);
 curl_close ($ch);
 echo $data;

 There does not seem to be much info or many examples on using cURL with PHP.
 Is there anyplace that you know of that has more info than the manual?

-- 
~Paul Nicholson
Design Specialist @ WebPower Design
The web... the way you want it!
[EMAIL PROTECTED]

It said "uses Windows 98 or better", so I loaded Linux!
Registered Linux User #183202 using Register Linux System # 81891

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php




Re: [PHP] using CURL to post data to a search engine

2002-09-27 Thread Chris Shiflett

Google uses GET anyway, not POST.

A simple fopen() is all you need, but then you'll be parsing HTML. The 
API is better for reliability.
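
A minimal sketch of the fopen() route for a GET query, assuming allow_url_fopen
is enabled (the URL is illustrative, and as noted the result is raw HTML you
would still have to parse):

<?php
$url = "http://www.example.com/search?q=" . urlencode("php curl example");

$fp = fopen($url, "r");        // plain GET via the HTTP stream wrapper
if ($fp) {
    $html = "";
    while (!feof($fp)) {
        $html .= fgets($fp, 4096);
    }
    fclose($fp);
    echo $html;                // raw HTML, still needs parsing
}
?>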

Paul Nicholson wrote:

I don't think you'll get php to work with google unless you use the google 
api - I heard that google has filtered out the php(and others) user-agent.
If you need more info on the google api just let me know.

On Friday 27 September 2002 07:25 pm, [EMAIL PROTECTED] wrote:
  

 I am trying to use curl to post data to a search engine and then
 return the results in a variable so I can manipulate it.
  I have tried this, but it does not seem to work properly.
  This is just an example of how I am trying to do this I don't have
  to use google for the search engine just thought if I can get it
  working with this I can get it to work with others

$ch = curl_init ("http://www.google.com/");
curl_setopt ($ch, CURLOPT_POST, 1);
$word = "php";
$args = "action=/search&hl=en&ie=UTF-8&oe=UTF-8&q=$word&btnG=Google Search";
curl_setopt ($ch, CURLOPT_POSTFIELDS, $args);
curl_setopt ($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec ($ch);
curl_close ($ch);
echo $data;



-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php




RE: [PHP] Using cURL

2002-08-30 Thread Jonathan Rosenberg

Can you telnet/SSH to the machine and install cURL in your home directory?
You shouldn't have to be root to do this.

Or, are you trying to get CURL loaded into PHP?

--
JR

 -Original Message-
 From: Henry [mailto:[EMAIL PROTECTED]]
 Sent: Friday, August 30, 2002 2:21 AM
 To: [EMAIL PROTECTED]
 Subject: [PHP] Using cURL


 Dear All,

 I'm using a shared server hosted by an ISP. I cannot get PHP recompiled
 with --with-curl. I've read the information about cURL but it
 appears that I
 need to be root in order to install it? I cannot do this. Does this mean
 that I cannot use cURL or CURL at all? If it doesn't what do I
 need to do so
 I can start using it?

 Henry



 --
 PHP General Mailing List (http://www.php.net/)
 To unsubscribe, visit: http://www.php.net/unsub.php



-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php




[PHP] Using cURL

2002-08-29 Thread Henry

Dear All,

I'm using a shared server hosted by an ISP. I cannot get PHP recompiled
with --with-curl. I've read the information about cURL, but it appears that I
need to be root in order to install it, and I cannot do this. Does this mean
that I cannot use cURL at all? If it doesn't, what do I need to do so
I can start using it?

Henry



-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php




[PHP] Using cURL on windows?

2001-08-17 Thread Brandon Orther

Hello,
 
How would I go about installing cURL for PHP on a Windows box?
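
For the Windows builds of that era this usually meant enabling the bundled
extension rather than compiling anything; a quick check from PHP (the php.ini
line is shown as a comment, and the exact DLL names can differ between
versions):

<?php
// Typically: uncomment  extension=php_curl.dll  in php.ini (and make sure the
// OpenSSL DLLs that ship with PHP are on the PATH), then restart the web server.
if (extension_loaded('curl')) {
    echo "cURL extension is loaded:\n";
    print_r(curl_version());
} else {
    echo "cURL extension is not loaded; enable it in php.ini.\n";
}
?>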