php-general Digest 11 Dec 2009 05:29:52 -0000 Issue 6481

Topics (messages 300359 through 300380):

Re: I have not seen any messages for a couple of days...
        300359 by: Robert Cummings
        300361 by: Ashley Sheridan
        300362 by: reda khyatti
        300363 by: Tom Calpin
        300369 by: tedd
        300372 by: Robert Cummings
        300374 by: Ashley Sheridan
        300376 by: Jochem Maas
        300377 by: Paul M Foster

Re: file_get_contents ($file) works -- file_get_contents ($url)  returns false
        300360 by: René Fournier

Re: PHP 5.3 Code Documentor
        300364 by: Lester Caine

Using Curl to replicate a site
        300365 by: Ashley Sheridan
        300366 by: Robert Cummings
        300367 by: Ashley Sheridan
        300368 by: Joseph Thayne
        300370 by: Robert Cummings
        300371 by: Robert Cummings
        300373 by: Ashley Sheridan

Re: mysterious include problem
        300375 by: Jochem Maas
        300378 by: Kim Emax
        300379 by: Kim Madsen

Upload dir
        300380 by: kranthi

Administrivia:

To subscribe to the digest, e-mail:
        php-general-digest-subscr...@lists.php.net

To unsubscribe from the digest, e-mail:
        php-general-digest-unsubscr...@lists.php.net

To post to the list, e-mail:
        php-gene...@lists.php.net


----------------------------------------------------------------------
--- Begin Message ---
No, it's been broken for days. You won't get any emails for at least another week.




Jay Blanchard wrote:
...is this thing on?


--
http://www.interjinn.com
Application and Templating Framework for PHP

--- End Message ---
--- Begin Message ---
On Thu, 2009-12-10 at 10:29 -0500, Robert Cummings wrote:

> No, it's been broken for days. You won't get any emails for at least 
> another week.
> 
> 
> 
> 
> Jay Blanchard wrote:
> > ...is this thing on?
> > 
> 
> -- 
> http://www.interjinn.com
> Application and Templating Framework for PHP
> 


Erm? I'm confused now...

Thanks,
Ash
http://www.ashleysheridan.co.uk



--- End Message ---
--- Begin Message ---
yes it is

On Thu, Dec 10, 2009 at 11:27 AM, Jay Blanchard <jblanch...@pocket.com> wrote:

> ...is this thing on?
>
> --
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php
>
>


-- 
khyatti Reda
Tel: +21255880898

--- End Message ---
--- Begin Message ---
Yep, although Christmas shopping seems to be higher on the agenda than PHP

Wait till a week before Christmas and we'll have a slew of messages from
desperate developers trying to meet a client deadline :)


-----Original Message-----
From: Jay Blanchard [mailto:jblanch...@pocket.com] 
Subject: I have not seen any messages for a couple of days...

...is this thing on?


--- End Message ---
--- Begin Message ---
At 10:29 AM -0500 12/10/09, Robert Cummings wrote:
No, it's been broken for days. You won't get any emails for at least another week.

What's been broken?

I've been receiving [PHP] posts every day.

Cheers,

tedd

--
-------
http://sperling.com  http://ancientstones.com  http://earthstones.com

--- End Message ---
--- Begin Message ---
tedd wrote:
At 10:29 AM -0500 12/10/09, Robert Cummings wrote:
No, it's been broken for days. You won't get any emails for at least another week.

What's been broken?

I've been receiving [PHP] posts every day.

Nothing is broken... it was a joke. How could it be broken if you receive my response... ;)

Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP

--- End Message ---
--- Begin Message ---
On Thu, 2009-12-10 at 11:26 -0500, Robert Cummings wrote:

> tedd wrote:
> > At 10:29 AM -0500 12/10/09, Robert Cummings wrote:
> >> No, it's been broken for days. You won't get any emails for at least 
> >> another week.
> > 
> > What's been broken?
> > 
> > I've been receiving [PHP] posts every day.
> 
> Nothing is broken... it was a joke. How could it be broken if you 
> receive my response... ;)
> 
> Cheers,
> Rob.
> -- 
> http://www.interjinn.com
> Application and Templating Framework for PHP
> 


Telepathic email servers? I've heard they're becoming more popular these
days.

Thanks,
Ash
http://www.ashleysheridan.co.uk



--- End Message ---
--- Begin Message ---
Ashley Sheridan schreef:
> On Thu, 2009-12-10 at 11:26 -0500, Robert Cummings wrote:
> 
>> tedd wrote:
>>> At 10:29 AM -0500 12/10/09, Robert Cummings wrote:
>>>> No, it's been broken for days. You won't get any emails for at least 
>>>> another week.
>>> What's been broken?
>>>
>>> I've been receiving [PHP] posts every day.
>> Nothing is broken... it was a joke. How could it be broken if you 
>> receive my response... ;)
>>
>> Cheers,
>> Rob.
>> -- 
>> http://www.interjinn.com
>> Application and Templating Framework for PHP
>>
> 
> 
> Telepathic email servers? I've heard they're becoming more popular these
> days.

no, you're confused with that other list, php-psychics. :-) (STA)

> 
> Thanks,
> Ash
> http://www.ashleysheridan.co.uk
> 
> 
> 


--- End Message ---
--- Begin Message ---
On Thu, Dec 10, 2009 at 08:01:51PM +0100, Jochem Maas wrote:

> Ashley Sheridan schreef:
> > On Thu, 2009-12-10 at 11:26 -0500, Robert Cummings wrote:
> >

<snip>

> >
> > Telepathic email servers? I've heard they're becoming more popular these
> > days.
> 
> no, you're confused with that other list, php-psychics. :-) (STA)

I thought that list was swallowed by a .Net black hole? ;-}

Paul

-- 
Paul M. Foster

--- End Message ---
--- Begin Message ---
I thought error_reporting would display them, but I guess php.ini had them 
suppressed. Anyway, with:

<?php

error_reporting(-1);
ini_set('display_errors', 1);
set_time_limit(0);
var_dump (file_get_contents ('http://www.google.com'));

?>

I get:

Warning: file_get_contents(http://www.google.com): failed to open stream: 
Operation now in progress in /____/____.php on line 7 bool(false)

Does that help with the diagnosis?
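As a diagnostic aid, a small wrapper (a sketch only; the function name and the 10-second timeout are illustrative choices, not from this thread) can surface the underlying stream error instead of a bare bool(false):

```php
<?php
// Sketch: wrap file_get_contents() so a failure reports the underlying
// stream error. error_get_last() still sees the warning even when it is
// suppressed with @. The 10-second timeout is an arbitrary choice.
function fetch($url)
{
    $ctx = stream_context_create(array(
        'http' => array('timeout' => 10), // fail fast instead of hanging
    ));
    $body = @file_get_contents($url, false, $ctx);
    if ($body === false) {
        $err = error_get_last();
        throw new RuntimeException(
            'Fetch failed: ' . ($err ? $err['message'] : 'unknown error')
        );
    }
    return $body;
}
```

The thrown message carries the same detail as the warning above ("failed to open stream: ..."), which narrows the problem to the network layer rather than php.ini.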


On 2009-12-10, at 12:28 AM, Richard Quadling wrote:

> 2009/12/9 René Fournier <m...@renefournier.com>:
>> It is, and I use curl elsewhere in the same script to fetch remote content.
>> This exact same function works fine on my MacBook Pro (10.6 client, PHP 
>> 5.3), and *was* previously working fine under Server 10.4.11 and PHP 5.3,
>> 
>> On 2009-12-09, at 11:10 PM, laruence wrote:
>> 
>>> try
>>> wget http://www.google.com in your command line to see whether the network 
>>> is reachable
>>> 
>>> LinuxManMikeC wrote:
>>>> 
>>>> On Wed, Dec 9, 2009 at 8:02 AM, LinuxManMikeC <linuxmanmi...@gmail.com> 
>>>> wrote:
>>>> 
>>>>> On Wed, Dec 9, 2009 at 6:45 AM, René Fournier <m...@renefournier.com> 
>>>>> wrote:
>>>>> 
>>>>>> Strange problem I'm having on Mac OS X Server 10.6 running PHP 5.3. Any 
>>>>>> call of file_get_contents() on a local file works fine -- the file is 
>>>>>> read and returned. But any call of file_get_contents on a url -- any 
>>>>>> url, local or remote -- always returns false.
>>>>>> 
>>>>>> var_dump (file_get_contents ('http://www.google.com/'));
>>>>>> 
>>>>>> bool(false)
>>>>>> 
>>>>>> I've checked php.ini, and the obvious seems okay:
>>>>>> 
>>>>>>        allow_url_fopen => On => On
>>>>>> 
>>>>>> Any ideas?
>>>>>> 
>>>>>> ...Rene
>>>>>> 
>>>>> http://us2.php.net/manual/en/filesystem.configuration.php#ini.allow-url-fopen
>>>>> 
>>>>> 
>>>> 
>>>> "I've checked php.ini"
>>>> Right, must remember not to reply to stuff till I'm awake. :-D
>>>> 
>>>> --
>>>> PHP General Mailing List (http://www.php.net/)
>>>> To unsubscribe, visit: http://www.php.net/unsub.php
>>>> 
>>>> 
>>> 
>>> --
>>> Hui Xinchen (xinchen.hui) | Business Search Department | (+8610)82602112-7974
>>> | :laruence
>> 
>> 
> 
> Do you have ANY errors/warning/notices?
> 
> 
> 
> -- 
> -----
> Richard Quadling
> "Standing on the shoulders of some very clever giants!"
> EE : http://www.experts-exchange.com/M_248814.html
> Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498&r=213474731
> ZOPA : http://uk.zopa.com/member/RQuadling


--- End Message ---
--- Begin Message ---
Andrew Mason wrote:
Hi all,
Is anyone aware of a code documentation generator like phpdoc or
doxygen that supports the PHP 5.3 namespaces ?

I tried adding support to doxygen myself but didn't have a whole lot
of luck and didn't have huge amounts of time to spend on learning
flex/yacc.

Well, it is on the todo list for phpdoc, but I don't think anybody has time to implement it :( That, and the fact that many of us are still on 5.2 for various reasons and can't test it ...

--
Lester Caine - G8HFL
-----------------------------
Contact - http://lsces.co.uk/wiki/?page=contact
L.S.Caine Electronic Services - http://lsces.co.uk
EnquirySolve - http://enquirysolve.com/
Model Engineers Digital Workshop - http://medw.co.uk//
Firebird - http://www.firebirdsql.org/index.php

--- End Message ---
--- Begin Message ---
Hi,

I need to replicate a site on another domain, and in this case, an
iframe won't really do, as I need to remove some of the graphics, etc
around the content. The owner of the site I'm needing to copy has asked
for the site to be duplicated, and unfortunately in this case, because
of the CMS he's used (which is owned by the hosting he uses) I need a
way to have the site replicated on an already existing domain as a
microsite, but in a way that it is always up-to-date.

I'm fine using Curl to grab the site, and even alter the content that is
returned, but I was thinking about a caching mechanism. Has anyone any
suggestions on this?

Thanks,
Ash
http://www.ashleysheridan.co.uk



--- End Message ---
--- Begin Message ---
Ashley Sheridan wrote:
Hi,

I need to replicate a site on another domain, and in this case, an
iframe won't really do, as I need to remove some of the graphics, etc
around the content. The owner of the site I'm needing to copy has asked
for the site to be duplicated, and unfortunately in this case, because
of the CMS he's used (which is owned by the hosting he uses) I need a
way to have the site replicated on an already existing domain as a
microsite, but in a way that it is always up-to-date.

I'm fine using Curl to grab the site, and even alter the content that is
returned, but I was thinking about a caching mechanism. Has anyone any
suggestions on this?

Sounds like you're creating a proxy with post processing/caching on the forwarded content. It should be fairly straightforward to direct page requests to your proxy app, then make the remote request, and post-process, cache, then send to the browser. The only gotcha will be for forms if you do caching.

Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP

--- End Message ---
--- Begin Message ---
On Thu, 2009-12-10 at 11:10 -0500, Robert Cummings wrote:

> Ashley Sheridan wrote:
> > Hi,
> > 
> > I need to replicate a site on another domain, and in this case, an
> > iframe won't really do, as I need to remove some of the graphics, etc
> > around the content. The owner of the site I'm needing to copy has asked
> > for the site to be duplicated, and unfortunately in this case, because
> > of the CMS he's used (which is owned by the hosting he uses) I need a
> > way to have the site replicated on an already existing domain as a
> > microsite, but in a way that it is always up-to-date.
> > 
> > I'm fine using Curl to grab the site, and even alter the content that is
> > returned, but I was thinking about a caching mechanism. Has anyone any
> > suggestions on this?
> 
> Sounds like you're creating a proxy with post processing/caching on the 
> forwarded content. It should be fairly straightforward to direct page 
> requests to your proxy app, then make the remote request, and 
> post-process, cache, then send to the browser. The only gotcha will be 
> for forms if you do caching.
> 
> Cheers,
> Rob.
> -- 
> http://www.interjinn.com
> Application and Templating Framework for PHP
> 


The only forms are processed on another site, so there's nothing I can
really do about that, as they return to the original site.

How would I go about doing what you suggested though? I'd assumed to use
Curl, but your email suggests not to?

Thanks,
Ash
http://www.ashleysheridan.co.uk



--- End Message ---
--- Begin Message ---
If the site can be a few minutes behind (say 15-30 minutes), then what I recommend is to create a caching script that will update the necessary files if the md5 checksum has changed at all (or a specified time period has passed). Then store those files locally, and run local copies of the files. Your performance will be much better than if you have to request the page from another server every time. You could run this script every 15-30 minutes depending on your needs via a cron job.

Joseph
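The checksum-based refresh described above could be sketched like this (a minimal sketch; refresh_cache(), $sourceUrl and $cacheFile are illustrative names, and the 30-minute default matches the suggested cron interval):

```php
<?php
// Sketch of the checksum idea: re-fetch a page only when the cached copy
// is older than $maxAge, and rewrite it only when its md5 differs.
// refresh_cache(), $sourceUrl and $cacheFile are illustrative names.
function refresh_cache($sourceUrl, $cacheFile, $maxAge = 1800)
{
    if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
        return false; // cache still considered current
    }
    $remote = file_get_contents($sourceUrl);
    if ($remote === false) {
        return false; // keep serving the stale copy rather than nothing
    }
    if (file_exists($cacheFile) && md5_file($cacheFile) === md5($remote)) {
        touch($cacheFile); // content unchanged; just reset its age
        return false;
    }
    file_put_contents($cacheFile, $remote);
    return true; // cache was updated
}
```

A cron job would then call this once per tracked page every 15-30 minutes.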

Ashley Sheridan wrote:
Hi,

I need to replicate a site on another domain, and in this case, an
iframe won't really do, as I need to remove some of the graphics, etc
around the content. The owner of the site I'm needing to copy has asked
for the site to be duplicated, and unfortunately in this case, because
of the CMS he's used (which is owned by the hosting he uses) I need a
way to have the site replicated on an already existing domain as a
microsite, but in a way that it is always up-to-date.

I'm fine using Curl to grab the site, and even alter the content that is
returned, but I was thinking about a caching mechanism. Has anyone any
suggestions on this?

Thanks,
Ash
http://www.ashleysheridan.co.uk




--- End Message ---
--- Begin Message ---
Ashley Sheridan wrote:
On Thu, 2009-12-10 at 11:10 -0500, Robert Cummings wrote:
Ashley Sheridan wrote:
> Hi,
> > I need to replicate a site on another domain, and in this case, an
> iframe won't really do, as I need to remove some of the graphics, etc
> around the content. The owner of the site I'm needing to copy has asked
> for the site to be duplicated, and unfortunately in this case, because
> of the CMS he's used (which is owned by the hosting he uses) I need a
> way to have the site replicated on an already existing domain as a
> microsite, but in a way that it is always up-to-date.
> > I'm fine using Curl to grab the site, and even alter the content that is
> returned, but I was thinking about a caching mechanism. Has anyone any
> suggestions on this?

Sounds like you're creating a proxy with post processing/caching on the forwarded content. It should be fairly straightforward to direct page requests to your proxy app, then make the remote request, and post-process, cache, then send to the browser. The only gotcha will be for forms if you do caching.

Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP


The only forms are processed on another site, so there's nothing I can really do about that, as they return to the original site.

How would I go about doing what you suggested though? I'd assumed to use Curl, but your email suggests not to?

Nope, wasn't suggesting not to. You can use many techniques, but cURL is probably the most robust. The best way to facilitate this, IMHO, is to have a rewrite rule that directs all traffic for the proxy site to your application. Then rewrite the REQUEST_URI to point to the page on the real domain. Then check your cache for the content and if empty use cURL to retrieve the content, apply your post-processing (to strip out what you don't want and apply a new page layout or whatever), then cache (if not already cached) the content (this can be a simple database table with the request URI and a timestamp), then output the content.
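The rewrite-then-fetch-then-cache flow above might look roughly like this (a sketch only: the RewriteRule, $remoteBase, the 15-minute window, cache_path() and strip_site_chrome() are all illustrative assumptions, not anything from the thread):

```php
<?php
// Rough sketch of the proxy flow. Assumes a rewrite rule such as:
//   RewriteRule ^(.*)$ proxy.php?path=$1 [QSA,L]

// Illustrative helper: one cache file per request path.
function cache_path($path)
{
    return sys_get_temp_dir() . '/proxy-' . md5($path) . '.html';
}

// Placeholder for whatever graphics/markup stripping the microsite needs.
function strip_site_chrome($html)
{
    return $html;
}

function proxy_request($remoteBase, $path, $maxAge = 900)
{
    $cache = cache_path($path);
    if (file_exists($cache) && time() - filemtime($cache) < $maxAge) {
        return file_get_contents($cache); // cached, already post-processed
    }
    $ch = curl_init($remoteBase . '/' . ltrim($path, '/'));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $html = curl_exec($ch);
    curl_close($ch);
    if ($html === false) {
        return false; // fetch failed; caller decides what to serve
    }
    $html = strip_site_chrome($html);
    file_put_contents($cache, $html);
    return $html;
}

// In proxy.php:
// echo proxy_request('http://example.com',
//                    isset($_GET['path']) ? $_GET['path'] : '');
```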

Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP

--- End Message ---
--- Begin Message ---
Joseph Thayne wrote:
If the site can be a few minutes behind (say 15-30 minutes), then what I recommend is to create a caching script that will update the necessary files if the md5 checksum has changed at all (or a specified time period has passed). Then store those files locally, and run local copies of the files. Your performance will be much better than if you have to request the page from another server every time. You could run this script every 15-30 minutes depending on your needs via a cron job.

Use URL rewriting or capture 404 errors to handle the proxy request. No need to download and cache the entire site if everyone is just requesting the homepage.

Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP

--- End Message ---
--- Begin Message ---
On Thu, 2009-12-10 at 11:25 -0500, Robert Cummings wrote:

> Joseph Thayne wrote:
> > If the site can be a few minutes behind, (say 15-30 minutes), then what 
> > I recommend is to create a caching script that will update the necessary 
> > files if the md5 checksum has changed at all (or a specified time period 
> > has passed).  Then store those files locally, and run local copies of the 
> > files.  Your performance will be much better than if you have to request 
> > the page from another server every time.  You could run this script 
> > every 15-30 minutes depending on your needs via a cron job.
> 
> Use URL rewriting or capture 404 errors to handle the proxy request. No 
> need to download and cache the entire site if everyone is just 
> requesting the homepage.
> 
> Cheers,
> Rob.
> -- 
> http://www.interjinn.com
> Application and Templating Framework for PHP
> 


Yeah, I was going to use the page request to trigger the caching
mechanism, as it's unlikely that all pages will be equally popular. I'll
let you all know how it goes!

Thanks,
Ash
http://www.ashleysheridan.co.uk



--- End Message ---
--- Begin Message ---
Ashley Sheridan schreef:
> On Tue, 2009-12-08 at 17:32 +0100, Jochem Maas wrote:
> 
>> Hi Allen,
>>
>> gonna be a bit ruthless with you :).
>>
>> 1. you're not filtering your input (you're open to include being hacked)
>> 2. you're not validating or error-checking (e.g. does the include file exist?)
>> 3. keeping large numbers of content pages with numerical filenames is a 
>> maintenance
>> nightmare and incidentally not very SEO friendly
>> 4. you're not doing much debugging (I guess) - try using var_dump(), echo, 
>> print_r(),
>> etc all over your code to figure out what it's doing (e.g. var_dump($_GET, 
>> $_POST) and
>> print("HELLO - I THINK \$_GET['page'] is set."))
>>
>> personally I never rely on relative paths - I always have the app determine a
>> full path to the application root (either at install/update or at the 
>> beginning
>> of a request)
>>
>> also I would suggest you use 1 include file for all your scripts (rather than
>> per dir) ... copy/paste code sucks (read up on the DRY principle).
>>
>> additionally look into FrontController patterns and the possibility to
>> stuff all that content into a database which gives all sorts of opportunities
>> for management/editing.
>>
>> <?php
>>
>> $page        = isset($_GET['page']) && strlen($_GET['page'])
>>      ? basename($_GET['page'])
>>      : null
>>      ;
>>
>> if (!$page || !preg_match('#^[a-z0-9]+$#i', $page))
>>      $page = 'default';
>>
>> $file = dirname(__FILE__) . '/content/' . $page . '.inc';
>>
>> if (!file_exists($file) || !is_readable($file)) {
>>      error_log('Hack attempt? page = '.$page.', file = '.$file);
>>      header('Status: 404');
>>      exit;
>> }
>>
>> // echo header
>> include $file;
>> // echo footer
>>
>> ?>
>>
>> maybe I've bombarded you with unfamiliar concepts, functions and/or syntax.
>> if so please take time to look it all up ... and then come back with 
>> questions :)
>>
>> have fun.
>>
>> Allen McCabe schreef:
>>> I have been using includes for my content for a while now with no problems.
>>> Suddenly it has stopped working, and it may or may not be from some changes
>>> I made in my code structure.
>>>
>>> I use default.php for most or all of my pages within a given directory,
>>> changing the content via page numbers in the query string.
>>>
>>>
>>> So on default.php, I have the following code:
>>>
>>>
>>> <?php
>>> if(isset($_GET['page']))
>>> {
>>>   $thispage = $_GET['page'];
>>>   $content = 'content/'.$_GET['page'].'.inc';
>>> }
>>> else
>>> {
>>>   $thispage = "default";
>>>   $content = 'content/default.inc';
>>> }
>>> ?>
>>> <html>, <body>, <div> etc.
>>> <?php include($content); ?>
>>>
>>>
>>> I have a content subdirectory where I store all the pages with files such as
>>> "default.inc, 101.inc, 102.inc, etc.
>>>
>>> As I said, this has been working fine up until now, if I use the url
>>> "user/default.php" or just "user/" I get this error:
>>>
>>>
>>> *Warning*: include(content/.inc) [function.include]:
>>> failed to open stream: No such file or directory in *
>>> /home/a9066165/public_html/user/default.php* on line *89*
>>>
>>> AND
>>>
>>> *Warning*: include() [function.include]:
>>> Failed opening 'content/.inc' for inclusion
>>> (include_path='.:/usr/lib/php:/usr/local/lib/php') in *
>>> /home/a9066165/public_html/user/default.php* on line *89*
>>>
>>> But if I use "user/default.php?page=default"  I get the correct content.
>>>
>>> It's acting as if page is set, but set to NULL, and then trying to find an
>>> include at path "content/.inc"  what's going on??
>>>
>>
> 
> 
> The SEO factor here is only minor. Very little weight is given to the
> filename of a page, much more is given to the content and the way it is
> marked up.

'friendly' - i.e. human-readable URLs are ++

with regard to SEO, I only know it has impact on real estate sites.

> Thanks,
> Ash
> http://www.ashleysheridan.co.uk
> 
> 
> 


--- End Message ---
--- Begin Message ---
LinuxManMikeC wrote on 2009-12-07 22:48:
Instead of hard coding cases you can validate and constrain the input
with a regex.  Much more flexible when adding content.  I would also
add code to make sure the file exists, otherwise fall through to the
default.

For huge sites with a lot of include files I agree; for small sites, this solution gives me an overview of the setup.

In this case I have an idea that the RegEx solution could be another problem for Allen, but it's just an idea :-)

--
Take Care
Kim Emax - master|minds - We think IT for you...
Consulting, programming, design & hosting of websites.
http://www.masterminds.dk - http://www.emax.dk
Buy your wine online at http://www.gmvin.dk

--- End Message ---
--- Begin Message ---
LinuxManMikeC wrote on 2009-12-07 22:48:
> Instead of hard coding cases you can validate and constrain the input
> with a regex.  Much more flexible when adding content.  I would also
> add code to make sure the file exists, otherwise fall through to the
> default.

For huge sites with a lot of include files I agree; for small sites, this solution gives me an overview of the setup.

In this case I have an idea that the RegEx solution could be another problem for Allen, but it's just an idea :-)

--
Take Care
Kim Emax - master|minds - We think IT for you...
Consulting, programming, design & hosting of websites.
http://www.masterminds.dk - http://www.emax.dk
Buy your wine online at http://www.gmvin.dk


--- End Message ---
--- Begin Message ---
How can I change the temporary upload directory?
var_dump(ini_get('upload_tmp_dir'));      gives me (and that is set in php.ini)
string '/var/www/cgi-bin' (length=16)

but
var_dump($_FILES)                                                     gives me
'tmp_name' => string '/tmp/phpbSZ6WP' (length=14)

var_dump(file_exists($_FILES['file']['tmp_name']));  gives me  (/tmp
has permissions drwxrwxrwt and I never used move_uploaded_file or any
similar functions)
boolean false

am I missing something here?
Kranthi.
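Two things worth checking here: upload_tmp_dir is a PHP_INI_SYSTEM setting, so ini_set() cannot change it and it must be set in php.ini or the server's per-directory config; and if the configured directory is not writable by the PHP process, PHP silently falls back to the system temp dir, which would explain tmp_name pointing at /tmp. Also, the temp file only exists for the duration of the request that received the upload, and open_basedir restrictions can make file_exists() report false. A minimal handler sketch ('userfile' and the uploads/ target directory are illustrative names):

```php
<?php
// Sketch of an upload handler: the file under upload_tmp_dir exists only
// for the duration of the request that received it, so it must be moved
// with move_uploaded_file() before the script ends. 'userfile' and the
// uploads/ target directory are illustrative names.
if (isset($_FILES['userfile'])
    && is_uploaded_file($_FILES['userfile']['tmp_name'])) {
    $target = __DIR__ . '/uploads/' . basename($_FILES['userfile']['name']);
    if (move_uploaded_file($_FILES['userfile']['tmp_name'], $target)) {
        echo "Stored upload at $target\n";
    }
}
```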

--- End Message ---
