Re: [PHP] simplest way in php to get our content on another site / included javascript question

2006-12-18 Thread Richard Lynch
On Sun, December 17, 2006 8:44 pm, jonathan wrote:
 I'm working on a project where we'd want partner sites to get our
 content on the other web sites.

Might I suggest that you support several options?...

 A key priority is that they won't have much technical sophistication
 (probably no db experience or php experience).

Offering more formats gives the partners more options to choose what
suits their experience/needs/skills.

 Also, these would be third party sites so there would also be an
 issue of requirements (different XSL processors,
 no allow_url_fopen, etc...)

Without allow_url_fopen, you're down to curl or no PHP at all, with
an iFrame or JS...
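
For the curl route, the partner-side fetch is only a few lines.  A
minimal sketch (the feed URL and key parameter are made-up
placeholders):

<?php
// Fetch the remote content with curl; this works even when
// allow_url_fopen is switched off in php.ini.
$ch = curl_init('http://www.example.com/feed.php?partner_key=abc123');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // hand back the body as a string
curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't let a slow feed hang their page
$content = curl_exec($ch);
curl_close($ch);
if ($content !== false) {
    echo $content;
}
?>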

 I tried looking at sites that get their content on other sites to see
 how they do it:

 the possible solutions seem to be:
 1. rss / xml - this is well known but would pose a pretty high
 technical hurdle on the other sites. We could create
 every walk-through and code sample but I still think it would be too
 involved. The only way out is either an XSL transformation (not
 going to happen), a custom parsing library in PHP (maybe), or writing to
 a db with associated libraries (too complex)

You should probably provide your content as RSS, no matter what else
you do, because it's so ubiquitous.

Assuming your content is in a db, dumping out an RSS feed is pretty
much a no-brainer, really.
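
A rough sketch of that dump, assuming a hypothetical 'articles' table
(swap in your real connection details and column names):

<?php
// Emit the newest articles as a minimal RSS 2.0 feed.
header('Content-Type: application/rss+xml');
mysql_connect('localhost', 'dbuser', 'dbpass');
mysql_select_db('content');
$result = mysql_query(
    "SELECT title, url, summary FROM articles ORDER BY published DESC LIMIT 20");

echo '<?xml version="1.0"?>' . "\n";
echo '<rss version="2.0"><channel>' . "\n";
echo '<title>Your Content</title><link>http://www.example.com/</link>';
echo '<description>Latest articles</description>' . "\n";
while ($row = mysql_fetch_assoc($result)) {
    printf("<item><title>%s</title><link>%s</link><description>%s</description></item>\n",
        htmlspecialchars($row['title']),
        htmlspecialchars($row['url']),
        htmlspecialchars($row['summary']));
}
echo '</channel></rss>';
?>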

 2. serialized php with custom library - this seems more feasible than
 #1 in terms of requirements

I don't even know what this means...
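
If it means shipping serialize()'d arrays over HTTP for the partner to
unserialize(), that's easy enough on both ends, but it only helps
partners who are running PHP anyway.  A guess at what that would look
like:

<?php
// your side: dump the rows as a serialized PHP array
$rows = array(
    array('title' => 'Example story', 'url' => 'http://www.example.com/story/1'),
);
echo serialize($rows);
?>

<?php
// partner side: $raw is whatever they fetched (curl, file_get_contents, ...)
$data = unserialize($raw);
foreach ($data as $item) {
    echo '<a href="' . htmlspecialchars($item['url']) . '">'
       . htmlspecialchars($item['title']) . "</a><br />\n";
}
?>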

 3. php proxy and ajax call / json - I think this would be too complex
 and would require too much manipulation of apache to handle
 subdomains, etc.

I can sort of see where you are going, where they'd use Ajax to input
the parameters of the content they want, but, really, just a
well-designed GET and an iFrame or Ajax or whatever the partner wants
to use to snarf and display the content is probably easier.
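
For the truly non-technical partner, that can be as simple as one line
of HTML pasted into their page (the URL and parameters here are
invented):

<iframe src="http://www.example.com/headlines.php?partner=acme&amp;count=10"
        width="300" height="400" frameborder="0"></iframe>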

 4. included javascript - kinda how digg handles syndication (see
 http://www.digg.com/add-digg for example).
 This would seem to be the lowest barrier to entry. The main concern
 is preventing unauthorized bots from crawling. It would seem
 our only option would be using the referer property but this could be
 forged relatively easily.

It's very common to require a username/password in the URL for content
partners.

And you can require them to pre-register the IP of their server[s]
that will be crawling for the content -- only their servers need to
access your content, not all their visitors' machines.  So those are
static IPs, and they should not have TOO many of them to register, even
in a multi-data-center setup.

I worked for a content aggregator for a while, and virtually all their
content partners required a username/password in the URL, and we had
to register the IPs of the servers that were getting the data.  I
suppose you could do a reverse DNS lookup and cache the results, but,
really, it's easier to get the IP address up front.
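
On your end that check is a handful of lines at the top of whatever
script serves the content (the partners table and its columns are
invented for the sketch):

<?php
// Reject requests that don't carry a known partner key from a registered IP.
// Assumes a mysql_connect() has already been made earlier in the script.
$key = isset($_GET['key']) ? $_GET['key'] : '';
$ip  = $_SERVER['REMOTE_ADDR'];

$result = mysql_query(
    "SELECT 1 FROM partners
      WHERE api_key = '" . mysql_real_escape_string($key) . "'
        AND ip_address = '" . mysql_real_escape_string($ip) . "'");

if (!$result || mysql_num_rows($result) == 0) {
    header('HTTP/1.0 403 Forbidden');
    exit('Not authorized');
}
// ...emit the feed as usual below this point...
?>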

 5. SOAP - no way

Again, if you have the db already, and if you have PHP5, this is
actually not difficult at all to implement on your side.

So maybe it only helps you get one more content partner (sale) that
loves using SOAP instead of RSS or whatever.

It's a pretty small investment on your side, for relatively good odds
on payoff.

It's a bit tougher in PHP4, though there are libraries out there that
can take care of 99% of it.
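
With PHP5's bundled ext/soap the server side really is only a few
lines (non-WSDL mode shown; the class, method, and URI are
placeholders):

<?php
// Expose getArticles() as a SOAP service using PHP5's built-in SoapServer.
class ContentService
{
    public function getArticles($limit)
    {
        // in real life, pull this from the db; hard-coded for the sketch
        return array(
            array('title' => 'Example story', 'url' => 'http://www.example.com/story/1'),
        );
    }
}

$server = new SoapServer(null, array('uri' => 'http://www.example.com/soap'));
$server->setClass('ContentService');
$server->handle();
?>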

 6. REST - if this is meant to just mean XML, I see it as an extension of #1.

Somewhere between #1 and #5, sort of.

Same answer though.

A PHP REST library and a couple hours of hacking and a couple days of
serious development and a couple weeks of hard-core QA, and you're
good to go.
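
Even without a library, a bare-bones REST-ish endpoint is just a GET
handler that picks an output format (again, names invented;
json_encode() needs PHP 5.2 or a PEAR/userland equivalent):

<?php
// rest.php?format=xml or ?format=json -- same data, two serializations.
$articles = array(
    array('title' => 'Example story', 'url' => 'http://www.example.com/story/1'),
);

$format = isset($_GET['format']) ? $_GET['format'] : 'xml';
if ($format == 'json') {
    header('Content-Type: application/json');
    echo json_encode($articles);
} else {
    header('Content-Type: text/xml');
    echo "<articles>\n";
    foreach ($articles as $a) {
        printf("  <article><title>%s</title><url>%s</url></article>\n",
            htmlspecialchars($a['title']),
            htmlspecialchars($a['url']));
    }
    echo "</articles>\n";
}
?>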

So I'm suggesting that you could support MOST of the above, probably
with the same core set of functions and minimal effort for the output
layers:
   RSS / XML
   SOAP
   REST

You could also provide a nice CSS Zen Garden-style, span/div-only HTML
output for an iframe so that a CSS hack could make it look decent.

Perhaps even a CSV or tab-delimited data dump for somebody who wants
to store-and-forward the data.
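
And the CSV dump is the least work of all (column names invented):

<?php
// csv.php -- plain CSV dump of the same articles table.
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="articles.csv"');

$out = fopen('php://output', 'w');
fputcsv($out, array('title', 'url', 'summary'));   // header row

$result = mysql_query("SELECT title, url, summary FROM articles");
while ($row = mysql_fetch_row($result)) {
    fputcsv($out, $row);
}
fclose($out);
?>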

If you plan well, you could build the application so that you can roll
out a new API layer every month or so, and not have to do *ALL* of
them from the get-go, but always be in touch with your partners,
bragging on yourself about how you now support XYZ and giving them
more options and the warm fuzzies about how much you cater to them.

-- 
Some people have a gift link here.
Know what I want?
I want you to buy a CD from some starving artist.
http://cdbaby.com/browse/from/lynch
Yeah, I get a buck. So?

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



[PHP] simplest way in php to get our content on another site / included javascript question

2006-12-17 Thread jonathan
I'm working on a project where we'd want partner sites to get our
content on the other web sites.
A key priority is that they won't have much technical sophistication
(probably no db experience or php experience).
Also, these would be third party sites so there would also be an
issue of requirements (different XSL processors,
no allow_url_fopen, etc...)

I tried looking at sites that get their content on other sites to see  
how they do it:


the possible solutions seem to be:
1. rss / xml - this is well known but would pose a pretty high
technical hurdle on the other sites. We could create
every walk-through and code sample but I still think it would be too
involved. The only way out is either an XSL transformation (not
going to happen), a custom parsing library in PHP (maybe), or writing to
a db with associated libraries (too complex)
2. serialized php with custom library - this seems more feasible than
#1 in terms of requirements
3. php proxy and ajax call / json - I think this would be too complex
and would require too much manipulation of apache to handle
subdomains, etc.
4. included javascript - kinda how digg handles syndication (see
http://www.digg.com/add-digg for example).
This would seem to be the lowest barrier to entry. The main concern
is preventing unauthorized bots from crawling. It would seem
our only option would be using the referer property but this could be
forged relatively easily.
5. SOAP - no way
6. REST - if this is meant to just mean XML, I see it as an extension of #1.

Any thoughts or opinions would be appreciated.

-jt

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] simplest way in php to get our content on another site / included javascript question

2006-12-17 Thread Casey Chu

If this is about search engine optimization, I'd suggest using JavaScript.

Something like:

<script type="text/javascript" language="JavaScript"
  src="http://www.yoursite.com/include.php?article=1536"></script>
<a href="http://www.yoursite.com" id="yoursite">Provided by YourSite</a>

And inside the script, use Ajax to fetch and display the content, and
also add something like:

var link = document.getElementById('yoursite');
// browsers normalize .href, so check the prefix rather than an exact match
if (link && link.href.indexOf('http://www.yoursite.com') == 0) {
    // the attribution link is intact -- do nothing
} else {
    window.location.replace('http://www.yoursite.com/youremovedthelinkyouwillpay.php');
}
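
For what it's worth, include.php could also skip the Ajax entirely and
just emit document.write() calls -- the script tag itself is already
allowed cross-domain.  A rough sketch (get_article_html() is a made-up
lookup against your own db):

<?php
// include.php -- emit JavaScript that writes the requested article into the page.
header('Content-Type: text/javascript');

$id   = isset($_GET['article']) ? (int) $_GET['article'] : 0;
$html = get_article_html($id);   // hypothetical helper returning the article markup

// escape the markup so it survives as a single-quoted JavaScript string
$js = str_replace(array("\r", "\n"), array('\r', '\n'), addslashes($html));
echo "document.write('" . $js . "');\n";
?>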






--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php