On Tue, Feb 1, 2011 at 10:59 AM, Jon Hood <squink...@gmail.com> wrote:

> I have a website that is currently pulling from more than 30 databases,
> combining the data, and displaying it to the user. As more and more
> databases are added, the script gets slower and slower, and I've
> realized that I need to find a way to pull these data in parallel.
> So: what is the preferred method of pulling data from multiple
> locations in parallel?
> Thanks,
> Jon

Well, you could turn the calls into REST-based web requests that produce
JSON. Then you could use curl_multi to fetch the results in parallel and
quickly decode the JSON.
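A minimal sketch of that approach, assuming each database sits behind a
hypothetical JSON endpoint of your own (the helper name `fetch_all_json`
and the URLs are placeholders, not an existing API):

```php
<?php
// Fetch several JSON endpoints in parallel with curl_multi,
// then decode each response into an associative array.
function fetch_all_json(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    // Create one easy handle per URL and attach it to the multi handle.
    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body as string
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // per-request timeout
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers at once, waiting for socket activity between passes.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);
        }
    } while ($running && $status === CURLM_OK);

    // Collect and decode each response, then clean up.
    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = json_decode(curl_multi_getcontent($ch), true);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

With 30+ sources the total wait time becomes roughly that of the slowest
endpoint rather than the sum of all of them, which is where the speedup
comes from.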


P.S. - Sorry for the duplicate, Jon, I forgot to copy the list.

Nephtali:  A simple, flexible, fast, and security-focused PHP framework
