On Tue, Feb 8, 2011 at 7:36 AM, Arno Kuhl <ak...@telkomsa.net> wrote:
> I'm hoping some clever php gurus have been here before and are willing to
> share some ideas.
> I have a site where articles are assigned to categories in containers. An
> article can be assigned to only one category per container, but one or more
> containers. Access permissions can be set per article, per category and/or
> per container, for one or more users and/or user groups. If an article is
> assigned to 10 categories and only one of those has a permission denying
> access, then the article can't be accessed even if browsing through one of
> the other 9 categories. Currently everything works fine, with article titles
> showing when browsing through category or search result lists, and a message
> is displayed when the article is clicked if it cannot be viewed because of a
> permission restriction.
> Now there's a requirement to not display the article title in category lists
> and search results if it cannot be viewed. I'm stuck with how to determine
> the number of results for paging at the start of the list or search. The
> site is quite large (20,000+ articles and growing) so reading the entire
> result set and sifting through it with permission rules for each request is
> not an option. But it might be an option if done once at the start of each
> search or list request, and then use that temporary modified result set for
> subsequent requests on the same set. I thought of saving the set to a
> temporary db table or file (not sure about overhead of
> serializing/unserializing large arrays). A sizing exercise based on the
> recordset returned for searches and lists shows a max of about 150MB for
> 20,000 articles and 380MB for 50,000 articles that needs to be saved
> temporarily per search or list request - in the vast majority of cases the
> set will be *much* smaller but it needs to cope with the worst case, and
> still do so a year down the line.
> All this extra work because I can't simply get an accurate number of results
> for paging, because of permissions!
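The deny-wins rule you describe could be sketched like this (a minimal illustration; the array shapes and field names are placeholders, not your actual schema):

```php
<?php
// Sketch of the deny-wins rule: an article is viewable only if *none*
// of its category/container assignments carries a deny for this user.
// One deny anywhere hides the article everywhere.
function articleVisible(array $assignments, int $userId): bool {
    foreach ($assignments as $a) {
        // $a = ['container' => ..., 'category' => ..., 'denied_users' => [...]]
        if (in_array($userId, $a['denied_users'], true)) {
            return false;
        }
    }
    return true;
}

$assignments = [
    ['container' => 1, 'category' => 'news',    'denied_users' => []],
    ['container' => 2, 'category' => 'archive', 'denied_users' => [7]],
];
var_dump(articleVisible($assignments, 7)); // bool(false)
var_dump(articleVisible($assignments, 3)); // bool(true)
```

Note this check only needs the assignment rows for one article, which is cheap per click but expensive across 20,000 titles -- hence the counting problem below.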
> So my questions are:
> 1. Which is better (performance) for this situation: file or db?
Have you timed it yourself?
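For the serialize/unserialize overhead you mention, a rough timing harness is only a few lines (the row shape and temp path here are placeholders; adjust to your real recordset):

```php
<?php
// Rough micro-benchmark: cost of serializing a large result set to a
// temp file and reading it back. 20,000 rows as a stand-in for the
// worst-case set described above.
$rows = [];
for ($i = 0; $i < 20000; $i++) {
    $rows[] = ['id' => $i, 'title' => "Article $i", 'cat' => $i % 50];
}

$t0 = microtime(true);
file_put_contents('/tmp/resultset.ser', serialize($rows));
$tWrite = microtime(true) - $t0;

$t0 = microtime(true);
$back = unserialize(file_get_contents('/tmp/resultset.ser'));
$tRead = microtime(true) - $t0;

printf("write: %.3fs  read: %.3fs  rows: %d\n", $tWrite, $tRead, count($back));
```

Run the same harness against a scratch DB table and compare the numbers on your own hardware -- that answers question 1 better than any general rule.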
> 2. How do I prepare a potentially very large data set for file or fast
> writing to a new table (ie I obviously don't want to write it record by
> record)?
Even the DB can't insert rows as fast as the calls are handed to it and
still respond instantly, so again... time it.
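That said, batching helps a lot: one multi-row INSERT inside a transaction instead of 20,000 single-row statements. A sketch (SQLite in-memory just to be self-contained; the same multi-row VALUES syntax works on MySQL, and the table/column names are placeholders):

```php
<?php
// Sketch: bulk-loading the allowed article ids into a temporary table
// in chunked multi-row INSERTs inside one transaction, instead of
// writing record by record.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TEMPORARY TABLE allowed_articles (article_id INTEGER)');

$ids = range(1, 5000); // placeholder: ids that passed the permission check

$pdo->beginTransaction();
foreach (array_chunk($ids, 500) as $chunk) {
    // Build "(?),(?),..." for this chunk and bind positionally.
    $placeholders = implode(',', array_fill(0, count($chunk), '(?)'));
    $stmt = $pdo->prepare(
        "INSERT INTO allowed_articles (article_id) VALUES $placeholders");
    $stmt->execute($chunk);
}
$pdo->commit();

echo $pdo->query('SELECT COUNT(*) FROM allowed_articles')->fetchColumn(); // 5000
```

With the ids in a table, the paging count is a single `SELECT COUNT(*)` and each page is a JOIN with LIMIT/OFFSET.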
> 3. Are there any other alternatives worth looking at?
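One alternative worth considering: for paging you don't need the full 150MB of records, only the ordered *id list* of visible articles -- even 50,000 integer ids serialize to well under 1MB, so they fit in the session. A sketch under that assumption (field names and page values are placeholders):

```php
<?php
// Sketch: cache only the ordered ids of visible articles per search,
// then page against that list and fetch full rows for one page at a time.
session_start();

// Computed once per search, after applying the permission rules:
$visibleIds = range(101, 1300); // placeholder for the real filtered list
$_SESSION['search_ids'] = $visibleIds;

// On every subsequent page request:
$perPage = 20;
$page    = 3; // e.g. from $_GET['page']
$total   = count($_SESSION['search_ids']); // accurate count for paging
$pageIds = array_slice($_SESSION['search_ids'], ($page - 1) * $perPage, $perPage);

// Then fetch full rows only for $pageIds, e.g.
// SELECT ... WHERE id IN (...) -- 20 rows per request, not 20,000.
printf("total %d, page %d shows ids %d..%d\n",
       $total, $page, $pageIds[0], end($pageIds));
```

The expensive permission pass happens once per search; every page after that is an array slice plus one small IN() query.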
This is a question for the experienced PHP developers. But the above
advice still applies: measure it.
According to theoretical physics, the division of spatial intervals as
the universe evolves gives rise to the fact that in another timeline,
your interdimensional counterpart received helpful advice from me...so
be eternally pleased for them.
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php