| <Zacker> my idea is we figure out a weighting system
| <Zacker> that can assign a number to every piece of content
| <Zacker> like page rank
| <Zacker> 1 to 10 or something of the like
| <Zacker> and that number is calculated by
| <Zacker> how many people syndicate it / view it or something like that
| <Zacker> basically its popularity rating
| <Zacker> and it meta dean keeps track of it.. and this is used in the
| "bubble up syndication" stuff
| <Zacker> so admins can flag feeds to watch for high ranked content
| <Zacker> to auto syndicate
| <Zacker> or flag for them to look at... or the like....
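For reference, as I read it the proposal boils down to something
like the rough sketch below (the weights and the 1-to-10 scale
are my guesses at what Zacker means, not anything specified):

    # Rough sketch of the proposed popularity rating, as I read it.
    # The weights and the 10-point scale are invented for illustration.

    def popularity_rating(views, syndications, max_views, max_syndications):
        """Map raw view/syndication counts onto a 1-to-10 score."""
        if max_views == 0 or max_syndications == 0:
            return 1
        view_score = views / float(max_views)
        synd_score = syndications / float(max_syndications)
        # syndication weighted more heavily than raw views (arbitrary)
        combined = 0.3 * view_score + 0.7 * synd_score
        return 1 + int(round(combined * 9))

The formula itself is trivial; the hard part is collecting the
raw counts from every site in the first place.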
I don't think this is practical (at least in this form). The
description suggests that ratings for *every* piece of content
will all be housed in one central database -- and that this
central database must be updated whenever *anyone* reads or
syndicates any piece of content. We can't have all the hundreds
(thousands? How many are we expecting?) of sites out there
constantly hitting our database with updates.
I'm not saying ratings are totally unrealistic, just that a lot
more architecture would need to be designed to support them.
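To give a sense of what I mean by "more architecture": one option
would be for each site to count views and syndications locally
and only push aggregated totals upstream on a schedule, so nobody
writes to a central database on every page view. A very rough
sketch, with every name, URL and interval invented:

    # Sketch: each site counts views/syndications locally and pushes
    # an aggregated batch upstream on a schedule, instead of writing
    # to a central database on every view.  All names, the URL, and
    # the batching interval are hypothetical.

    from urllib.parse import urlencode
    from urllib.request import urlopen

    local_counts = {}   # article_id -> {'views': n, 'synds': n}

    def record_view(article_id):
        c = local_counts.setdefault(article_id, {'views': 0, 'synds': 0})
        c['views'] += 1

    def record_syndication(article_id):
        c = local_counts.setdefault(article_id, {'views': 0, 'synds': 0})
        c['synds'] += 1

    def push_counts(parent_url):
        """Send accumulated counts upstream in one request, then reset."""
        if not local_counts:
            return
        payload = urlencode(
            [(aid, '%d,%d' % (c['views'], c['synds']))
             for aid, c in local_counts.items()]).encode('ascii')
        urlopen(parent_url, payload)   # one POST per batch, not per view
        local_counts.clear()

    # e.g. run push_counts('http://parent.example.org/counts')
    # from a cron job every half hour

Even that raises its own questions (who aggregates the
aggregates? how stale is a rating allowed to get?), which is why
I say this needs real design work before we commit to it.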
Key Question: What kind of scale are we expecting? It would be
good to have rough estimates for:
- number of sites
- number of users per site
- number of articles per site per day
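To show why those numbers matter, here is the kind of
back-of-envelope arithmetic they feed into. Every figure below
is a placeholder I made up, not an estimate:

    # Back-of-envelope load on a single central ratings database.
    # Every number here is a made-up placeholder, not an estimate.

    num_sites = 500             # how many sites?
    users_per_site = 200        # how many users per site?
    views_per_user_per_day = 20

    views_per_day = num_sites * users_per_site * views_per_user_per_day
    updates_per_second = views_per_day / (24 * 60 * 60.0)

    print('%d rating updates/day, about %.1f writes/sec sustained'
          % (views_per_day, updates_per_second))
    # 500 * 200 * 20 = 2,000,000 updates/day, roughly 23 writes/sec

Whether ~23 sustained writes per second (and several times that
at peak) is trivial or a serious problem depends entirely on the
real numbers, which is why I'm asking for them.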
Sorry to keep nagging, but I'm going to raise this question again
because IMHO we have to make this critical design decision in
order to proceed:
* What determines the set of article pointers that each
site will cache?
And a related question:
* How does content "bubble up"? Will parent nodes have to
poll all of their child nodes to look for content to promote?
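If the answer to the second question is polling, I picture
something like the sketch below. Again, fetch_ratings, bubble_up
and PROMOTE_THRESHOLD are all invented names, and a push model
would look quite different:

    # Sketch of a polling-based "bubble up": a parent node periodically
    # asks each child for its highest-rated articles and promotes the
    # ones above some threshold.  Names and threshold are hypothetical.

    PROMOTE_THRESHOLD = 8   # promote anything rated 8 or above

    def fetch_ratings(child_url):
        """Return a list of (article_pointer, rating) from one child.
        Stub: in reality this would fetch and parse the child's feed."""
        return []

    def bubble_up(child_urls):
        promoted = []
        for url in child_urls:
            for pointer, rating in fetch_ratings(url):
                if rating >= PROMOTE_THRESHOLD:
                    promoted.append((pointer, rating))
        # highest-rated first
        promoted.sort(key=lambda item: item[1], reverse=True)
        return promoted

    # A real fetch_ratings would hit each child's feed, so each poll
    # costs one fetch per child -- exactly the load I'm asking about.

The alternative is a push model, where a child notifies its
parent when something crosses the threshold. Either way, the
answer determines what each site has to cache, which is why
these two questions go together.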
-- ?!ng