Such a tool should, from my inexpert view, not be too difficult to implement, as most of the required information is already publicly available through Squid logs. For example, http://stats.grok.se is a tool that shows the number of page requests made for a specific page (currently used for enwiki's DYKSTATS <http://en.wikipedia.org/wiki/Wikipedia:DYKSTATS>). Its currently supported range is 2007 to today, which would allow for immediate historical searches as well. I believe that the tool also tracks requests for non-existent pages, as the redlink page <http://stats.grok.se/en/201005/redlink> shows one or two hits every day, even though it does not exist. Since all the raw data seems to be available already, I presume the only required modification would be a cross-reference with the list of created pages to filter out pages that already exist, along with an output method that generates a list of the most-accessed redlinks.
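To illustrate, the cross-referencing step might look something like the sketch below. This is only a rough outline, assuming per-title hit counts have already been aggregated from the Squid logs (as stats.grok.se does); the function name and the sample data are hypothetical, not part of any existing tool.

```python
from collections import Counter

def top_redlinks(page_hits, existing_pages, n=10):
    """Return the n most-requested page titles that do not exist yet.

    page_hits: dict mapping page title -> aggregated request count
    existing_pages: set of titles that already exist on the wiki
    """
    # Keep only hits for titles absent from the created-pages list.
    redlinks = Counter({title: hits for title, hits in page_hits.items()
                        if title not in existing_pages})
    return redlinks.most_common(n)

# Hypothetical sample data standing in for the aggregated log counts.
hits = {"Existing article": 500, "Missing topic": 120, "Another gap": 80}
existing = {"Existing article"}
print(top_redlinks(hits, existing))
```

In practice the existing-pages set could be drawn from a database dump or the API, and the output fed into a ranked list for editors.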
Note that there may be some issues with this method. I believe that requesting a page twice from the same IP results in two counted requests, so the actual number of people viewing the page may be lower. Even so, I believe this is as accurate as things will get, since the privacy policy will likely forbid the inclusion of IP-related data. It may be worth looking into this; perhaps User:Henrik <http://en.wikipedia.org/wiki/User:Henrik> (the tool's creator) would be willing to assist with this issue. It may be worth asking him.

~Excirial

On Fri, Jun 11, 2010 at 8:39 PM, Sage Ross <[email protected]<ragesoss%[email protected]>> wrote:

> On Fri, Jun 11, 2010 at 6:54 AM, Shiju Alex <[email protected]> wrote:
> > [snip]
> >
> > Some feature is required in the MediaWiki software that enable us to see a
> > list of keywords used most frequently by the users to search for non-exist
> > articles. If we get such a list then some users like him can concentrate on
> > creating articles using that key words.
> >
> > Of course, I know that this feature may not be helpful for big wikis like
> > English. But for small wikis (especially small non-Latin language wikis),
> > this will be of great help. It is almost like *creating wiki articles based
> > on user requirement*.
>
> Actually, this kind of tool is very helpful for big wikis as well. We
> had some manually-updated data for English Wikipedia in past years,
> and it revealed some interesting things, especially in terms of needed
> redirects, where people were searching for an article that exists but
> didn't have quite the right name for it. Many of these terms were
> getting hundreds of hits per day.
>
> Making such data more easily available for all projects would be a
> great boon, I think.
> -Sage
>
> _______________________________________________
> foundation-l mailing list
> [email protected]
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l
