[Savannah-help-public] [sr #106304] Bug spam from logged in spammers?
Follow-up Comment #32, sr #106304 (project administration): So at least one spammer took a look at the GNU Manifesto publication date (which is the current TextCHA question). I need to add more questions right away; any suggestions? :) ___ Reply to this item at: http://savannah.gnu.org/support/?106304 ___ Message sent via/by Savannah http://savannah.gnu.org/
[Savannah-help-public] [sr #106304] Bug spam from logged in spammers?
Follow-up Comment #33, sr #106304 (project administration): Could it be the case that the spammer already existed in the system before the TextCHA?
Re: [Savannah-hackers-public] Re: GNU Planet and Savannah
Hi all. I was reading about Planet and thinking about how to reduce the bandwidth consumption between GNU Planet and Savannah. The main problem I see is that we have to check 360 RSS feeds from Savannah to gather all the information for GNU Planet. I think we have two options:

1. Implement an aggregated GNU RSS feed in Savannah. This feed would contain the latest news of all GNU projects, and we would check only this one feed to feed our Planet. This is a little hard to implement, because we would have to change both pieces of software (Savannah and Planet).

2. Install a Planet instance locally on Savannah and use its cache. When Planet checks a feed, it stores all new data in a cache and then builds the webpage from that information. We could install a Planet locally on Savannah to fetch all feeds locally (saving bandwidth) and later rsync that cache directory from the GNU chapters machine, transferring only the updated feeds. You could also ignore GNU Planet hits in your web stats, because the accesses would be local :).

I think the second option is the easiest, and we would only have to make small changes to Planet (prevent it from rewriting a cache file that hasn't actually been updated, or write a script smart enough to sync only the files that really changed). No changes to the Savannah sources or platform are needed :) If you agree and give me access to the Savannah web server (a non-privileged account should be enough), I'll install and maintain this mirror :). I already have a Savannah account (nacho), but I think it's only a web account without any shell access. Best regards, Nacho.
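The "sync only the files that really changed" idea from option 2 could be sketched roughly as follows. This is a minimal illustration with made-up paths, not Savannah's setup; a real deployment would likely just let rsync itself skip unchanged files:

```python
# Sketch: copy only cache files whose source copy is newer than the
# destination copy (or missing from it). Hypothetical, for illustration.
import os
import shutil

def sync_updated(src_dir, dst_dir):
    """Copy files from src_dir to dst_dir only when the source
    is newer than (or absent from) the destination."""
    os.makedirs(dst_dir, exist_ok=True)
    copied = []
    for name in sorted(os.listdir(src_dir)):
        src = os.path.join(src_dir, name)
        dst = os.path.join(dst_dir, name)
        if not os.path.isfile(src):
            continue
        if (not os.path.exists(dst)
                or os.path.getmtime(src) > os.path.getmtime(dst)):
            shutil.copy2(src, dst)  # copy2 preserves the mtime
            copied.append(name)
    return copied
```

Because copy2 preserves timestamps, a second run over an unchanged cache copies nothing, which is exactly the bandwidth saving Nacho is after.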
Re: [Savannah-hackers-public] Re: Google weirdness
On Thu, Jan 15, 2009 at 08:49:16PM +0200, Yavor Doganov wrote: John Sullivan wrote: I noticed today that pages living at www-test.gnu.org are being indexed by Google. I've asked the sysadmins to correct that. It was up until recently, AFAIK. I trashed it recently and asked sysadmin@ to remove it from the DNS just last week :) It was used before we had the on-commit CVS synchro, but now it doesn't serve any purpose (and it was indeed indexed). It got little non-search-engine traffic too. I should have carbon-copied that, my apologies. However, I also came across results for http://www.gnu.org/savannah-checkouts/... Does anyone know where this comes from? If not, I will pose this question to the FSF sysadmins to figure out as well. No clue. Maybe Sylvain knows? I think this folder should be hidden; it's probably used internally for serving all the GNU and non-GNU webpages. I think you can ask sysadmin@ to mask it. -- Sylvain
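One conventional way to mask such a directory from search engines (a sketch only; what the gnu.org sysadmins actually did or would do is up to sysadmin@) is a robots.txt rule at the site root:

```text
# Hypothetical robots.txt entry for www.gnu.org
User-agent: *
Disallow: /savannah-checkouts/
```

Note that robots.txt only asks well-behaved crawlers to stay out; hiding the directory from all visitors would need an access rule in the web server configuration instead.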
[Savannah-hackers-public] Conditions for approving GNU project submissions
Hi, the documentation for the admins says we should check whether a package has been listed on some fencepost list before we approve a GNU package on Savannah. Is this a synonym for those "I dub this project a GNU package" mails from Richard, or are there further conditions? Sebastian
Re: [Savannah-hackers-public] Re: GNU Planet and Savannah
Hi, In about 4 1/2 days, GNU Planet downloaded ~200MB from the Savannah frontend out of ~35GB total, so ~0.5%. At the previous 4x polling rate that would be 2%. In the same interval, GNU Planet made ~56000 requests out of 1.6M, so around 3.5%. At the previous 4x rate, 14%. So the issue isn't the bandwidth but the number of requests. Hence I am more interested in optimizations that lower the number of requests. I'll try to have a look at planetplanet and see what can be done in that regard. -- Sylvain On Thu, Jan 15, 2009 at 11:01:18AM +0100, Nacho Gonzalez Lopez wrote: [...]
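Sylvain's proportions can be double-checked quickly (the totals ~35GB and 1.6M requests are taken from his message; "4x" refers to the previous, four-times-higher polling rate):

```python
# Quick check of the traffic shares quoted above.
planet_mb, total_mb = 200, 35_000          # ~200MB of ~35GB
planet_req, total_req = 56_000, 1_600_000  # ~56000 of 1.6M requests

bandwidth_share = 100 * planet_mb / total_mb    # ~0.57%, ~2% at 4x
request_share = 100 * planet_req / total_req    # 3.5%, 14% at 4x

print(round(bandwidth_share, 2), request_share)
```

The gap between 0.5% of bandwidth and 3.5% of requests is why the fix Sylvain wants targets request count, not feed size.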
Re: [Savannah-hackers-public] Re: GNU Planet and Savannah
Incidentally, I implemented Last-Modified/If-Modified-Since support just for kicks :) -- Sylvain On Thu, Jan 15, 2009 at 07:40:59PM +0100, Sylvain Beucler wrote: [...]
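The conditional-GET support Sylvain mentions boils down to the server answering 304 Not Modified (no body) when a feed hasn't changed since the client's last fetch. A minimal server-side sketch, with made-up names rather than Savannah's actual code:

```python
# Sketch of Last-Modified / If-Modified-Since handling, server side.
# Names are illustrative; Savannah's real implementation differs.
from email.utils import format_datetime, parsedate_to_datetime

def respond(feed_mtime, if_modified_since=None):
    """Return (status, headers) for a feed last changed at the aware
    UTC datetime feed_mtime, given the client's optional
    If-Modified-Since header value."""
    headers = {"Last-Modified": format_datetime(feed_mtime, usegmt=True)}
    if if_modified_since is not None:
        client_time = parsedate_to_datetime(if_modified_since)
        if feed_mtime <= client_time:
            return 304, headers   # unchanged: client reuses its copy
    return 200, headers           # changed, or first fetch: send body
```

Each 304 still costs a request, but it carries no feed body, which is why this helps bandwidth more than it helps the request count Sylvain is worried about.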