robots.txt wicket urls
How can I add an entry to robots.txt for a page of my site that is not currently mounted and cannot be made bookmarkable? The URL is the Wicket URL of the form ?wicket:interface. I suppose I could block everything that contains ?wicket:interface, but that seems like brute force.

I tried using IndexedHybridUrlCodingStrategy and HybridUrlCodingStrategy to mount the page as a named page. This works, but if someone changes the number after the "." in the URL, they may request a page that is no longer in the pagemap; an exception is thrown and we show our error page. Is this the only way to do it? Thanks.
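For context, mounting a page with these strategies in Wicket 1.4 looks roughly like the sketch below. This is framework wiring, not standalone code; MyApplication and LoginPage are placeholder names standing in for your own classes.

```java
import org.apache.wicket.Page;
import org.apache.wicket.protocol.http.WebApplication;
import org.apache.wicket.request.target.coding.HybridUrlCodingStrategy;

public class MyApplication extends WebApplication {
    @Override
    protected void init() {
        super.init();
        // Mounts the page at /Login; after the first request Wicket
        // redirects to a versioned URL such as /Login.1
        mount(new HybridUrlCodingStrategy("/Login", LoginPage.class));
    }

    @Override
    public Class<? extends Page> getHomePage() {
        return LoginPage.class;
    }
}
```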
Re: robots.txt wicket urls
In your robots.txt, put the URL without the "."; that way a new version of the page will get created.

-igor

- To unsubscribe, e-mail: users-unsubscr...@wicket.apache.org For additional commands, e-mail: users-h...@wicket.apache.org
RE: robots.txt wicket urls
So using the HybridUrlCodingStrategy is correct, and if users want to muck with the URL, they can, and they may just get an error? Then for the robots.txt I just block the URL without the ".".

For example, I have a page called Login that is mounted with the HybridUrlCodingStrategy, so the URL the user sees is Login.1 or Login.7, depending on how many times they hit the page. In the robots.txt I just need to ignore the URL /Login.

Also, can I set the pagemap to only store N pages per session? I tried using getSessionSettings().setMaxPageMaps(1), but what does this really do? Is it 1 page per session, or 1 of each page per session? Thanks.
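For the example above, the robots.txt entry would be just the mount path without the version suffix (this assumes the application is deployed at the root context):

```
User-agent: *
Disallow: /Login
```

Disallow rules are prefix matches, so this single entry also covers /Login.1, /Login.7, and so on.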
Re: robots.txt wicket urls
The URL the user sees is still /Login; it's just that when they go to it, they are redirected to /Login.n. In your robots.txt, if you want to reference the page, use /Login: this is the URL the robot has to hit to get to /Login.n.

Also, when the user tweaks the n, they should simply get a new instance of the page, not an error.

-igor
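Since robots.txt Disallow rules are prefix matches, the single /Login entry covers every versioned URL a robot might see. A toy illustration of the matching rule (not Wicket code, just the prefix check a crawler applies):

```java
// Demonstrates why "Disallow: /Login" also blocks /Login.1, /Login.7, etc.:
// the Robots Exclusion Protocol matches rules as path prefixes.
public class RobotsPrefixDemo {
    static boolean disallowed(String path, String rule) {
        // A path is blocked if it starts with the Disallow rule's value.
        return path.startsWith(rule);
    }

    public static void main(String[] args) {
        System.out.println(disallowed("/Login.1", "/Login")); // true
        System.out.println(disallowed("/Login.7", "/Login")); // true
        System.out.println(disallowed("/Home", "/Login"));    // false
    }
}
```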
Re: robots.txt
WicketApplication.mountBookmarkablePage(String path, Class<T> page)?

/Per
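A sketch of that suggestion, with RobotsTxtPage as a hypothetical page class whose associated markup holds the plain-text rules (verify the content-type handling against your Wicket version before relying on this):

```java
// In your WebApplication subclass's init() (Wicket 1.4 API):
// serves the page at /robots.txt instead of a static file.
mountBookmarkablePage("/robots.txt", RobotsTxtPage.class);
```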
Re: robots.txt
Thank you everyone, that works. But out of curiosity, is there a Wicket or servlet way? More clearly, how can a single file be mounted at a single URL, like /bob/static.html?
robots.txt
This is a ridiculous question; I am pretty new to the web and servlet world (coming from desktop Java). I couldn't figure out how to serve my robots.txt under www.site.com/robots.txt. How can I do that in Wicket?
Re: robots.txt
Hi! You can put static files in the webapp/ directory.

** Martin
Re: robots.txt
...that assumes that the app's context path is /.
Re: robots.txt
Not really...
Re: robots.txt
My bad; let me clarify. If robots.txt is stored in the app server's own webapp directory, then yes, that will work regardless of the app's context path. If it is packaged with the webapp, and is part of, say, a Maven project, it can be stored in {project}/src/main/webapp; however, the webapp must then be deployed at the root context.

-nikita
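To illustrate the Maven layout described above (this assumes standard war packaging; the webapp must be deployed at the root context for the file to appear at /robots.txt):

```
{project}/
└── src/
    └── main/
        └── webapp/
            ├── robots.txt        served at /robots.txt
            └── WEB-INF/
                └── web.xml
```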