From: Alexandre Alapetite <[email protected]>

Hello,
Here is a suggested patch that adds security by opting out of Web
search-engine crawling and indexing by default.
This prevents exposure to so-called "Google hacking" [1].
Currently, the Web administration interfaces (LuCI and others) of many
OpenWrt systems can be discovered this way [2],
which is a problem when users have kept the default password or chosen
a weak one, or when vulnerabilities are discovered.

The patch creates a standard /www/robots.txt file with a global
"Disallow" rule, as per the convention [3]. I suggest doing this in
the Makefile of the base-files package [4], just after the creation of
the parent /www directory.
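For clarity, the added Makefile line simply installs a two-line file. A portable sketch of the same write, using printf instead of the non-POSIX "echo -e" (the ./robots.txt path below is illustrative; the patch writes to $(1)/www/robots.txt):

```shell
# Write a robots.txt that disallows all crawlers from the whole site,
# as per the Robots Exclusion Protocol.
printf 'User-agent: *\nDisallow: /\n' > robots.txt
cat robots.txt
```

Running this prints the two directives that a compliant crawler will read before indexing anything under the Web root.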

[1] Google hacking: http://en.wikipedia.org/wiki/Google_hacking
[2] Example of Google search:
http://www.google.com/search?q=inurl:"/cgi-bin/luci/";
[3] The Robots Exclusion Protocol: http://www.robotstxt.org/robotstxt.html
[4] Location of the patch:
https://dev.openwrt.org/browser/trunk/package/base-files/Makefile#L130

Signed-off-by: Alexandre Alapetite <[email protected]>,
http://alexandre.alapetite.fr

---

Index: package/base-files/Makefile
===================================================================
--- package/base-files/Makefile    (revision 37132)
+++ package/base-files/Makefile    (working copy)
@@ -128,6 +128,7 @@
     mkdir -p $(1)/usr/bin
     mkdir -p $(1)/sys
     mkdir -p $(1)/www
+    printf 'User-agent: *\nDisallow: /\n' > $(1)/www/robots.txt
     mkdir -p $(1)/root
     ln -sf /proc/mounts $(1)/etc/mtab
     rm -f $(1)/var
_______________________________________________
openwrt-devel mailing list
[email protected]
https://lists.openwrt.org/mailman/listinfo/openwrt-devel