> Also, see the Tor technical FAQ wiki entry for this:
>
> https://wiki.torproject.org/noreply/TheOnionRouter/TorFAQ#head-5e18f8a8f98fa9e69ffac725e96f39641bec7ac1
Which says: "We'd like to make it still work even if the service is nearby the Tor relay but not on the same IP address. But there are a variety of technical problems we need to overcome first (the main one being 'how does the Tor client learn which relays are associated with which websites in a decentralized yet non-gamable way?')."

Here's a simple idea. Just as web sites publish a "robots.txt" file for search engines, a web server could provide a "torexit.txt" file, which is simply the list of Tor exit nodes that the server considers "close" to itself.

In other words, if I run a web server, I would list the Tor nodes that I consider safe and nearby. Maybe they're nodes I run and control myself on a physically nearby machine, perhaps even on an adjacent computer.

That list isn't attackable without attacking the web server itself. Granted, that's not perfect, but if you can attack the web server, then tricking Tor users is probably not the biggest fish to catch.
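To make the idea concrete, here is a minimal sketch of what a client-side consumer of such a file might look like. The file name "torexit.txt" and its format (one exit-node fingerprint or IP address per line, with '#' comments) are assumptions of this proposal, not anything Tor actually supports today:

```python
# Sketch of a client for the hypothetical "torexit.txt" convention.
# Assumed format: one Tor exit-node fingerprint or IP per line,
# '#' starts a comment, blank lines ignored. This is illustrative only.
import urllib.request


def parse_torexit(text):
    """Return the list of exit-node identifiers a server declares."""
    nodes = []
    for line in text.splitlines():
        line = line.split('#', 1)[0].strip()  # drop comments and whitespace
        if line:
            nodes.append(line)
    return nodes


def fetch_torexit(base_url, timeout=5):
    """Fetch <base_url>/torexit.txt and parse it (hypothetical endpoint)."""
    url = base_url.rstrip('/') + '/torexit.txt'
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return parse_torexit(resp.read().decode('utf-8', 'replace'))


# Example of the assumed file contents:
sample = "# exits this site trusts\nA1B2C3D4E5F6\n203.0.113.7  # adjacent box\n"
print(parse_torexit(sample))
```

A Tor client that trusted this mechanism could prefer circuits ending at one of the listed exits when connecting to that site; the security argument above is exactly that tampering with the list requires compromising the same server the attacker is trying to impersonate.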

