A mechanism that enables detecting and stopping what you and I consider 'evil' could just as easily be harnessed to target non-evil things, and that's where the problem is.
Let's pretend that tomorrow, Tor gained the ability to filter out evil images. Shortly thereafter, governments might start demanding that the Tor designers put the underlying technical components of that system to uses of their own choosing.
Such a scenario could be dangerous for everyone, including the designers personally. What happens to them if/when they refuse a demand like that, or shut down rather than comply, as Lavabit did?
And what if such a function were corrupted, or reused in a rogue relay, to track things it was never intended to?
Not to mention, the moment anyone becomes able to observe or affect the content, the question of who controls it and who is responsible for it comes into play. Assuming Tor could exist in such a state, you may personally choose to filter CP, but then your government or ISP demands that you also filter activists X, Y, and Z, and a whole list of topics besides. Your line at 'evil' would be forcibly drawn by other people, just as it is on the public internet today.
Btw, our exit node blocks the random, nonstandard ports that BitTorrent clients tend to use, simply to aid the node's longevity. It wasn't a moral stand.
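For context, that blocking is done with ordinary exit-policy lines in the relay's torrc, not with any content inspection. A minimal sketch (the specific ports here are only an illustration, not what our node actually runs) looks like this:

    # Accept a handful of well-known ports and reject everything else,
    # which catches the high, unpredictable ports BitTorrent clients favour.
    ExitPolicy accept *:80
    ExitPolicy accept *:443
    ExitPolicy accept *:22
    ExitPolicy reject *:*

Tor evaluates these rules top to bottom and applies the first one that matches, so the final reject line covers anything not explicitly accepted. Note that an exit policy only controls which destination addresses and ports the exit will connect to; it says nothing about what the traffic contains.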
On Tuesday 27/08/2013 at 12:09 pm, Jon Gardner wrote:
On Aug 22, 2013, at 11:56 AM, mick mbm@rlogin.net wrote:
The other thing that I am weighing is just a moral question regarding misuse of the Tor network for despicable things like child porn. I understand that of all the traffic it is a small percentage and that ISPs essentially face the same dilemma, but I wonder if more can be done to make Tor resistant to evil usage.
Tor is neutral. You and I may agree that certain usage is unwelcome, even abhorrent, but we cannot dictate how others may use an anonymising service we agree to provide. If you have a problem with that, you probably should not be running a Tor node.
Then why have exit policies? Exit nodes regularly block "unwelcome" traffic like BitTorrent, and there's only a slight functional difference between that and using a filter in front of the node to block things like porn (which, come to think of it, also tends to be a bandwidth hog like BitTorrent--so it doesn't have to be just a moral question). If someone has a problem with exit nodes blocking things like porn (or BitTorrent, or...), then they probably should not be using Tor.
The very idea of Tor is based on moral convictions (e.g., that personal privacy is a good thing, that human rights violations and abuse of power are bad things, etc.). So Tor is most definitely not neutral, nor can it be--because, if it is to exist and flourish, those moral convictions must remain at its foundation. One cannot on the one hand claim that human rights violations are "wrong" while on the other hand claiming that pornography (especially child porn) is "right." If one wants further proof that Tor has a moral component, one has only to visit http://www.torproject.org, click the "About Tor" link, and notice the discussion points. I doubt that anyone could convince the Tor team to add "...for unfettered access to pornography..." as a bullet point under "Why we need Tor."
The Tor devs go to great lengths to try to keep "evil" governments from using Tor against itself. Why not devote some effort toward keeping "evil" traffic off of Tor? Given that "we need more relays" is the common mantra, it seems to me that if the Tor community could come up with a technical answer to at least some of the most egregious abuses of Tor--things like child porn, or even porn in general, which either have nothing to do with Tor's foundational mission or (like child porn) are antithetical to it--the result would be greater public support for the technology and a wider deployment base.
It's worth discussion.
Jon
tor-relays mailing list
tor-relays@lists.torproject.org
https://lists.torproject.org/cgi-bin/mailman/listinfo/tor-relays