Hi Tom, thanks for the great summary.
I want to comment on one element of your writeup: the hidden service on box A with the webserver on box B. My weak belief is that this is no different from the "SSL added and removed here" issue that affects many 'secure sites.'
Imposing a requirement that a person swat away error messages which may or may not be accurate seems like a good way to train people to ignore those messages even more, or to send us down the rathole of making them hard to dismiss.
Adam
On Fri, Nov 14, 2014 at 11:08:38AM -0600, Tom Ritter wrote:
| There's been a spirited debate on irc, so I thought I would try and
| capture my thoughts in long form. I think it's important to look at
| the long-term goals rather than how to get there, so that's where I'm
| going to start, and then at each item maybe talk a little bit about
| how to get there. So I think the Tor Project and Tor Browser should:
|
| a) Eliminate self-signed certificate errors when browsing https:// on
| an onion site
| b) Consider how Mixed Content should interact with .onion browsing
| c) Get .onion IANA reserved
| d) Address the problems that Facebook is/was concerned about when
| deploying a .onion
| e) Consider how EV treatment could be used to improve poor .onion readability
|
| (If you're not familiar with DV [Domain Validated] and EV [Extended
| Validation] certificates and their UI differences, you should take a
| peek; see [0] for an example. There are other subtleties and
| requirements on EV certs, like OCSP checking that removes the
| indicator, and the forthcoming CT effort in Chrome, but that's mostly
| orthogonal.)
|
| --------------------
| a) Self Signed Errors on .onion
|
| A .onion specifies the key needed. As far as little-t tor is
| concerned, it got you to the correct endpoint safely, so whatever SSL
| certificate is presented could be considered valid.
|
| However, if the Hidden Service is on box A and the webserver is on
| box B, you'd need to do some out-of-application tricks (like stunnel)
| to prevent a MITM from attacking that connection. So, as Roger
| suggested, perhaps requiring the SSL certificate to be signed by the
| .onion key would be a reasonable choice. But if you make that
| requirement, it also implies that HTTP .onions are less secure than
| HTTPS .onions. Which may or may not be the case - you don't know.
|
| I'm not religious about anything other than getting rid of the error:
| I don't like that users are trained to click through security errors.
|
| This is a weakly held opinion right now - but I think it's fair to
| give DV treatment to http://.onion because it is, from little-t tor's
| point of view, secure. Following that conclusion, it is therefore
| fair to accept self-signed certificates and _not_ require that a
| certificate for an https://.onion be signed by the .onion key.
| (Because otherwise, we're saying that SSL on .onion requires more
| hoops to achieve security than HTTP on .onion, which isn't the case.)
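
To make the box A / box B arrangement above concrete, here is a rough
sketch of the kind of out-of-application stunnel setup being alluded
to; all paths, addresses, and section names are invented. Box A runs
tor and hands the onion's virtual port to a local stunnel client,
which carries the traffic over TLS to an stunnel server sitting in
front of the webserver on box B:

    # box A, /etc/tor/torrc: forward the onion's port 80 to the local
    # stunnel client
    HiddenServiceDir /var/lib/tor/example_hs/
    HiddenServicePort 80 127.0.0.1:8080

    ; box A, /etc/stunnel/stunnel.conf: wrap the hop to box B in TLS
    [to-webserver]
    client  = yes
    accept  = 127.0.0.1:8080
    ; 10.0.0.2 stands in for box B's private address
    connect = 10.0.0.2:8443
    ; only accept a box B certificate that chains to our internal CA
    CAfile  = /etc/stunnel/internal-ca.pem
    verify  = 2

    ; box B, /etc/stunnel/stunnel.conf: terminate the TLS and hand off
    ; to the local webserver
    [from-hs]
    accept  = 10.0.0.2:8443
    connect = 127.0.0.1:80
    cert    = /etc/stunnel/box-b.pem
    key     = /etc/stunnel/box-b.key

From little-t tor's point of view nothing changes; it still just
forwards the onion's port 80 to 127.0.0.1:8080.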
|
| --------------------
| b) Mixed Content on .onion
|
| This is a can of worms I'm not going to open in this mail. But it's
| there, and I think it's worth thinking about whether a .onion
| requesting resources from http://.com or https://.com is acceptable.
|
| --------------------
| c) Get .onion IANA reserved
|
| I think this is fairly apparent in itself, and it is in the works [1].
| I'm not sure of its status, but I would be happy to lend time to
| whatever IETF group/work is needed if it will help.
|
| --------------------
| d) Address the problems that Facebook is/was concerned about when
| deploying a .onion
|
| There are reasons, technical and political, why Facebook went and got
| an HTTPS cert for their .onion. I've copied Alec so hopefully he'll
| agree, refute, or add. But from my perspective, if I were Facebook or
| another large company like that:
|
| i) I don't want to train my users to click through HTTPS warnings.
| (Conversely, I like training my users to type https://mysite.com)
| ii) I don't want to have to do the development and QA work to cut my
| site over to be sometimes-HTTP if it's normally always HTTPS
| iii) It would be convenient if I didn't have to do stunnel tricks to
| encrypt the connection between my Hidden Service server and (e.g.) a
| load balancer, which is on another box
| iv) I'd really like to get a green box showing my org name, and it's
| even better that it'd be very difficult for a phisher to get that
|
| (iii) can conflict with (a) above, of course. Because I came to the
| conclusion that we should allow invalid certificates, a MITM could
| attack Facebook between the HS server and load balancer. I'm not sure
| there is an elegant solution there. One would probably have to tunnel
| the connection over a mutually authenticated stunnel connection to
| prevent a MITM. But frankly, if we assume users are used to clicking
| through self-signed certs and we want to start the process of
| training them _not_ to, Facebook would have to do this now _anyway_.
| So... =/ I guess documenting the crap out of this concern and
| providing examples may be the best solution based off my mindset
| right now.
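
A hedged sketch of what such a mutually authenticated stunnel could
look like (hostnames, paths, and ports are invented): each end holds a
certificate issued by a private internal CA, presents it, and refuses
any peer whose certificate doesn't chain to that CA:

    ; HS box, stunnel client: present our cert and verify the load
    ; balancer's against the internal CA
    [to-loadbalancer]
    client  = yes
    accept  = 127.0.0.1:8443
    connect = lb.internal.example:8443
    cert    = /etc/stunnel/hs-client.pem
    key     = /etc/stunnel/hs-client.key
    CAfile  = /etc/stunnel/internal-ca.pem
    verify  = 2

    ; load balancer box, stunnel server: require a client cert signed
    ; by the same internal CA
    [from-hs]
    accept  = 8443
    connect = 127.0.0.1:80
    cert    = /etc/stunnel/lb.pem
    key     = /etc/stunnel/lb.key
    CAfile  = /etc/stunnel/internal-ca.pem
    verify  = 2

With verify = 2 on both sides, a MITM between the two boxes would need
a certificate issued by the internal CA - which is exactly the extra
deployment work being pointed at above.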
|
| It's awesome that Facebook set up a Hidden Service. I'd love to get a
| lot more large orgs doing that. We should reach out and figure out
| what the blockers are, what's painful about it, and what we can do to
| help. I would love doing that, it would be awesome. (And I'm not
| afraid to NDA myself up if necessary, seeing as I'm under NDA with
| half of the Bay Area anyway.)
|
| --------------------
| e) Consider how EV treatment could be used to improve poor .onion readability
|
| This is the trickiest one, and it overlaps the most with the question
| of "Should we encourage CAs to issue certificates?"
|
| EV treatment in Tor Browser is a tool in the toolbox. I think it
| would be wasteful not to make use of code that is already written and
| of users who are accustomed to seeing it. I also think it dovetails
| nicely with how unreadable HS addresses are and how much more
| unreadable they're going to get soon when they get longer.
|
| I don't want a system that _requires_ participating in the DNS or CA
| model. Free or paid, you still have to provide identifying information
| - and for an anonymity project I think we can all agree that's
| unacceptable. But as we hopefully expand hidden services to more and
| more corporate services - these organizations are legitimately
| concerned about (e.g.) phishing, and it's unreasonable to expect users
| to meticulously validate a .onion address. (Let alone how you find
| what the address should be validated against.)
|
| But a problem is that if we allow a .onion to certify anything it
| wants, it can certify any fraudulent information it wants.
| Bootstrapping off the other axis of Zooko's Triangle (Secure and Human
| Meaningful, but Centralized) is a way to combat that fraudulent
| information. (Not the only way, but a way.)
|
| Syrup-tan had an idea on irc: have a DV certificate sign a certificate
| that is valid for the .onion URL, and display the URL of the DV
| certificate. This doesn't eliminate phishing - I can register
| facebok.com and then get that displayed. But bootstrapping off DNS
| and DV certificates is a fairly low bar in terms of the cost to a
| .onion operator. (There are other concerns here; I'm not completely
| comfortable with repurposing the EV indicator in this way. Asa on irc
| had the good point that if we did this, maybe we'd want to change the
| EV green to another color just to be a little bit different. Not that
| I really expect users to notice that, though...)
|
| Allowing an organization to purchase an EV certificate from a CA, and
| displaying the organization's name in the address bar, is another way
| - albeit a very high bar in terms of cost to an onion operator.
|
| A petname system based off who-knows-what (for example the
| namecoin/sovereign-keys-like system of a land-grab, first-to-the-name
| approach) is a third, and would meet the goal of not requiring
| participation in the DNS and CA systems, but it's a high bar in terms
| of engineering effort for Tor.
|
| I think Tor Browser should do several of them. I think the EV
| certificates + partnering with CAs is dead simple and requires no
| engineering effort on the part of Tor Browser. So that's a win, and I
| think it's worth doing. But there should be at least one more solution
| in the short to long term (e.g. a petname approach). Unfortunately, if
| the time between now and the 'long term' solution is too long, it
| locks out everyone who can't get an EV cert - which is a legitimate
| concern. Perhaps after there's a spec Tor likes, some large
| organization concerned about preventing phishing could throw some
| engineering time at the problem.
|
| Anyway, if it's not clear, I am volunteering to work on these things
| as I'm able.
|
| -tom
|
| [0] https://ftt-uploads.s3.amazonaws.com/browser-ssl-ui-comparison.png
| [1] https://datatracker.ietf.org/doc/draft-grothoff-iesg-special-use-p2p-names/
| _______________________________________________
| tor-dev mailing list
| tor-dev@lists.torproject.org
| https://lists.torproject.org/cgi-bin/mailman/listinfo/tor-dev