Hi everyone,
We've been continuing our work on the download site (current version: https://wpapper.github.io/tor-download-web/). We're now looking for suggestions on providing downloads for censored countries.
Currently, we have a few ideas for making downloads accessible in censored countries:
1. Host the downloads directly on each mirror. While this would work, the combined size of all of the files is greater than GitHub's 1GB limit per repository. This means that we'd need to set up a script to pull from the mirror, set up a cron job to update the files, cull unnecessary files, etc. This solution gets complex very quickly, so it's not ideal.
2. Use an external download mirror that is not torproject.org. Could we use something like Amazon S3 or SourceForge? It's possible, but then we're just relying on another single point of failure. That isn't ideal; it would be better if the downloads were distributed.
3. Provide torrents to users in censored countries. Torrents seem much more difficult to block, which is good. I couldn't find any official TBB torrents, though.
4. Assume that the user is not living in a censored country. We could always assume that torproject.org is accessible, but we don't want to leave censored users unable to use the mirror.
Does anyone have any comments? Any other proposed solutions?
Thanks for your help.
Sincerely, Will Papper
On Sat, May 03, 2014 at 08:35:56PM -0400, william@papper.me wrote:
: We've been continuing our work on the download site (current version:
: https://wpapper.github.io/tor-download-web/
Looks great.
: ). We're now looking for suggestions on providing downloads for censored countries.
See https://trac.torproject.org/projects/tor/ticket/11568 for the general ticket.
Should we loop in tor-talk on this? They might have some additional ideas =)
William Papper wrote:
We're now looking for suggestions on providing downloads for censored countries.
I've been working on this recently with Satori [1][2], and decided to mirror on AWS, GitHub, and the Chrome Web Store (that last one is a logistical nightmare and not recommended [4]).
The reason is that these are places where there's a strong financial incentive for countries not to block or MITM them. That doesn't mean they won't wind up blocked or tampered with, but it makes it less likely. Both AWS and GitHub are also accessible in Iran and China.
- Host the downloads directly on each mirror
While this would work, the combined size of all of the files is greater than GitHub's 1GB limit per repository.
I've talked to GitHub about this -- specifically about distributing software -- and they said that it's a soft limit. I have repositories that are ~2GB which are fine. It might be better to divide into individual repos by language if you're concerned they might change their policies.
- Use an external download mirror that is not torproject.org
Could we use something like Amazon S3 or Sourceforge?
AWS is pretty straightforward, but I would not suggest SourceForge due to their advertising policies.
- Provide torrents to users in censored countries
This seems much more difficult to block, which is good. I couldn't find any official TBB torrents, though.
A potential problem [3] with this is that if an adversary becomes a seeder, they can tally the IP addresses of people trying to get hold of circumvention software. That's highly problematic for people who might get a knock at the door. Also, I'm not sure how likely it is that the torrent trackers would just get blocked.
- Assume that the user is not living in a censored country
Can you expand on this a bit?
best, Griffin
[1] https://github.com/glamrock/satori
[2] https://chrome.google.com/webstore/detail/satori/oncomejlklhkbffpdhpmhldlfambmjlf
[3] https://mailman.stanford.edu/pipermail/liberationtech/2014-March/013158.html
[4] So the process here is that one distributes unlisted "apps", which are .crx files. Within those compressed files are the TBB and a required manifest.json file. That's pretty straightforward, and nigh-unblockable, but automatically downloading a crx as a zip is difficult on Windows/Mac (easy on Linux). And there are currently 60 bundles total (30 for Linux). Making these could be scripted. Every Google Chrome Developer account maxes out at 20 apps or extensions, so we'd still need to create/verify 2-3 accounts if we wanted full language support. Like I said, a logistical nightmare, but I do it for Arabic, Farsi, and Chinese because the tradeoffs are IMO worth it (and 6 is no big deal).
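As a rough illustration of the packaging step in [4], each unlisted "app" is just a directory holding the bundle plus the required manifest.json, which then gets zipped and uploaded. Everything below (paths, file names, manifest values) is hypothetical, not Satori's real layout:

```shell
#!/bin/sh
set -e
# Hypothetical sketch of the per-language app directory from [4].
# All names and manifest values here are illustrative.
app=/tmp/tbb-fa
rm -rf "$app" && mkdir -p "$app"
echo placeholder > "$app/tor-browser-fa.tar.xz"  # stand-in for the real TBB
cat > "$app/manifest.json" <<'EOF'
{
  "manifest_version": 2,
  "name": "TBB mirror (fa)",
  "version": "3.6.0",
  "app": { "launch": { "web_url": "https://example.org/" } }
}
EOF
# The directory is then zipped, uploaded as an unlisted app (max 20 per
# developer account), and the Web Store serves it to users as a .crx.
ls "$app"
```

Since the layout is identical per language, a loop over the 60 bundles could generate all of them, which is what makes the approach scriptable despite the account limits.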
If we can go up to 1.7GB, then that's not a problem. There could also be a simple script set up to clone tbb-bin (https://github.com/glamrock/tbb-bin) if GitHub does start to enforce the limit on our repo, or we could start looking at external sources. My ideal is that someone can just use "git clone" and have a working mirror, so I'd prefer for the script to be a backup plan.
Is tbb-bin currently updated by a script, or is everything done manually?
Tor Website Team coordination mailing-list
To unsubscribe or change other options, please visit: https://lists.torproject.org/cgi-bin/mailman/listinfo/www-team
William Papper wrote:
If we can go up to 1.7GB, then that's not a problem. There could also be a simple script set up to clone tbb-bin (https://github.com/glamrock/tbb-bin) if GitHub does start to
Is tbb-bin currently updated by a script, or is everything done manually?
Everything is done manually. If this gets used as the source for mirror downloads, I'd likely remove the version numbers (with a mention in the readme) so that out-of-date pages will continue to link to working bundles. And right now I blow away the repo entirely instead of updating it, but I can just set up a different process to redact the old bundle commits and add new ones. (Rather than have git store all the outdated bundles, which could get problematic when cloning).
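The renaming step could be scripted along these lines; a minimal sketch, with illustrative file names rather than the real bundle names:

```shell
#!/bin/sh
set -e
# Sketch of stripping version numbers from bundle names so stale
# pages keep linking to working files (file names are illustrative).
rm -rf /tmp/bundles && mkdir -p /tmp/bundles && cd /tmp/bundles
touch tor-browser-linux64-3.6_en-US.tar.xz tor-browser-linux64-3.6_fa.tar.xz
for f in tor-browser-linux64-*_*.tar.xz; do
  # e.g. tor-browser-linux64-3.6_en-US.tar.xz -> tor-browser-linux64_en-US.tar.xz
  mv "$f" "$(echo "$f" | sed 's/-[0-9][0-9.]*_/_/')"
done
ls /tmp/bundles
```

The version can still be recorded in the readme, as suggested, so mirrors know what they are serving.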
~Griffin
That would be great. I could then add your repo as a submodule, and someone would just need to pull both for updates. I assume that it would be pretty easy for you to reset the commits, since only new versions will be added and there would be nothing in the commit history you would need to preserve.
Alternatively, if the commits themselves cannot be removed easily, a new repo could be created for each version. Then, the submodules reference can just be updated instead of updating the submodule itself.
We could also use releases on the tor-download-web repo, and then have a script download the files for each release automatically. The script can be updated through git to take care of updating the files.
Which solution seems best?
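For the submodule option, the moving parts might look like this sketch; repository names and layout are illustrative, and local paths stand in for the GitHub remotes:

```shell
#!/bin/sh
set -e
# Sketch of the submodule option: the site repo pins the binaries repo.
rm -rf /tmp/subdemo && mkdir -p /tmp/subdemo && cd /tmp/subdemo
git init -q tbb-bin
(cd tbb-bin && git config user.email you@example.org \
  && git config user.name demo \
  && echo bundle > tbb.tar.xz && git add . && git commit -qm "add bundles")
git init -q tor-download-web && cd tor-download-web
git config user.email you@example.org && git config user.name demo
git commit -q --allow-empty -m "site"
# Newer git blocks file-protocol submodules by default; allow it for the demo.
git -c protocol.file.allow=always submodule add -q ../tbb-bin tbb-bin
git commit -qm "add tbb-bin as a submodule"
# A mirror operator would then only need:
#   git clone --recursive <site-url>            # first-time setup
#   git pull && git submodule update --remote   # pulling updates
git submodule status
```

One caveat: the submodule pins a specific commit, so if the binaries repo's history is rewritten, the pinned commit disappears and the superproject must update its reference.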
William Papper wrote:
That would be great. I could then add your repo as a submodule, and someone would just need to pull both for updates. I assume that it would be pretty easy for you to reset the commits, since only new versions will be added and there would be nothing in the commit history you would need to preserve.
I'd obliterate old commits with
git filter-branch --index-filter 'git update-index --remove filename' HEAD
so the old binaries don't stick around.
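A throwaway-repo walkthrough of that command, with hypothetical bundle names, showing the old binary leaving the history:

```shell
#!/bin/sh
set -e
# Demo of the filter-branch approach in a scratch repo (names illustrative).
export FILTER_BRANCH_SQUELCH_WARNING=1  # newer git pauses with a warning otherwise
rm -rf /tmp/fbdemo && git init -q /tmp/fbdemo && cd /tmp/fbdemo
git config user.email you@example.org && git config user.name demo
echo v3.5 > tbb-3.5.tar.xz && git add . && git commit -qm "add 3.5"
echo v3.6 > tbb-3.6.tar.xz && git add . && git commit -qm "add 3.6"
# Rewrite every commit, dropping the outdated bundle from the index;
# --prune-empty discards commits that end up changing nothing.
git filter-branch -f --prune-empty \
  --index-filter 'git update-index --remove tbb-3.5.tar.xz' HEAD
# No commit in the rewritten history touches the old bundle any more:
git log --oneline -- tbb-3.5.tar.xz
# Note: clones made before the rewrite must be re-cloned, since the
# rewrite changes every affected commit ID.
```

The rewritten branch then has to be force-pushed to GitHub for the change to reach the mirror.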
Alternatively, if the commits themselves cannot be removed easily, a new repo could be created for each version. Then, the submodules reference can just be updated instead of updating the submodule itself.
I like the idea of having it all in one repo (or at most divided by language), so that it's easier to maintain and get, and easy for GitHubbers to track changes to. Though I might change the repository name.
We could also use releases on the tor-download-web repo, and then have a script download the files for each release automatically. The script can be updated through git to take care of updating the files.
Which solution seems best?
Might be good to check in with Runa and Weasel. Not sure if they're on this list.
~Griffin
Tor image from the landing page reduced by 19% with no loss of resolution.
Thanks Earl. What techniques did you use?
On Sun, May 4, 2014 at 8:34 PM, Earl G globallogins@gmail.com wrote:
Tor image from the landing page reduced by 19% with 0 loss or resolution
The Google PageSpeed plugin for Chrome. It also shows other optimizations, but they are things such as minifying the JS. In my opinion, keeping the code readable for security reasons is much better than minifying and optimizing to speed up loading by 10%.
On Mon, May 5, 2014 at 3:02 AM, William Papper william@papper.me wrote:
Thanks Earl. What techniques did you use?
I was trying out filter-branch on a test repo, and it appeared that the repo needed to be re-cloned after using filter-branch. Will this be a problem? It seems that in a worst case scenario I could just create a new submodule each time.
On Sat, 2014-05-03 at 20:35 -0400, William Papper wrote:
- Host the downloads directly on each mirror
While this would work, the combined size of all of the files is greater than GitHub's 1GB limit per repository. This means that we'd need to set up a script to pull from the mirror, set up a cron job to update the files, cull unnecessary files, etc. This solution gets complex very quickly, so it's not ideal.
Indeed, this is the way I'd go.
A properly set up system, with just one directory to sync and everything inside it to be sync'd, sounds ideal to me. We can easily organize it that way and use rsync (by hand or scripted) to keep mirrors in sync. We could also have four layers of mirrors:
1. Official push mirrors (we will have the key to update them)
2. Official pull mirrors (mirrors listed on our page with a pull schedule: updated twice a day, once a day, once a week...)
3. Non-official listed pull mirrors (volunteers and organizations like universities can set these up on their own)
4. Hidden pull mirrors (volunteer-run mirrors that can be provided to needing users in the same way as bridges)
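For layers 2 and 3, the pull side could be as simple as a cron job running rsync against the master's single download directory. Everything below (host, module name, local path, schedule) is illustrative, not a real torproject.org endpoint:

```shell
# Hypothetical crontab entry for a layer-2 "official pull" mirror:
# sync the one download directory from the master twice a day.
0 4,16 * * *  rsync -a --delete rsync://mirror-master.example.org/tor-downloads/ /var/www/tor-mirror/
```

The --delete flag also culls files removed upstream, so the cleanup step happens automatically on every sync.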
Regards
Envite