Hi,
Clodo clodo@clodo.it wrote:
On 05/05/2015 16:19, balooni@espiv.net wrote:
Clodo wrote:
I'm the creator of the non-profit service http://www.neumon.org . It's a project similar to OONI, but focused only on DNS and HTTP.
Do you have the code published somewhere?
We released the source of the probe here: https://github.com/AirVPN/neumon-probe It's written in C#/Mono. I run it from Raspbian on a Raspberry Pi.
We are creating a Raspberry Pi image for ooniprobe [1]. You are more than welcome to use ooniprobe and start submitting reports to the OONI backend. We are about to launch a more effective database solution for the pipeline.
But it's not a great piece of software. It simply fetches the list of domains to resolve/fetch from our backend, does so, and sends back the results. All detection is server-side.
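The fetch → measure → send-back loop described above could be sketched like this in Python (the backend URL and the JSON record schema here are hypothetical, not the real neumon.org API; the actual probe is written in C#/Mono):

```python
# Minimal sketch of the probe loop, under assumptions: BACKEND and the
# record schema are made up for illustration. All detection stays
# server-side, so the probe only records raw observations.
import json
import socket
import urllib.request

BACKEND = "https://backend.example.org"  # placeholder URL

def summarize(domain, ips, body):
    """Build the raw record sent back to the backend (hypothetical schema)."""
    return {"domain": domain, "ips": sorted(set(ips)), "body_length": len(body)}

def measure(domain):
    """Resolve a domain and fetch its front page, recording raw results."""
    try:
        ips = [ai[4][0] for ai in socket.getaddrinfo(domain, 80)]
    except OSError as exc:
        return {"domain": domain, "dns_error": str(exc)}
    try:
        with urllib.request.urlopen("http://%s/" % domain, timeout=10) as resp:
            return summarize(domain, ips, resp.read())
    except OSError as exc:
        return {"domain": domain, "ips": sorted(set(ips)), "http_error": str(exc)}

def run_once():
    """Fetch the domain list, measure each domain, and send results back."""
    with urllib.request.urlopen(BACKEND + "/domains") as resp:
        domains = json.loads(resp.read())
    results = [measure(d) for d in domains]
    req = urllib.request.Request(
        BACKEND + "/results",
        data=json.dumps(results).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```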
How did you compile this list of domains? Was it reported by other users, or did you just use some public blacklists?
The backend is written in PHP; the sources were never released. It basically contains a lot of MySQL queries to detect things and generate reports.
Interesting, care to share these queries?
Generally, I have a lot of data, collected automatically, that requires manual work to turn into nice, clean reports; I'm stalled on this kind of work.
Let me know if you would like help on this.
In any case it would be very interesting to see these results, or at least the ones that can be made public.
We also built probe software to allow other activists connected directly to the ISP to launch it and detect censorship not based on DNS.
It would be very interesting to instruct the probe software to submit results to an OONI backend [2]. In any case the probe software could maybe even be written as an ooniprobe test [3].
I understand you already have some DNS tests in ooniprobe; I will study them. But actually I don't understand which lists of domains are tested by OONI, how you detect spoofing, and where/if your results are published.
Currently we use the URL lists maintained by citizenlab [2]. These lists are far from complete, but they cover a variety of potentially blocked websites, per country or globally. And every user can provide their own list of websites/domains.
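The citizenlab test-lists [2] are plain CSV files; a minimal sketch of reading one with the standard csv module follows (the column names are taken from the repository's CSV header and may change over time, and the sample row is made up):

```python
# Sketch of parsing a citizenlab test-list CSV. The inline sample
# stands in for a real file such as lists/it.csv from the repository.
import csv
import io

sample_csv = io.StringIO(
    "url,category_code,category_description,date_added,source,notes\n"
    "http://example.org/,NEWS,News Media,2015-05-05,,\n"
)

urls = [row["url"] for row in csv.DictReader(sample_csv)]
print(urls)
```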
An example of a relevant test is the http_requests test [3], which fetches a website/domain over the probe's Internet connection and via Tor, then compares the two responses and checks whether the body proportions match.
The reports are being published here [4]. As I mentioned above, we are working on a newer database and pipeline implementation.
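The body comparison behind the http_requests test can be approximated with a small ratio check. This is a sketch of the idea, not OONI's exact implementation; in particular the 0.7 threshold below is illustrative:

```python
# Compare the body fetched over the local network ("control" via Tor,
# "experiment" via the probe's connection) by size. A very different
# size suggests the local response may be a block page.

def body_proportion(len_control, len_experiment):
    """Ratio of the smaller body length to the larger (1.0 = same size)."""
    if len_control == 0 or len_experiment == 0:
        return 0.0
    return min(len_control, len_experiment) / max(len_control, len_experiment)

def looks_blocked(len_control, len_experiment, threshold=0.7):
    # Threshold is an assumption for this sketch, not OONI's value.
    return body_proportion(len_control, len_experiment) < threshold
```

For example, a 5000-byte page fetched via Tor against a 100-byte local response yields a proportion of 0.02 and would be flagged, while 4800 vs 5000 bytes (0.96) would not.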
My MySQL data is around 25 GB. I think it's maybe better (for maintenance and independence) not to create OONI tests linked to the neumon.org project. It's maybe better if I create some web services on neumon.org to expose my data, from which the OONI backend can fetch interesting data for your research.
It would be really nice if you could provide and/or submit this data to the OONI database pipeline. In any case we could collaborate on analyzing this data.
For example, I can provide a list of DNS servers we detected (open to query and with recursion enabled). Or I can provide a list of tuples: open/recursive DNS server IP -> queried domain "xxx" -> resulting IP address that is probably a blocking page.
Indeed, this list would be very useful.
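The tuples described above (resolver IP, queried domain, returned IP) could be checked against a set of known blocking-page addresses. A minimal sketch, where the sample records and the KNOWN_BLOCK_PAGES set are made up (documentation-range IPs), not real neumon.org data:

```python
# Flag resolver answers that point at a known blocking page.
# All addresses below are illustrative (RFC 5737 documentation ranges).
KNOWN_BLOCK_PAGES = {"192.0.2.10", "192.0.2.11"}

def classify(records):
    """records: iterable of (resolver_ip, domain, answer_ip) tuples.
    Returns only the records whose answer is a known block-page IP."""
    return [
        (resolver_ip, domain, answer_ip)
        for resolver_ip, domain, answer_ip in records
        if answer_ip in KNOWN_BLOCK_PAGES
    ]

sample = [
    ("203.0.113.53", "example.org", "93.184.216.34"),
    ("203.0.113.53", "blocked.example", "192.0.2.10"),
]
print(classify(sample))  # only the record resolving to a known block page
```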
[1] https://github.com/anadahz/lepidopter
[2] https://github.com/citizenlab/test-lists
[3] https://github.com/TheTorProject/ooni-spec/blob/master/test-specs/ts-003-htt...
[4] https://ooni.torproject.org/reports/
Cheers ~anadahz