Yikes, thanks for getting these, Karsten! I don't think we should omit the earlier results, since the Python community is still very much split between 2.7 and 3.x. I'll include both so users know they can upgrade their interpreter to get a nice little speed boost.
Thanks!
On Fri, Jan 15, 2016 at 5:43 AM, Karsten Loesing karsten@torproject.org wrote:
On 14/01/16 17:22, Damian Johnson wrote:
Oh, I forgot to talk about compression. You can run the Stem script against compressed tarballs, but Python didn't add lzma support until Python 3.3...
https://stem.torproject.org/faq.html#how-do-i-read-tar-xz-descriptor-archive...
I suppose we could run over bz2 or gz tarballs, or upgrade Python. But I can't say the compressed benchmark is overly important.
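For reference, a minimal sketch of reading such a tarball with Stem's parse_file(), assuming the files carry @type annotations as CollecTor archives do; on Python older than 3.3 the .xz case fails because the lzma module is missing:

# Minimal sketch: count descriptors in a CollecTor tarball with Stem.
# Reading .tar.xz archives needs the lzma module (Python 3.3+ only).
from stem.descriptor import parse_file

count = 0
for desc in parse_file('server-descriptors-2015-11.tar.xz'):
    count += 1

print('parsed %i descriptors' % count)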
I just ran all the Stem measurements using Python 3, which means the results now include the xz tarballs as well. The full results are below (percentages are relative to metrics-lib):
server-descriptors-2015-11.tar.xz:
- metrics-lib: 0.334261 ms
- Stem[**]: 0.63 ms (188%)
server-descriptors-2015-11.tar:
- metrics-lib: 0.28543 ms
- Stem: 1.02 ms (357%)
- Stem[**]: 0.63 ms (221%)
server-descriptors-2015-11/:
- metrics-lib: 0.682293 ms
- Stem: 1.11 ms (163%)
- Stem[**]: 1.03 ms (151%)
- Zoossh: 0.458566 ms (67%)
extra-infos-2015-11.tar.xz:
- metrics-lib: 0.274610 ms
- Stem[**]: 0.46 ms (168%)
extra-infos-2015-11.tar:
- metrics-lib: 0.2155 ms
- Stem: 0.68 ms (316%)
- Stem[**]: 0.42 ms (195%)
consensuses-2015-11.tar.xz:
- metrics-lib: 255.760446 ms
- Stem[**]: 913.12 ms (357%)
consensuses-2015-11.tar:
- metrics-lib: 246.713092 ms
- Stem: 1393.10 ms (565%)
- Stem[**]: 876.09 ms (355%)
consensuses-2015-11/:
- metrics-lib: 283.910864 ms
- Stem: 1303.53 ms (459%)
- Stem[**]: 873.45 ms (308%)
- Zoossh: 83 ms (29%)
microdescs-2015-11.tar.xz[*]:
- metrics-lib: 0.099397 ms
- Stem[**]: 0.33 ms (332%)
microdescs-2015-11.tar[*]:
- metrics-lib: 0.066566 ms
- Stem: 0.66 ms (991%)
- Stem[**]: 0.34 ms (511%)
[*] The microdescs* tarballs contain both microdesc consensuses and microdescriptors, but I only cared about the latter; so I extracted the tarballs, deleted the microdesc consensuses, and re-created and re-compressed the tarballs.
[**] Run with Python 3.5.1
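For what it's worth, a rough sketch of how per-item numbers like the above could be produced with Stem; this is an assumption about the harness, not the actual benchmark script, and it reports mean wall-clock time per parsed descriptor or document:

# Hypothetical timing harness (not the actual benchmark script): parse a
# tarball with Stem and report the mean wall-clock time per parsed item.
import sys
import time

from stem.descriptor import parse_file

start = time.time()
count = sum(1 for _ in parse_file(sys.argv[1]))
elapsed = time.time() - start

print('%i items in %.2f s (%.4f ms each)' % (count, elapsed, 1000.0 * elapsed / count))

Run as e.g. "python3 bench.py server-descriptors-2015-11.tar" (the script name is made up).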
Is Python 3 really that much faster than Python 2? Should we just omit Python 2 results from this comparison?
All the best, Karsten