On 05/04/14 17:46, Lukas Erlacher wrote:
Hello Nikita, Karsten,
On 04/05/2014 05:03 PM, Nikita Borisov wrote:
On Sat, Apr 5, 2014 at 3:58 PM, Karsten Loesing <karsten@torproject.org> wrote:
Installing packages using Python-specific package managers is going to make our sysadmins sad, so we should have a very good reason for wanting such a package. In general, we don't need the latest and greatest package. Unless we do.
What about virtualenv? Part of the premise behind it is that you can configure appropriate packages as a developer / operator without having to bother sysadmins and making them worried about system-wide effects.
- Nikita
I was going to mention virtualenv as well, but I have to admit that I find it weird and scary, especially since I haven't found good documentation for it. If there is somebody who is familiar with virtualenv, that would probably be the best solution.
I'm afraid I don't know enough about Python or virtualenv. So far, it was almost zero effort for our sysadmins to install a package from the repositories and keep that up-to-date. I'd like to stick with the apt-get approach and save the virtualenv approach for situations when we really need a package that is not contained in the repositories.
Thanks for the suggestion, though!
On 04/05/2014 04:58 PM, Karsten Loesing wrote:
My hope with challenger is that it's written quickly, works quietly for a year, and then disappears without anybody noticing. I'd rather not maintain yet another thing. So, maybe Weather is a better candidate for using onion-py than challenger.
Yes, I understand.
Yeah, I think we'll want to define a maximum lifetime of cache entries, or the poor cache will explode pretty soon.
What usage patterns do we have to expect? Do we want to hit onionoo to check if the cache is still valid for every request, or should we do "hard caching" for several minutes? The best UX solution would be to have a background task that keeps the cache current so user requests can be delivered without hitting onionoo at all.
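For the "check if the cache is still valid for every request" option, Onionoo's protocol supports conditional requests: send the cached document's timestamp in an If-Modified-Since header and a 304 response means the cached copy is still fresh. A minimal sketch of building that header value (the function name is illustrative, not from onion-py):

```python
from datetime import datetime, timezone

def if_modified_since(cached_at):
    """Format a cached entry's UTC timestamp as an HTTP date
    suitable for an If-Modified-Since request header."""
    return cached_at.strftime("%a, %d %b %Y %H:%M:%S GMT")

# A document cached at this moment produces a header value like
# "Sat, 05 Apr 2014 15:03:00 GMT"; a 304 reply means the cache is fresh.
header = if_modified_since(datetime(2014, 4, 5, 15, 3, 0, tzinfo=timezone.utc))
```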
That's a fine question. I can see various caching approaches here. But I just realized that this is premature optimization. Let's first build the thing and download whatever we need, whenever we need it. And once we know what caching needs we have, let's build the cache.
In other words, unless we do something intelligent with the cache, the cache is not actually going to be very useful.
Valid point. :)
Great, your help would be much appreciated! Want to send me a pull request whenever you have something to merge?
Will do.
Great. Thanks!
All the best, Karsten