On Fri, 22 Apr 2016 14:58:45 +0200 Jeff Burdges burdges@gnunet.org wrote:
On Fri, 2016-04-22 at 11:10 +0000, Yawning Angel wrote:
On Fri, 22 Apr 2016 11:41:30 +0200 Jeff Burdges burdges@gnunet.org wrote:
I'd imagine everyone in this thread knows this, but New Hope requires that "both parties use fresh secrets for each instantiation".
Yep. Alice can cache the public 'a' parameter, but everything else needs to be fresh, or really really bad things happen.
I'd assume that 'a' could be generated by both parties from a seed parameter declared by Alice? I haven't noticed it being secretly tweaked by Alice.
Indeed. The paper (and code) uses a 256-bit seed value and SHAKE128.
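To sketch the idea in Python (illustrative only; the reference code's exact byte parsing and sampling differ): both sides expand the 32-byte public seed with SHAKE128 and rejection-sample coefficients mod q, so they deterministically derive the same 'a' and Alice has no room to tweak it.

    import hashlib

    N = 1024   # New Hope ring dimension
    Q = 12289  # New Hope modulus

    def generate_a(seed):
        # Both parties run this on the same public 32-byte seed and
        # obtain the same polynomial 'a'.
        assert len(seed) == 32
        out_len = 4 * N
        stream = hashlib.shake_128(seed).digest(out_len)
        coeffs, i = [], 0
        while len(coeffs) < N:
            if i + 2 > len(stream):
                # SHAKE output is prefix-stable, so just squeeze more.
                out_len *= 2
                stream = hashlib.shake_128(seed).digest(out_len)
            val = int.from_bytes(stream[i:i+2], "little")
            i += 2
            if val < Q:
                # Rejection sampling keeps the coefficients uniform mod q.
                coeffs.append(val)
        return coeffs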
If 'a' must be sent, then 'a' would double the key size from one side, no? In that case, one should consider whether reversing the Alice and Bob roles of the client and server would help by either (a) adjusting the traffic flow to mask circuit building, or (b) allowing one to stash 'a' in the descriptor. I dunno if there are any concerns like the client needing to influence 'a'.
There's still an asymmetry in the amount of data sent by each party because (b, seed) and (u, r) are different lengths, with r being larger than seed (so Bob's response is longer).
The easiest thing to do would be to append padding to (b, seed) (which is what the code I'm writing that uses this primitive does). Stashing 'a' in the descriptor won't save much (eliminating one SHAKE-128 call per side isn't enough of a performance gain to justify not randomizing 'a' for every key exchange, IMO).
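Rough numbers, going by the reference implementation's encodings (pad_alice_msg() here is just a hypothetical helper, not anyone's actual API):

    import os

    POLY_BYTES = 1792   # 1024 coefficients packed at 14 bits each
    SEED_BYTES = 32     # 256-bit seed used to derive 'a'
    REC_BYTES  = 256    # Bob's reconciliation data 'r'

    ALICE_BYTES = POLY_BYTES + SEED_BYTES   # (b, seed) -> 1824 bytes
    BOB_BYTES   = POLY_BYTES + REC_BYTES    # (u, r)    -> 2048 bytes

    def pad_alice_msg(msg):
        # Append random padding so both directions carry the same number
        # of bytes on the wire; the receiver discards the trailing bytes.
        assert len(msg) == ALICE_BYTES
        return msg + os.urandom(BOB_BYTES - ALICE_BYTES)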
There is some chance SIDH might wind up being preferable for key exchanges with long term key material.
Maybe. Some people have hinted at an improved version of SPHINCS256 being in the works as well.
Ain't clear how that'd work.
There are homomorphic-MAC-like constructions, like accumulators, which might let you reuse your Merkle tree. I thought these usually needed assumptions that Shor's algorithm nukes; one example is f(x,y) = x^7 + 3*y^7 mod N with N = pq secret.
I suppose Nyberg's accumulator scheme is post-quantum, but I thought it's huge too. I'm not completely sure if just an accumulator helps anyways.
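As a toy illustration of the kind of assumption I mean (purely illustrative, not a real scheme): the secret structure is the factorization N = pq, which is exactly what Shor's algorithm recovers.

    # Toy only: the exponent-7 form is just the example from above, and
    # the primes are far too small for anything real. The point is that
    # the hidden structure is N = p*q, which quantum factoring exposes.
    p, q = 10007, 10009
    N = p * q

    def f(x, y):
        return (pow(x, 7, N) + 3 * pow(y, 7, N)) % N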
I didn't get details unfortunately.
My current assumption is that by the time we need to seriously start thinking about auth, and need to design/deploy a new construct around that as part of our threat model, both signature and key exchange algorithms will be more fleshed out. But in the immediate future, going by what's available and practical pushes me towards NTRUEncrypt/Ring-LWE.
Regards,