
Another look at public key crypto posted January 2016

I was watching this excellent video on the birth of elliptic curves by Dan Boneh, and I felt like the explanation of Diffie-Hellman (DH) fell short. In the video, Dan goes on to explain the typical DH key exchange:

Alice and Bob agree on a public base \(g\) and a public modulus \(N\).

By the way, if you've ever heard of "DH-1024" or some big number associated with Diffie-Hellman, that was probably the bitsize of this public modulus \(N\).

The exchange then goes like this:

  1. Alice generates her private key \(a\) and sends her public key \(g^a\pmod{N}\) to Bob.

  2. Bob generates his own private key \(b\) and sends his public key \(g^b\pmod{N}\) to Alice.

  3. They can both generate their shared key by doing either \((g^b)^a \pmod{N}\) for Alice, or \((g^a)^b \pmod{N}\) for Bob.
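Here's a toy sketch of that exchange in Python. The tiny modulus is only there to make the numbers readable; a real deployment would use a standardized large prime and base (e.g. a 2048-bit MODP group), and the variable names are just for illustration.

```python
import secrets

# Toy public parameters (illustration only: a real deployment would use
# a large standardized prime and base, not 23 and 5).
N = 23   # public modulus
g = 5    # public base

# Each side picks a private exponent at random.
a = secrets.randbelow(N - 2) + 1   # Alice's private key
b = secrets.randbelow(N - 2) + 1   # Bob's private key

# They exchange public keys g^a mod N and g^b mod N.
A = pow(g, a, N)   # Alice sends this to Bob
B = pow(g, b, N)   # Bob sends this to Alice

# Each side raises the other's public key to its own private exponent.
shared_alice = pow(B, a, N)   # (g^b)^a mod N
shared_bob   = pow(A, b, N)   # (g^a)^b mod N

assert shared_alice == shared_bob   # both are g^(ab) mod N
```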

Dan then explains why this is secure: because given \((g, g^a, g^b)\) (the only values an eavesdropper can observe from this exchange) it's very hard to compute \(g^{ab}\), and this is called the Computational Diffie-Hellman problem (CDH).

But this doesn't really explain how the scheme works. You could wonder: but why doesn't the attacker do the exact same thing Bob and Alice just did? He could just iterate the powers of \(g\) until \(g^a\) or \(g^b\) is found, right?

A key exchange with hash functions

Let's replace the exponentiation with a hash function. Don't worry, I'll explain:

\(g\) will be our public input and \(h\) will be our hash function (e.g. sha256). One more thing: \(h^{3}(g)\) translates to \(h(h(h(g)))\).
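To make the notation concrete, here's a minimal sketch with sha256, where \(g\) is just some public byte string and the helper name `h_iter` is made up for illustration:

```python
import hashlib

def h(x: bytes) -> bytes:
    """One application of the hash function (sha256 here)."""
    return hashlib.sha256(x).digest()

def h_iter(n: int, x: bytes) -> bytes:
    """n applications of h, i.e. h^n(x) = h(h(...h(x)...))."""
    for _ in range(n):
        x = h(x)
    return x

g = b"some public starting value"
assert h_iter(3, g) == h(h(h(g)))   # h^3(g) is just h(h(h(g)))
```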

So now our key exchange looks like this:

  1. Alice generates a large enough integer \(a\), computes \(a\) iterations of the hash function \(h\) over \(g\) (that is, \(h^a(g)\)), then sends the result to Bob.

  2. Bob does the same with an integer \(b\) and sends \(h^b(g)\) to Alice (exact same thing that Alice did, different phrasing.)

  3. They both compute the shared key by doing either \((h^b(g))^a\) for Alice, or \((h^a(g))^b\) for Bob (where the outer exponent just means applying \(h\) that many more times to the other's public key).

So if you understood the last part: Alice and Bob both iterated the hash function on the starting input \(g\) a total of \(a+b\) times. If Alice's public key was \(h(h(g))\) and Bob's public key was \(h(h(h(g)))\) then they both end up computing \(h(h(h(h(h(g)))))\).
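Here's that toy exchange spelled out with sha256 (a sketch: the iteration counts are tiny so it runs instantly, and the names are made up for illustration):

```python
import hashlib

def h_iter(n: int, x: bytes) -> bytes:
    """Apply sha256 n times: h^n(x)."""
    for _ in range(n):
        x = hashlib.sha256(x).digest()
    return x

g = b"public starting value"

a = 2              # Alice's secret number of iterations
b = 3              # Bob's secret number of iterations

A = h_iter(a, g)   # Alice's public key h^a(g) = h(h(g))
B = h_iter(b, g)   # Bob's public key   h^b(g) = h(h(h(g)))

# Each side keeps iterating on the other's public key.
shared_alice = h_iter(a, B)   # h^a(h^b(g))
shared_bob   = h_iter(b, A)   # h^b(h^a(g))

assert shared_alice == shared_bob == h_iter(a + b, g)   # h^(a+b)(g)
```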

That seems to work. But how is this scheme secure?

You're right, it is not. The attacker can just hash \(g\) over and over until he finds either Alice's or Bob's public key.
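Concretely, here's a sketch of that attack (same sha256 iteration as above; the secret count is made up for the example):

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

g = b"public starting value"

# Setup: Alice's secret count, and the public key the eavesdropper observed.
a = 1000                      # Alice's secret (made-up for the example)
alice_pub = g
for _ in range(a):
    alice_pub = h(alice_pub)

# The attack: hash g over and over until we hit Alice's public key.
candidate, count = g, 0
while candidate != alice_pub:
    candidate = h(candidate)
    count += 1

# The attacker did exactly as much work as Alice did: no asymmetry at all.
assert count == a
```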

So let's ask ourselves this question: how could we make it secure?

If Bob or Alice had a way to compute \(h^c(x)\) without doing all \(c\) hash computations one by one, then he or she would take way less time to compute their public key than an attacker would take to retrieve it.

Back to our discrete logarithm in finite groups

This makes it easier to understand how the normal DH exchange in finite groups is secure.

The usual assumptions we want for DH to work were nicely summed up in Boneh's talk:

[slide from Boneh's talk summarizing the DH assumptions]

The point of view here is that discrete log is difficult AND CDH holds.

Another way to see this is that we have algorithms to quickly calculate \(g^c \pmod{N}\) without having to iterate through every integer before \(c\).

To be more accurate: the algorithms we have for quickly exponentiating numbers in finite groups are way faster than the ones we have for computing the discrete logarithm of elements of finite groups. Thanks to these shortcuts, the good folks can quickly compute their public keys while the bad folks have to do all the work.
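The classic shortcut is square-and-multiply, which is roughly what Python's three-argument pow does under the hood. Here's a minimal sketch: it computes \(g^c \pmod{N}\) with about \(\log_2 c\) squarings instead of \(c\) multiplications (the example values are arbitrary).

```python
def fast_pow(g: int, c: int, N: int) -> int:
    """Square-and-multiply: computes g^c mod N with ~log2(c) squarings,
    instead of the c-1 multiplications of the naive approach."""
    result, base = 1, g % N
    while c > 0:
        if c & 1:                         # if the current bit of c is set...
            result = (result * base) % N  # ...multiply the current power of g in
        base = (base * base) % N          # square: base goes g, g^2, g^4, g^8, ...
        c >>= 1
    return result

# Even with a 30-digit exponent this loop runs only ~100 times, while
# recovering c from g^c mod N (the discrete log) has no comparably cheap
# shortcut in well-chosen groups.
g, N, c = 5, 2**127 - 1, 10**30
assert fast_pow(g, c, N) == pow(g, c, N)
```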

Well done! You've reached the end of my post. Now you can leave a comment or read something else.

Comments

Samuel Laferriere

This article was super enlightening to me, thank you. I've seen cryptography summarized as "hiding things in the exponent" (https://crypto.stackexchange.com/a/3671/103619), but your blog adds the needed emphasis that "crypto is all about EFFICIENTLY hiding things in the exponent". And trapdoor/one-way functions are my current mental models for thinking about this efficiency (given a, there's a fast way to compute g^a, but an attacker that doesn't know a needs to try all possibilities, aka the discrete logarithm is hard).

The comparison between the exponential map and iterative hash helped me understand the point, but looking back there's two things I'd like to point out:
1) I don't like the notation (h^a(g))^b because it makes it look like we are exponentiating the result of the hash by b, whereas we really want h^b(h^a(g))
2) The comparison between iterated hashing and the exponential map is somewhat unfair, since it is hashing itself that should be compared to the exponential map (both of them are one-way functions!). Thus another way to write this/compare both is to compare the functions
g: x -> g*x (note that we overload g here to both refer to the function and the group element)
h: x -> h(x)
and then write exponentiation (which for functions represents iteration) as h^(ab)(1) vs g^(ab)(1). Here the conclusion is that the exponentiation (iteration) of the map g is efficiently computable, whereas the exponentiation (iteration) of the map h is not. (Also note that we don't need to hash g, we can just repeatedly hash 1. This makes the analogy much stronger!)
