In contrast with his previous posts, this one does not seem to contain much in the way of novelty, as nearly everything is already contained in the ed25519 paper [1]. To summarize, ECDSA is bad because (in order of gravity):
- It requires cryptographic randomness for each individual signature;
- As a NIST standard, it is defined over the NIST elliptic curves, which are not particularly implementation-friendly;
- The standards give little to no guidance on implementation issues;
- It is slower than necessary, requiring inversions during signing.
DSA was a step backwards from the Schnorr scheme, which was the superior option at the time. The blog post (but not the ed25519 paper) seems to forget that Schnorr was never practically adopted due to patent issues, much like IDEA, OCB, and many other schemes left in the patented-algorithm wasteland. Legend has it that DSA was designed with the express purpose of avoiding Schnorr's patent while still resulting in a similar scheme. Since the patent expired in 2008, this is no longer a concern, and certainly not for a signature scheme designed in 2011.
> It requires cryptographic randomness for each individual signature
This is false. Although the default algorithm does work that way, RFC6979 lets you do ECDSA entirely with pseudorandom numbers seeded by the message and key.
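For intuition, the trick can be sketched in a few lines. This is *not* the actual RFC 6979 HMAC_DRBG construction (which is more involved), just the core idea: derive the nonce k from the private key and the message, so signing needs no fresh randomness. The group order `q` here is a made-up toy value.

```python
import hashlib
import hmac

# Hypothetical toy group order (not from the thread; any prime order works).
q = 1019

def deterministic_nonce(secret_key: int, message: bytes) -> int:
    # Derive k from the private key and the message hash alone, so the
    # same (key, message) pair always yields the same nonce and no fresh
    # randomness is needed at signing time.
    msg_hash = hashlib.sha256(message).digest()
    key_bytes = secret_key.to_bytes(32, "big")
    mac = hmac.new(key_bytes, msg_hash, hashlib.sha256).digest()
    return int.from_bytes(mac, "big") % (q - 1) + 1  # clamp into [1, q-1]

k1 = deterministic_nonce(42, b"message")
k2 = deterministic_nonce(42, b"message")
print(k1 == k2)   # True: re-signing the same message reuses the same nonce
k3 = deterministic_nonce(42, b"other message")  # a different message diverges
```

The point is that nonce reuse across *different* messages (the classic ECDSA footgun) cannot happen, because the nonce is a deterministic function of the message itself.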
It's a little weird to refer to this as "false", since Pornin's RFC was published after EdDSA. Bernstein isn't writing an elaborate blog post about Bitcoin; he references Bitcoin as a way to provide context for readers who might not be familiar with actual protocols that use ECDSA.
The overwhelming majority of ECDSA applications do not use deterministic DSA; deterministic DSA is a novelty.
EdDSA uses something very similar to RFC 6979, so the author had to have realized that retroactively adding that small piece to ECDSA was a possibility. So I don't think it's really fair to call determinism an intrinsic property of either algorithm.
It's not clear to me what the variables are in his equations:
> The ElGamal signature system: B^H(M) = A^R R^S.
H = hash, M = message, S = secret... What about all the others? I'll spend some time today reading more about this; if anybody has suggestions on where to start, I'd be grateful.
Meta: this has been on the front page for more than an hour but has no comments; are you 54 up-voters reading it?
R and S are the two signature elements. A is the public key, B is the common base. That is, A = B^a in prime fields, or A = a * B on elliptic curves. R is a random element used for each signature, usually R = k * B for randomly generated k.
Note that those equations are only covering the verification part of the scheme.
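To make the notation concrete, here is a toy ElGamal sign/verify round in a tiny prime field. The parameters (p = 2039, base B = 7, the keys, the nonce) are my own insecure picks for readability, not anything from the post; exponent arithmetic is done mod p-1, the order of the group.

```python
import hashlib
from math import gcd

# Tiny toy group (completely insecure; for illustration only).
p = 2039              # prime modulus
B = 7                 # generator of the multiplicative group mod p

def h(msg: bytes) -> int:
    # Hash the message to an exponent modulo the group order p-1.
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % (p - 1)

a = 1234              # secret key
A = pow(B, a, p)      # public key A = B^a

def sign(msg: bytes, k: int):
    assert gcd(k, p - 1) == 1        # k must be invertible mod p-1
    R = pow(B, k, p)                 # per-signature element R = B^k
    # Solve H(M) = a*R + k*S (mod p-1) for S; note the inversion of k.
    S = (h(msg) - a * R) * pow(k, -1, p - 1) % (p - 1)
    return R, S

def verify(msg: bytes, R: int, S: int) -> bool:
    # The verification equation from the post: B^H(M) == A^R * R^S (mod p).
    return pow(B, h(msg), p) == pow(A, R, p) * pow(R, S, p) % p

R, S = sign(b"hello", k=5)
print(verify(b"hello", R, S))                  # True
print(verify(b"hello", R, (S + 1) % (p - 1)))  # False: tampered S fails
```

Note how signing needs a modular inversion of k; that inversion is exactly what the later Schnorr-style rearrangement avoids.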
I don't think they are related per se. Given A = B^a and R = B^r (not explicitly stated in the first paragraph about ElGamal), S is calculated as S = (H(R,M) - a)/r at the "Merge the hashes" step.
At the next step, he gives a completely new formula that looks similar: B^S = R * A^H(R,M) = B^r * (B^a)^H(R,M). However, this one is solved by S = r + a*H(R,M), which is much easier to calculate, since there are no divisions.
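The two solutions above can be checked numerically side by side. The parameters below are my own toy choices (p = 2039 = 2*1019 + 1, and 4 is a quadratic residue mod p, so B = 4 generates a subgroup of prime order q = 1019); the verification equation paired with the first formula is my reading of the post.

```python
import hashlib

# Toy subgroup: B has prime order q inside the multiplicative group mod p.
p, q, B = 2039, 1019, 4

def H(R: int, msg: bytes) -> int:
    # Hash the commitment R together with the message to an exponent mod q.
    data = R.to_bytes(4, "big") + msg
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

a = 123                  # secret key
A = pow(B, a, p)         # public key A = B^a
r = 777                  # per-signature secret (random in a real scheme)
R = pow(B, r, p)         # commitment R = B^r
e = H(R, b"msg")

# First variant: solving H(R,M) = a + r*S (mod q) for S needs an
# inversion of r.
S1 = (e - a) * pow(r, -1, q) % q
print(pow(B, e, p) == A * pow(R, S1, p) % p)    # True: B^H(R,M) = A * R^S

# Schnorr-style variant: S = r + a*H(R,M) needs no inversion at all.
S2 = (r + a * e) % q
print(pow(B, S2, p) == R * pow(A, e, p) % p)    # True: B^S = R * A^H(R,M)
```

Both signatures verify, but only the first requires computing r^-1 mod q, which is the slowdown the blog post complains about in ECDSA signing.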
[1] http://ed25519.cr.yp.to/ed25519-20110926.pdf