The National Institute of Standards and Technology (NIST) lacks independence and uncritically adhered to the wishes of US electronic eavesdroppers in releasing a weakened random-number generator in 2006. So says a group of mathematicians and computer scientists in a new report commissioned by the lab following the leaking of documents last year by the former National Security Agency (NSA) contractor Edward Snowden. According to those documents, the NSA designed an encryption algorithm to include a "back door" so that it could copy encryption keys from internet users without their knowledge. The algorithm was approved by NIST, which itself develops cryptography technology and advises US companies and government agencies on electronic security issues.

Random-number generators are at the heart of encryption on the Internet. In particular, they provide the 1s and 0s that make up many of the keys used to encipher and decipher communications – in e-mail exchange, banking and medicine, for example. Because sequences of truly random numbers are notoriously difficult to generate, encryption online instead relies on deterministic "pseudo-random" processes. These usually involve taking a more-or-less random "seed" – such as the data associated with the timings of key strokes or hard-drive accesses – and then stretching that seed into a long sequence using a specially designed algorithm.
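The seed-stretching idea can be sketched in a few lines of Python. This is an illustrative toy only – it simply iterates a hash function over the seed, loosely in the spirit of hash-based generators, and is not a vetted random-number generator design:

```python
# Minimal sketch of "stretching" a seed into a pseudo-random stream by
# repeatedly hashing it (illustrative only -- not a vetted DRBG design).
import hashlib

def stretch(seed: bytes, n_blocks: int):
    state = seed
    for _ in range(n_blocks):
        state = hashlib.sha256(state).digest()
        yield state  # 32 pseudo-random bytes per round

# A "seed" might come from keystroke timings or hard-drive access times;
# the byte string below is a made-up stand-in.
blocks = list(stretch(b"keystroke-timing-data", 3))
```

The output is deterministic: anyone who knows the seed can reproduce the whole stream, which is why the quality of both the seed and the stretching algorithm matters.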

The NSA algorithm – Dual Elliptic Curve Deterministic Random Bit Generator (Dual_EC_DRBG) – relies on the fact that elliptic curves can be used to construct "one-way functions". This means that while it is straightforward to multiply a point on such a curve – a pair of x, y co-ordinates – by an integer to obtain a second point, it is very hard to reverse the operation and recover the original point. Dual_EC uses an elliptic curve with two publicly declared points – P and Q. The algorithm multiplies Q by a factor that is initially derived from a pseudo-random seed and then removes some of the bits from the x co-ordinate of the resulting point. The programme then re-sets the factor by multiplying the old factor by P and taking the x co-ordinate of the result, and the new factor is subsequently multiplied by Q to produce the next output. Repeating this cycle many times, Dual_EC should produce a long string of pseudo-random bits.

Trivial for an attacker

However, there is a snag. It turns out that all future (and past) outputs can be predicted if an attacker can recover just one full set of output co-ordinates from the truncated x value – a relatively trivial task, given that Dual_EC, unlike other similar algorithms, cuts off very few of the 1s and 0s describing the x co-ordinate (just two out of 32 bytes) – and, crucially, if that attacker knows the mathematical relationship between P and Q.

If P and Q were themselves selected purely at random then this kind of attack, cryptographers say, would be practically impossible. If, however, the person setting up the algorithm chooses the values non-randomly – in other words, builds in a back door – it becomes vulnerable. Many cryptographers believe that the NSA probably knows the relationship between P and Q, and therefore has a back door allowing it to decipher encoded communications.
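The construction and its back door can be demonstrated end-to-end on a toy elliptic curve. Everything below is made up for illustration – a tiny curve over GF(103) and a small secret multiplier – whereas the real Dual_EC_DRBG uses the NIST P-256 curve and 256-bit values; the structure of the attack, however, is the same: knowing d with P = d·Q turns one output point into the next internal state.

```python
# Toy sketch of the Dual_EC construction and its back door, on a tiny
# elliptic curve y^2 = x^3 + 2x + 3 over GF(103).  All parameters are
# made up for illustration; the real algorithm uses NIST P-256.

P_MOD, A, B = 103, 2, 3           # tiny illustrative curve parameters

def ec_add(p1, p2):
    """Add two curve points (None represents the point at infinity)."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P_MOD)
    x3 = (m * m - x1 - x2) % P_MOD
    return (x3, (m * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, pt):
    """Scalar multiplication by double-and-add."""
    result = None
    while k:
        if k & 1:
            result = ec_add(result, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return result

Q = (3, 6)                        # public point on the toy curve
d = 5                             # secret back-door relationship...
P = ec_mul(d, Q)                  # ...P = d*Q; only the designer knows d

def dual_ec_like(seed, n):
    """Each round: output x(s*Q), then update the state s to x(s*P)."""
    s, outs = seed, []
    for _ in range(n):
        outs.append(ec_mul(s, Q)[0])
        s = ec_mul(s, P)[0]
    return outs

outs = dual_ec_like(seed=7, n=5)

# The attack: from one output x-value, recover the full point R = s*Q
# (here by brute-forcing y; in the real algorithm the few truncated bits
# of x are brute-forced too).  Then d*R = s*(d*Q) = s*P, whose x
# co-ordinate is the next internal state -- and with it every future output.
x0 = outs[0]
y0 = next(y for y in range(P_MOD)
          if (y * y - x0 ** 3 - A * x0 - B) % P_MOD == 0)
state = ec_mul(d, (x0, y0))[0]    # recovered internal state
predicted = []
for _ in range(4):
    predicted.append(ec_mul(state, Q)[0])
    state = ec_mul(state, P)[0]

print(predicted == outs[1:])      # → True: the attacker predicts the stream
```

Note that both candidate y values for a given x lead to the same predicted x co-ordinates, since d·(−R) = −(d·R), so the attacker does not even need to know which root is correct.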

Being more open

When the story about Dual_EC and its alleged back door broke last September, NIST responded to the "community concern" by opening the standard containing the algorithm – SP 800-90, which also contains three other random-number generators not under suspicion – to "public comment". NIST then announced in April that it had decided to remove the offending algorithm from the standard. In the meantime, NIST also asked its Visiting Committee on Advanced Technology to investigate how the organization could improve its standards work in the future. The committee then entrusted that task to a specially appointed panel of seven experts.

The panel commended NIST for being "forthcoming, open and transparent" in responding to its enquiries, but concluded that these qualities were sometimes lacking when NIST develops cryptographic standards. In individually submitted assessments of what had gone wrong, many of the panel members also said that NIST had made a mistake in approving the algorithm in the first place, arguing that it had done so because it was overly trusting of the NSA.

NIST failed to exercise independent judgment but instead deferred extensively to the NSA
Edward Felten, Princeton University

Panellist Edward Felten, a computer scientist at Princeton University, argues that NIST should not have allowed the NSA to provide the values of P and Q, or, as a minimum, should have asked the agency to provide evidence of the variables' randomness. "NIST failed to exercise independent judgment but instead deferred extensively to the NSA," he wrote.

Fellow panellist Bart Preneel of KU Leuven in Belgium believes that NIST has "lost its credibility" and estimates that it will need "several years" to regain the trust of providers and users of Internet services. "It is clear that this could only happen because in some sense NIST was misled by the NSA," he says.