Alan Turing’s Everlasting Contributions to Computing, AI and Cryptography

An Enigma machine on display outside the Alan Turing Institute entrance inside the British Library, London. Credit: ©Shutterstock/William Barton

By René Peralta, a scientist with the Computer Security Division at the National Institute of Standards and Technology (NIST)

Suppose someone asked you to devise the most powerful computer possible. Alan Turing, whose reputation as a central figure in computer science and artificial intelligence has only grown since his untimely death in 1954, applied his genius to problems such as this one in an age before computers as we know them existed. His theoretical work on this problem and others remains a foundation of computing, AI and modern cryptographic standards, including those NIST recommends.

The road from devising the most powerful computer possible to cryptographic standards has a few twists and turns, as does Turing’s brief life.

Alan Turing. Credit: © National Portrait Gallery, London

In Turing’s time, mathematicians debated whether it was possible to build a single, all-purpose machine that could solve all problems that are computable. For example, we can compute a car’s most energy-efficient route to a destination, and (in principle) the most likely way in which a string of amino acids will fold into a three-dimensional protein. Another example of a computable problem, important to modern encryption, is whether a given number can be expressed as the product of two smaller numbers. For example, 6 can be expressed as the product of 2 and 3, but 7 cannot be factored into smaller integers and is therefore a prime number.
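
To see why this counts as computable, consider a brute-force sketch: trying every candidate divisor in turn is guaranteed to settle the question, however slowly. A few lines of Python make the point (this is an illustration, not how real cryptanalysis works):

    def smallest_factor(n):
        """Return the smallest factor of n between 2 and n - 1,
        or None if n is prime. Plain trial division: always correct,
        hopelessly slow for the huge numbers used in cryptography."""
        d = 2
        while d * d <= n:        # a composite n has a factor no bigger than sqrt(n)
            if n % d == 0:
                return d
            d += 1
        return None

    print(smallest_factor(6))    # 2, because 6 = 2 x 3
    print(smallest_factor(7))    # None: 7 is prime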

Some prominent mathematicians proposed elaborate designs for universal computers that would operate by following very complicated mathematical rules. It seemed overwhelmingly difficult to build such machines. It took the genius of Turing to show that a very simple machine could in fact compute all that is computable.

His hypothetical device is now known as a “Turing machine.” The centerpiece of the machine is a strip of tape, divided into individual boxes. Each box contains a symbol (such as A, C, T, G for the letters of genetic code) or a blank space. The strip of tape is analogous to today’s hard drives that store bits of data. Initially, the string of symbols on the tape corresponds to the input, containing the data for the problem to be solved. The string also serves as the memory of the computer. The Turing machine writes onto the tape data that it needs to access later in the computation.

A Turing machine’s tape: a row of boxes containing the symbols T, C, A, A, G, C, with the read head pointing to the second A. Credit: NIST

The device reads an individual symbol on the tape and follows instructions on whether to change the symbol or leave it alone before moving to another symbol. The instructions depend on the current “state” of the machine. For example, if the machine needs to decide whether the tape contains the text string “TC,” it can scan the tape in the forward direction while switching between the states “previous letter was T” and “previous letter was not T.” If while in state “previous letter was T” it reads a “C,” it goes to a state “found it” and halts. If it encounters the blank symbol at the end of the input, it goes to the state “did not find it” and halts. Nowadays we would recognize the set of instructions as the machine’s program.
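
That little machine is easy to simulate. The sketch below (a loose illustration in Python, not Turing’s own formalism) encodes the two states and the tape-scanning loop just described:

    BLANK = " "

    def find_TC(tape):
        """Scan the tape for the string "TC" using two states:
        "q0" = previous letter was not T, "qT" = previous letter was T."""
        rules = {}
        for s in "ACGT":
            rules[("q0", s)] = "qT" if s == "T" else "q0"
            rules[("qT", s)] = "found it" if s == "C" else ("qT" if s == "T" else "q0")

        state, head = "q0", 0
        cells = list(tape) + [BLANK]     # the input, with a blank marking the end
        while state not in ("found it", "did not find it"):
            symbol = cells[head]
            if symbol == BLANK:
                state = "did not find it"
            else:
                state = rules[(state, symbol)]
                head += 1                # move forward along the tape
        return state

    print(find_TC("TCAAGC"))   # found it
    print(find_TC("AAGGC"))    # did not find it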

It took some time, but eventually it became clear to everyone that Turing was right: The Turing machine could indeed compute all that seemed computable. No number of additions or extensions to this machine could extend its computing capability.

To understand what can be computed it is helpful to identify what cannot be computed. In a previous life as a university professor I had to teach programming a few times. Students often encounter the following problem: “My program has been running for a long time; is it stuck?” Deciding this question is called the Halting Problem, and students often wondered why we simply couldn’t detect infinite loops without actually getting stuck in them. It turns out that such a program is an impossibility. Turing showed that there does not exist a machine that detects whether or not another machine halts. From this seminal result followed many other impossibility results. For example, logicians and philosophers had to abandon the dream of an automated way of detecting whether an assertion (such as whether there are infinitely many prime numbers) is true or false, as that is uncomputable. If you could do this, then you could solve the Halting Problem simply by asking whether the statement “this machine halts” is true or false.
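
Turing’s argument can be retold in a few lines of code. Suppose, for the sake of contradiction, that someone handed us a working halting detector; the self-referential program below (a Python sketch of the diagonal argument, with names of my own invention) defeats any such detector:

    def halts(program, data):
        """A claimed halting detector. It must answer True or False;
        this stub answers True, but the trap below works either way."""
        return True

    def paradox(program):
        # Do the opposite of whatever the detector predicts about program(program).
        if halts(program, program):
            while True:              # predicted to halt, so loop forever
                pass
        # predicted to loop forever, so halt immediately

    # Consider paradox(paradox). Our halts returned True, yet paradox(paradox)
    # would loop forever -- the detector is wrong. Had it returned False,
    # paradox(paradox) would halt at once -- wrong again. No possible
    # implementation of halts escapes this trap, so none can exist.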

Turing went on to make fundamental contributions to AI, theoretical biology and cryptography. His involvement with this last subject brought him honor and fame during World War II, when he played a very important role in adapting and extending cryptanalytic techniques invented by Polish mathematicians. This work broke the German Enigma machine encryption, making a significant contribution to the war effort.

Turing was gay. After the war, in 1952, the British government convicted him for having sex with a man. He stayed out of jail only by submitting to what is now called “chemical castration.” He died in 1954 at age 41 by cyanide poisoning, which was initially ruled a suicide but may have been an accident according to subsequent analysis. More than 50 years would pass before the British government apologized and “pardoned” him (after years of campaigning by scientists around the world). Today, the highest honor in computer science is called the Turing Award.

Turing’s computability work provided the foundation for modern complexity theory. This theory tries to answer the question “Among those problems that can be solved by a computer, which ones can be solved efficiently?” Here, “efficiently” means not in billions of years but in milliseconds, seconds, hours or days, depending on the computational problem.

For example, much of the cryptography that currently safeguards our data and communications relies on the belief that certain problems, such as decomposing an integer into its prime factors, cannot be solved before the Sun turns into a red giant and consumes the Earth (currently forecast for 4 billion to 5 billion years). NIST is responsible for cryptographic standards that are used throughout the world. We could not do this work without complexity theory.
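
A rough back-of-the-envelope calculation shows the scale involved. The numbers below are my own illustrative assumptions (naive trial division against a 2048-bit number on an exascale machine; the best known classical algorithms are much faster, yet still utterly impractical):

    import math

    bits = 2048                        # size of a typical RSA modulus
    ops_per_second = 10 ** 18          # a generous, exascale-class machine
    seconds_per_year = 3.156e7

    # Trial division must try on the order of sqrt(N) = 2**(bits/2) divisors.
    log10_years = (bits / 2) * math.log10(2) - math.log10(ops_per_second * seconds_per_year)
    print(f"roughly 10^{log10_years:.0f} years")   # ~10^283 years, vs. ~5 x 10^9 left for the Sun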

Technology sometimes throws us a curve, such as the discovery that if a sufficiently big and reliable quantum computer were built, it would be able to factor integers, thus breaking some of our cryptography. In this situation, NIST scientists must rely on the world’s experts (many of them in-house) in order to update our standards. There are deep reasons to believe that quantum computers will not be able to break the cryptography that NIST is about to roll out. Among these reasons is that Turing’s machine can simulate quantum computers. This implies that complexity theory gives us limits on what a powerful quantum computer can do.
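
That simulation claim can be made concrete. The sketch below (my own illustration, using NumPy) shows a classical program simulating a one-qubit quantum computer: the quantum state is just a short vector, and a gate is a matrix multiplication. For n qubits the vector grows to 2^n entries, which is why the simulation is possible in principle yet exponentially expensive, and why it yields theoretical limits rather than a practical attack:

    import numpy as np

    # A classical computer simulating a one-qubit quantum computer.
    # For n qubits the state vector would have 2**n complex entries.

    state = np.array([1, 0], dtype=complex)          # qubit starts in |0>
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate

    state = H @ state                                # apply the gate
    probabilities = np.abs(state) ** 2               # Born rule: measurement odds

    print(probabilities)   # [0.5 0.5] -- equal chance of measuring 0 or 1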

But that is a topic for another day. For now, we can celebrate how Turing provided the keys to much of today’s computing technology and even gave us hints on how to solve looming technological problems.

This post originally appeared on Taking Measure, the official blog of the National Institute of Standards and Technology (NIST) on June 23, 2022.


About the Author


René Peralta received a B.A. in Economics from Hamilton College in 1978. In 1980 he received an M.S. in Mathematics from the State University of New York at Binghamton. In 1985 he received a Ph.D. in Computer Science from the University of California at Berkeley. For the next 20 years he held various positions in academia, mostly as a professor of cryptology, algorithmics and computational number theory. In 2005 he moved to NIST. He is currently a scientist with the Computer Security Division. Among the projects he is currently involved in are the NIST Randomness Beacon, Circuit Complexity, Privacy Enhancing Cryptography, and Post-Quantum Cryptography.

