Meet Twist: MIT’s Quantum Programming Language

While machine learning has been around a long time, deep learning has taken on a life of its own lately. The reason for that has mostly to do with the increasing amounts of computing power that have become widely available—along with the burgeoning quantities of data that can be easily harvested and used to train neural networks.

The amount of computing power at people’s fingertips started growing in leaps and bounds at the turn of the millennium, when graphical processing units (GPUs) began to be harnessed for nongraphical calculations, a trend that has become increasingly pervasive over the past decade. But the computing demands of deep learning have been rising even faster. This dynamic has spurred engineers to develop electronic hardware accelerators specifically targeted to deep learning, Google’s Tensor Processing Unit (TPU) being a prime example.

Here, I will describe a very different approach to this problem—using optical processors to carry out neural-network calculations with photons instead of electrons. To understand how optics can serve here, you need to know a little bit about how computers currently carry out neural-network calculations. So bear with me as I outline what goes on under the hood.

Almost invariably, artificial neurons are constructed using special software running on digital electronic computers of some sort. That software provides a given neuron with multiple inputs and one output. The state of each neuron depends on the weighted sum of its inputs, to which a nonlinear function, called an activation function, is applied. The result, the output of this neuron, then becomes an input for various other neurons.
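
To make that concrete, here is a minimal sketch in Python with NumPy of the computation one neuron performs; the tanh activation and all the numbers are illustrative choices of mine, not anything the article prescribes:

```python
import numpy as np

def neuron_output(inputs, weights, bias):
    """State of one artificial neuron: the weighted sum of its inputs
    passed through a nonlinear activation function."""
    weighted_sum = np.dot(weights, inputs) + bias
    return np.tanh(weighted_sum)  # tanh is one common activation choice

# Illustrative values: three inputs feeding a single neuron.
x = np.array([0.5, -1.2, 3.0])   # inputs arriving from other neurons
w = np.array([0.8, 0.1, -0.4])   # weights applied to those inputs
print(neuron_output(x, w, bias=0.2))  # this output feeds later neurons
```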

Reducing the energy needs of neural networks might require computing with light

For computational efficiency, these neurons are grouped into layers, with neurons connected only to neurons in adjacent layers. The benefit of arranging things that way, as opposed to allowing connections between any two neurons, is that it allows certain mathematical tricks of linear algebra to be used to speed the calculations.
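
A rough illustration of that trick, with layer sizes invented for the example: because neurons connect only to the adjacent layer, a single matrix-vector product evaluates an entire layer at once.

```python
import numpy as np

rng = np.random.default_rng(0)

# A layer of 4 neurons, each connected to all 3 neurons in the layer
# below: the 4x3 matrix W holds every connection weight, so a single
# matrix-vector product computes all four weighted sums in one go.
W = rng.standard_normal((4, 3))
b = rng.standard_normal(4)      # one bias per neuron
x = rng.standard_normal(3)      # outputs of the previous layer

layer_output = np.tanh(W @ x + b)   # the whole layer in one operation
print(layer_output.shape)           # (4,)
```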

While they are not the whole story, these linear-algebra calculations are the most computationally demanding part of deep learning, particularly as the size of the network grows. This is true for both training (the process of determining what weights to apply to the inputs for each neuron) and for inference (when the neural network is providing the desired results).

What are these mysterious linear-algebra calculations? They aren’t so complicated really. They involve operations on matrices, which are just rectangular arrays of numbers—spreadsheets if you will, minus the descriptive column headers you might find in a typical Excel file.

This is great news because modern computer hardware has been very well optimized for matrix operations, which were the bread and butter of high-performance computing long before deep learning became popular. The relevant matrix calculations for deep learning boil down to a large number of multiply-and-accumulate operations, whereby pairs of numbers are multiplied together and their products are added up.
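
Written out naively, a matrix product is nothing more than those multiply-and-accumulate steps. This sketch (matrix sizes chosen arbitrarily) makes each MAC explicit and counts them:

```python
import numpy as np

def matmul_with_mac_count(A, B):
    """Naive matrix product that tallies every multiply-and-accumulate."""
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((n, m))
    macs = 0
    for i in range(n):
        for j in range(m):
            acc = 0.0
            for p in range(k):
                acc += A[i, p] * B[p, j]  # one multiply, one add: one MAC
                macs += 1
            C[i, j] = acc
    return C, macs

A = np.ones((8, 16))
B = np.ones((16, 4))
C, macs = matmul_with_mac_count(A, B)
print(macs)  # 8 * 16 * 4 = 512 multiply-and-accumulate operations
```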

Over the years, deep learning has required an ever-growing number of these multiply-and-accumulate operations. Consider

MIT’s New Programming Language for Quantum Computing

Time crystals. Microwaves. Diamonds. What do these three disparate things have in common?

Quantum computing. Unlike conventional computers that use bits, quantum computers use qubits to encode information as zeros or ones, or both at the same time. Coupled with a cocktail of forces from quantum physics, these refrigerator-sized machines can process a whole lot of information, but they’re far from flawless. Just like our regular computers, we need the right programming languages to properly compute on quantum computers.
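
To put numbers on “zeros or ones, or both at the same time”: a qubit’s state can be modeled as a two-entry complex vector, and a superposition is simply a vector with weight on both entries. A toy NumPy illustration, not how real hardware stores a qubit:

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)  # the |0> state
one  = np.array([0, 1], dtype=complex)  # the |1> state

# An equal superposition: the qubit is "both at the same time"; squared
# amplitudes give the probability of measuring each classical value.
plus = (zero + one) / np.sqrt(2)
print(np.abs(plus) ** 2)  # [0.5 0.5]: a 50/50 chance of reading 0 or 1
```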

Programming quantum computers requires an awareness of something called “entanglement,” a computational multiplier for qubits of sorts, which translates to a lot of power. When two qubits are entangled, actions on one qubit can change the value of the other, even when they are physically separated, giving rise to Einstein’s characterization of “spooky action at a distance.” But that potency is equal parts a source of weakness. When programming, discarding one qubit without being mindful of its entanglement with another qubit can destroy the data stored in the other, jeopardizing the correctness of the program.
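
That failure mode can be simulated classically for small cases. The hedged sketch below builds a maximally entangled (Bell) pair and then discards one qubit by tracing it out; what remains of the other qubit is a maximally mixed state, meaning the information it carried is destroyed:

```python
import numpy as np

# Bell state (|00> + |11>) / sqrt(2): two maximally entangled qubits.
bell = np.zeros(4, dtype=complex)
bell[0b00] = bell[0b11] = 1 / np.sqrt(2)

# Density matrix of the pair, reshaped so the axes are (a, b, a', b'),
# then qubit B is discarded via a partial trace over its two axes.
rho = np.outer(bell, bell.conj()).reshape(2, 2, 2, 2)
rho_A = np.trace(rho, axis1=1, axis2=3)

print(rho_A.real)  # 0.5 * identity: qubit A is now a pure coin flip,
                   # and whatever the pair encoded is gone.
```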

Scientists from MIT have created a new programming language for quantum computing called Twist. Twist can describe and verify which pieces of data are entangled in a quantum program, through a language a classical programmer can understand. The language uses a concept called purity, which enforces the absence of entanglement and results in more intuitive programs, with ideally fewer bugs. For example, a programmer can use Twist to say that the temporary data generated as garbage by a program is not entangled with the program’s answer, making it safe to throw away.
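
The article doesn’t show Twist syntax, so the following is only a conceptual analogue in NumPy of the discipline purity enforces: a temporary qubit must be disentangled from the answer (here by uncomputing it with a second CNOT) before it is safe to throw away. The cnot and purity_of_ancilla helpers are hypothetical conveniences of mine, not part of Twist:

```python
import numpy as np

def cnot(state, control, target, n):
    """Apply a CNOT gate to an n-qubit state vector."""
    out = np.zeros_like(state)
    for idx in range(2 ** n):
        if (idx >> (n - 1 - control)) & 1:                   # control is 1
            out[idx ^ (1 << (n - 1 - target))] = state[idx]  # flip target
        else:
            out[idx] = state[idx]
    return out

def purity_of_ancilla(state):
    """Tr(rho^2) for qubit 1 of a 2-qubit state; 1.0 means unentangled."""
    rho = np.outer(state, state.conj()).reshape(2, 2, 2, 2)
    rho_B = np.trace(rho, axis1=0, axis2=2)   # trace out qubit 0
    return np.trace(rho_B @ rho_B).real

# Qubit 0 holds the answer (a superposition); qubit 1 is temporary scratch.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
state = np.kron(plus, np.array([1, 0], dtype=complex))

state = cnot(state, 0, 1, 2)      # compute: the temporary is now entangled
print(purity_of_ancilla(state))   # 0.5: NOT safe to discard qubit 1

state = cnot(state, 0, 1, 2)      # uncompute the temporary
print(purity_of_ancilla(state))   # 1.0: pure again, safe to throw away
```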

While the nascent field of quantum computing can feel flashy and futuristic, quantum computers have the potential for computational breakthroughs in classically unsolvable tasks, like cryptographic and communication protocols, search, and computational physics and chemistry. Credit: Graham Carlow/IBM

While the nascent field can feel a little flashy and futuristic, with images of mammoth wiry gold machines coming to mind, quantum computers have the potential for computational breakthroughs in classically unsolvable tasks, like cryptographic and communication protocols, search, and computational physics and chemistry. One of the key challenges in the computational sciences is dealing with the complexity of the problem and the amount of computation needed. Whereas a classical digital computer would need an exponentially large number of bits to be able to process such a simulation, a quantum computer could do it, potentially, using a very small number of qubits — if the right …
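
“Exponentially large” is easy to make concrete: merely storing the state of n qubits on a classical machine takes 2^n complex amplitudes. A quick back-of-the-envelope in Python:

```python
# Each added qubit doubles the classical state vector: n qubits need
# 2**n complex amplitudes, 16 bytes each at double precision.
for n in (10, 30, 50):
    amps = 2 ** n
    print(f"{n} qubits: 2**{n} = {amps:,} amplitudes = {amps * 16:,} bytes")
# 50 qubits already demand roughly 18 petabytes of classical memory.
```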
