How Quantum Computers and Machine Learning Will Revolutionize Big Data

Quanta Magazine. October 14, 2013

Large Hadron Collider (LHC) scientists rely on a vast computing grid of 160 data centers around the world, a distributed network that is capable of transferring as much as 10 gigabytes per second at peak performance.

The LHC’s approach to its big data problem reflects just how dramatically the nature of computing has changed over the last decade. Since Intel co-founder Gordon E. Moore first defined it in 1965, the so-called Moore’s law — which predicts that the number of transistors on integrated circuits will double every two years — has dominated the computer industry.
That growth rate has proved remarkably resilient, but “for now, at least, Moore’s law has basically crapped out; the transistors have gotten as small as people know how to make them economically with existing technologies,” said Scott Aaronson, a theoretical computer scientist at the Massachusetts Institute of Technology.

Alon Halevy, a computer scientist at Google, says the biggest breakthroughs in big data are likely to come from data integration.

Instead, since 2005, many of the gains in computing power have come from adding more parallelism via multiple cores, with multiple levels of memory.
The preferred architecture no longer features a single central processing unit (CPU) augmented with random access memory (RAM) and a hard drive for long-term storage.
Even the big, centralized parallel supercomputers that dominated the 1980s and 1990s are giving way to distributed data centers and cloud computing, often networked across many organizations and vast geographical distances.


D-Wave: The quantum company

19 June 2013

D-Wave is pioneering a novel way of making quantum computers — but it is also courting controversy.

Aerospace giant Lockheed Martin bought one of D-Wave’s computers in 2011 for about US$10 million, and Internet behemoth Google acquired one in May.

Using qubits made from superconducting loops of niobium, cooled to 20 millikelvin above absolute zero to keep them in their lowest energy states, D-Wave’s engineers created a usable computer …

there is no good theory to describe how quantum adiabatic computers will behave on a larger scale

speed should not be taken as proof of how the device is working. “Even if the machine does get to a solution faster than an ordinary laptop,” he says, “then you still face the question of whether that’s because of quantum effects, or because a team of people spent $100 million designing a special machine optimized to these types of problems.”

the real challenge, he says, will be the software.
“Programming this thing is ridiculously hard,” he admits; it can take months to work out how to phrase a problem so that the computer can understand it.
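Why is phrasing a problem so hard? D-Wave’s machines natively minimize a single kind of objective, a quadratic function over binary variables (a QUBO), so every problem must first be encoded as such a function. The toy sketch below (a hypothetical coefficient matrix chosen purely for illustration, with brute-force search standing in for what the annealer does physically) shows what that encoding looks like:

```python
from itertools import product

# Toy QUBO: minimize sum of Q[i,j] * x[i] * x[j] over binary x.
# "Programming" an annealer largely means choosing Q so that its
# minimum-energy assignment encodes the answer to your real problem.
# This particular Q is an assumption for illustration only: it
# penalizes x0 and x1 being 1 together and rewards x2 being 1.
Q = {
    (0, 0):  1.0, (1, 1):  1.0, (2, 2): -2.0,
    (0, 1):  3.0,                      # coupling between x0 and x1
}

def qubo_energy(x, Q):
    """Energy of a binary assignment x under the QUBO coefficients Q."""
    return sum(w * x[i] * x[j] for (i, j), w in Q.items())

# Exhaustive search over all 2^3 assignments, standing in for annealing.
best = min(product([0, 1], repeat=3), key=lambda x: qubo_energy(x, Q))
print(best, qubo_energy(best, Q))  # → (0, 0, 1) -2.0
```

The months of effort quoted above go into constructing a `Q` whose lowest-energy state faithfully represents the original problem, within the hardware's limited connectivity.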

How Does a Quantum Computer Work?

Jun 17, 2013

A quantum computer works in a totally different way from a classical computer.
Quantum bits or ‘qubits’ can exist in a superposition state of both zero and one simultaneously.
This means that a set of two qubits can be in a superposition of four states, so four numbers are needed to uniquely specify the state.
In general, describing the state of N qubits requires two to the power of N such numbers — exponentially more than the N values needed for N classical bits.
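That exponential growth is easy to see in a tiny state-vector simulation. The sketch below (a minimal NumPy illustration, not anything from the article) puts N qubits into an equal superposition by applying a Hadamard gate to each, and shows the state vector doubling in length with every qubit added:

```python
import numpy as np

def n_qubit_superposition(n):
    """Equal superposition of n qubits: apply a Hadamard gate to each
    qubit of the |0...0> state and return the full state vector."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # single-qubit Hadamard
    state = np.array([1.0])
    for _ in range(n):
        # Tensor in one more qubit, already rotated into superposition.
        state = np.kron(state, H @ np.array([1.0, 0.0]))
    return state

for n in (1, 2, 10):
    s = n_qubit_superposition(n)
    # The vector holds 2**n amplitudes, each equal for this state.
    print(n, len(s))
```

For two qubits this yields four amplitudes of 1/2 each, matching the four-state superposition described above; at 50 qubits the vector would already exceed a petabyte of classical storage.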