May 17, 2013

Alan Woodward is a Professor in the Department of Computing at the University of Surrey, where he specialises in cyber security, computer forensics, cryptography and steganography. Alan began with a degree in physics but did his postgraduate research in signal processing, at a time when computing power was beginning to enable some radical changes to what was possible. He began his career working for the UK government and has been involved in delivering some of the most challenging IT developments of the past 20 years for a variety of organisations. For the past 10 years he has been Chief Technology Officer of Charteris, a company he helped to build from a start-up and float on the London Stock Exchange. As well as writing extensively in the UK on technology and presenting on current affairs issues relating to technology for the likes of BBC TV and radio, Alan remains actively involved in the daily battles that occur in cyberspace. Follow him on Twitter @profwoodward.


This week I had a fascinating discussion on BBC Radio 4 with Dr Geordie Rose, the CTO of D-Wave, triggered by the news that NASA and Google are investing in D-Wave’s “quantum computer”. The idea is to set up a facility that is used by both NASA and Google but also allows academics to book time on the system to try out new ideas.

Our radio conversation brought out an important issue that has dogged this subject for several years: when is a quantum computer not a quantum computer?

I began by explaining the theory behind quantum computing and why such machines hold the promise of significantly faster processing. In essence, it relies upon the fact that whilst conventional “bits” can be 0 or 1, quantum bits (so-called qubits) can be both 0 and 1 at the same time (known as superposition). If you can link qubits together (known as entanglement), you have a system whose number of simultaneously represented states expands exponentially with the number of qubits you entangle. As with conventional programming, these qubits are passed through various logic gates to achieve the desired results. Hence, this is known as the “gate theory” of quantum computing.
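Superposition and entanglement can be illustrated with a tiny state-vector simulation in plain Python. This is, of course, a classical simulation: the amplitudes are stored explicitly, which is exactly what becomes infeasible at scale, since n qubits need 2**n amplitudes. The gate choice and qubit ordering here are illustrative, not tied to any particular hardware.

```python
import math

# A two-qubit state is a vector of 4 complex amplitudes, indexed by the
# basis states |00>, |01>, |10>, |11>.  n qubits need 2**n amplitudes,
# which is why the representable state space grows exponentially.
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def apply_hadamard_q0(s):
    """Put the first qubit into an equal superposition of 0 and 1."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]),   # new |00> amplitude
            h * (s[1] + s[3]),   # new |01>
            h * (s[0] - s[2]),   # new |10>
            h * (s[1] - s[3])]   # new |11>

def apply_cnot(s):
    """Flip the second qubit when the first is 1, entangling the pair."""
    return [s[0], s[1], s[3], s[2]]  # swap the |10> and |11> amplitudes

state = apply_cnot(apply_hadamard_q0(state))
# Result is the Bell state (|00> + |11>)/sqrt(2): the two qubits are now
# entangled, so measuring one instantly fixes the other.
print([round(a, 3) for a in state])  # [0.707, 0.0, 0.0, 0.707]
```

Passing the state through a sequence of such gates is the essence of the gate model the article describes; real machines must keep the qubits coherent while every gate is applied.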

Many academics, and increasingly large corporations such as IBM and Microsoft, have spent years working on the algorithms, error correction and a variety of techniques for creating qubits, ranging from photons to ion traps to braided anyons. To date, we have found it extremely difficult to maintain these qubits in superposition and to ensure they are truly entangled. “Decoherence”, where the qubits drop out of superposition and become just a 0 or a 1, is the bane of all quantum computer engineers.

This decoherence problem has spurred many to look for methods that are naturally immune to the effect. D-Wave was one such group. They have based their processor on an effect called quantum annealing, also sometimes referred to as adiabatic quantum computing, which was first discussed in 2000 as a possible means of conducting certain calculations.

The quantum annealing process is, as the name suggests, a quantum-level effect. At the scale of a qubit, you can use the effect to determine the lowest “energy” state of a system. Hence, if you can describe a problem in terms of a function that has a “cost” or “energy” versus some other parameter, you can find the configuration that represents the optimal state. So, for example, think of the classic travelling salesman problem, where one tries to find the shortest path when travelling between a number of cities. If you did this using simple trial and error on a conventional computer, it would take longer than the age of the universe by the time you were up to 30 cities. Using quantum annealing, you can define the problem as an optimisation task, which means you can programme a D-Wave system to calculate it.
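The optimisation framing can be sketched with classical simulated annealing, the conventional algorithm from which quantum annealing takes its name: cast the problem as an energy function (here, tour length) and let a cooling schedule settle the system towards a low-energy configuration. This is an illustrative sketch, not D-Wave's actual programming model; the city coordinates and cooling schedule are made up for the example.

```python
import math
import random

# Distinct round trips through n cities number (n - 1)!/2, which is why
# exhaustive search becomes hopeless long before 30 cities.
print(math.factorial(29) // 2)  # roughly 4.4e30 routes for 30 cities

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(20)]

def tour_length(order):
    """Total length of the closed tour visiting cities in the given order."""
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

# Simulated annealing: accept worse tours with a probability that shrinks
# as the "temperature" cools, so the search can escape local minima.
order = list(range(len(cities)))
temp = 1.0
while temp > 1e-3:
    i, j = sorted(random.sample(range(len(cities)), 2))
    candidate = order[:i] + order[i:j + 1][::-1] + order[j + 1:]  # reverse a segment
    delta = tour_length(candidate) - tour_length(order)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        order = candidate
    temp *= 0.999  # cool gradually

best = tour_length(order)
```

A quantum annealer pursues the same goal, finding a low-energy configuration of a cost function, but uses quantum effects such as tunnelling rather than thermal fluctuations to move through the search space.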

An obvious question is: how much faster is quantum annealing than conventional computing? Based upon solving specific problems, that question was addressed in a paper just published, where academics compared conventional computers with a D-Wave system when solving optimisation problems which were known to be computationally hard. The D-Wave system is reported as being many thousands of times faster in some cases.

Thus, we have a system that can do useful computations based on quantum effects. It may not be a quantum computer as some purists might define it, but it does have one huge advantage: it exists and is available to do meaningful work. For all the theory, quantum computers based upon gate theory are still very experimental and can muster only a handful of qubits. Gate-based quantum computing will come eventually; the money being invested and the screaming need for the technology as we head towards the end of Moore’s law mean that it is a question of when, not if. But on a journey of currently uncertain length, we should not be blind to the opportunities along the way. It may prove to be a detour, but many interesting developments in computing have arisen from those who spotted just such an opportunity.

So, is D-Wave’s system a quantum computer? I think that’s the wrong question. Better to ask whether the D-Wave system can help with some computations that were previously impractical, in which case the answer is yes.

**Images:**

D-Wave Systems, Inc., Arnab Das, WhiteTimberwolf, Saurabh.harsh



I’ve been following D-Wave, and the controversy surrounding it, for years now, and one thing that has recurrently struck me is the superficiality of the discussion and analysis about what has been, and what may yet be, accomplished by D-Wave’s approach. Essentially, the articles I’ve read have been little more than press releases by D-Wave with the words changed.

The one exception has been the blog of Scott Aaronson, “Shtetl-Optimized”. Aaronson is a professor of computer science at MIT, and he has a knack for explaining deep, essential concepts in his field in a witty, vivid way. A few years ago he posted on his blog a detailed explanation of Shor’s Algorithm so good that it impelled Peter Shor to leave a comment that it was the best explanation of his algorithm for laymen that he’d ever seen.

Well, in Aaronson’s May 16, 2013 blog post called “D-Wave: Truth Starts to Emerge” he presents an incredibly detailed analysis of what that “truth” really is. I strongly recommend that anyone interested in this subject peruse this blog post of Aaronson’s. I can’t do justice to it with a summary, but I’ll say this: though he concedes that entanglement may well be occurring, it’s entirely possible that D-Wave’s approach may never be able to translate that entanglement into a speed-up of computation over classical computing, which of course is the entire point of quantum computing.

With the announcement of the involvement of Google and NASA with D-Wave, everyone assumes that that means D-Wave’s approach MUST have validity. I suggest you withhold judgment on that until you’ve read Aaronson’s masterful dissection of what in fact has been going on in the experiments pitting D-Wave against classical computers.

I’m embarrassed to say that I myself bought some of the hype about D-Wave supposedly slaughtering the classical competition in these contests, and it was only by reading Aaronson’s blog post yesterday that I understood the pitiful truth about D-Wave’s ostensible accomplishments.

Aaronson’s gang now trolling other people’s articles, and as always shameless.

Can we say that when playing chess the human brain is doing quantum computing to arrive at optimal moves? Humans don’t evaluate every possible move and counter-move; instead, by working out a few probable winning moves and the most probable counter-moves to them in our minds, we choose to play one of them. A computer playing chess, by contrast, evaluates at lightning speed the majority of all possible moves and chooses its move, so that is not quantum computing using the technology of qubits or quantum entanglement.

Unlike barth, I haven’t been following D-Wave, but having worked in the area of computer performance evaluation & capacity planning in the previous millennium, I was skeptical when I saw the claim that D-Wave was ‘3,600 times faster’ being bandied about.

Unfortunately I’ve lost some of my notes & sources, but a few weeks ago I found that performance claim to be based on a ‘benchmark’ run on a network of (I’m pretty sure) 4-6 ‘IBM’ (Lenovo) commercial desktop computers – total cost < $10k! It was not even compared to a massively parallel supercomputer…

See http://blogs.nature.com/news/2013/05/quantum-computer-passes-speed-test.html

"McGeoch compared a 439-qubit version of D-Wave to a commercial product from IBM designed to solve the same sorts of problems. The IBM product is designed to deliver a confident answer to a given problem after 30 minutes. McGeoch found that D-Wave did just as well at finding the right answers, but in a half-second run time. That’s 3,600 times faster. “It was really amazing,” she says."

Those running the tests seem to have been under the impression that, since they were using some software product that generated algorithmic solutions for specific problems for various processor platforms, they were comparing ‘apples to apples’. As best I could determine, they were actually comparing a $10K PC with an approximately $10M quantum supercomputer specifically designed to solve only the class of problems being tested! At the very least, I think that a massively parallel supercomputer could have been used to compare against the D-Wave performance – to give a somewhat more valid performance comparison of multimillion-dollar computational solutions!

From what I could determine the '3600 times faster' headline is utter nonsense!
