Tuesday, September 1, 2015

Get used to it: quantum computing will bring immense processing possibilities

D-Wave, CC BY

The one thing everyone knows about quantum mechanics is its legendary weirdness, in which the basic tenets of the world it describes seem alien to the world we live in. Superposition, where things can be in two states simultaneously: a switch both on and off, a cat both dead and alive. Or entanglement, what Einstein called “spooky action at a distance”, in which objects are invisibly linked, even when separated by huge distances.

But weird or not, quantum theory is approaching a century old and has found many applications in daily life. As John von Neumann once said: “You don’t understand quantum mechanics, you just get used to it.” Much of electronics is based on quantum physics, and the application of quantum theory to computing could open up huge possibilities for the complex calculations and data processing we see today.

Imagine a computer processor able to harness superposition, to calculate the result of an arbitrarily large number of permutations of a complex problem simultaneously. Imagine how entanglement could be used to allow systems on different sides of the world to be linked and their efforts combined, despite their physical separation. Quantum computing has immense potential, making light work of some of the most difficult tasks, such as simulating the body’s response to drugs, predicting weather patterns, or analysing big datasets.

Replica of the first ever transistor, manufactured at Bell Labs in 1947. Lucent Technologies

Such processing possibilities are needed. The first transistors could only just be held in the hand, while today they measure just 14 nm – 500 times smaller than a red blood cell. This relentless shrinking, predicted by Intel founder Gordon Moore as Moore’s law, has held true for 50 years, but cannot hold indefinitely. Silicon can only be shrunk so far, and if we are to continue benefiting from the performance gains we have become used to, we need a different approach.
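
A quick back-of-envelope check of the numbers above (the ~7 micrometre red-blood-cell diameter is a standard textbook figure, not from the article):

```python
# Quick arithmetic behind the scale comparison in the text; the ~7
# micrometre red-blood-cell diameter is a typical textbook figure.
transistor_nm = 14
red_blood_cell_nm = 7_000
print(red_blood_cell_nm / transistor_nm)   # 500.0, matching "500 times smaller"

# Moore's law: a doubling of transistor density roughly every two years
# compounds to a ~33-million-fold increase over the law's 50-year run.
doublings = 50 / 2
print(2 ** doublings)  # 33554432.0
```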

Quantum fabrication

Advances in semiconductor fabrication have made it possible to mass-produce quantum-scale semiconductors – electronic circuits that exhibit quantum effects such as superposition and entanglement.

Quantum circuitry. Paul Koenraad/TU Eindhoven, Author provided

The image, captured at the atomic scale, shows a cross-section through one potential candidate for the building blocks of a quantum computer: a semiconductor nano-ring. Electrons trapped in these rings exhibit the strange properties of quantum mechanics, and semiconductor fabrication processes are poised to integrate the elements required to build a quantum computer. While we may be able to construct a quantum computer using structures like these, there are still major challenges involved.

In a classical computer processor a huge number of transistors interact conditionally and predictably with one another. But quantum behaviour is highly fragile; under quantum physics, even measuring the state of a system – such as checking whether a switch is on or off – actually changes what is being observed. Conducting an orchestra of quantum systems to produce useful output that couldn’t easily be handled by a classical computer is extremely difficult.

But there have been huge investments: the UK government announced £270m of funding for quantum technologies in 2014, for example, and the likes of Google, NASA and Lockheed Martin are also working in the field. It’s difficult to predict the pace of progress, but a useful quantum computer could be ten years away.

Building quantum computers. Michael Thompson, Lancaster Quantum Technology Centre, Author provided

The basic element of quantum computing is known as a qubit, the quantum equivalent of the bits used in traditional computers. To date, scientists have harnessed quantum systems to represent qubits in many different ways, ranging from defects in diamonds to semiconductor nano-structures or tiny superconducting circuits. Each of these has its own advantages and disadvantages, but none has yet met all the requirements for a quantum computer, known as the DiVincenzo Criteria.

The most impressive progress has come from D-Wave Systems, a firm that has managed to pack hundreds of qubits on to a small chip similar in appearance to a traditional processor.

Quantum secrets

The benefits of harnessing quantum technologies aren’t limited to computing, however. Whether or not quantum computing will extend or augment digital computing, the same quantum effects can be harnessed for other means. The most mature example is quantum communications.

Quantum physics has been proposed as a means to prevent forgery of valuable objects, such as a banknote or diamond, as illustrated in the image below. Here, the unusual negative rules embedded within quantum physics prove useful: perfect copies of unknown states cannot be made, and measurements change the systems they are measuring. These two limitations combine in this quantum anti-counterfeiting scheme, making it impossible to copy the identity stored in the object.

Adding a quantum secret to a standard barcode prevents tampering or forgery of valuable goods. Robert Young, Author provided

The concept of quantum money is, unfortunately, highly impractical, but the same idea has been successfully extended to communications. The idea is straightforward: the act of measuring quantum superposition states alters what you try to measure, so it’s possible to detect the presence of an eavesdropper making such measurements. With the correct protocol, such as BB84, it is possible to communicate privately, with that privacy guaranteed by fundamental laws of physics.
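
To make the eavesdropper-detection idea concrete, here is a toy simulation of BB84-style basis sifting – a deliberately simplified sketch, with classical random numbers standing in for quantum measurements; the function name, bit counts and error thresholds are illustrative, not a real QKD implementation:

```python
import random

def bb84_sift(n_bits, eavesdrop=False):
    """Toy BB84 sketch: Alice sends bits encoded in random bases; Bob
    measures in random bases; they keep only positions where bases
    match. An eavesdropper measuring in her own random basis disturbs
    the states, producing errors Alice and Bob can detect."""
    rng = random.Random(0)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n_bits)]

    sifted_alice, sifted_bob = [], []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        state_bit, state_basis = bit, a_basis
        if eavesdrop:                       # Eve measures in a random basis,
            e_basis = rng.randint(0, 1)     # collapsing the state she forwards
            if e_basis != state_basis:
                state_bit = rng.randint(0, 1)  # wrong basis: outcome is random
            state_basis = e_basis
        if b_basis == state_basis:
            bob_bit = state_bit
        else:
            bob_bit = rng.randint(0, 1)     # wrong basis: outcome is random
        if a_basis == b_basis:              # basis reconciliation (public channel)
            sifted_alice.append(bit)
            sifted_bob.append(bob_bit)

    errors = sum(a != b for a, b in zip(sifted_alice, sifted_bob))
    return errors / len(sifted_alice)

print(bb84_sift(10000))                  # 0.0: no eavesdropper, keys agree
print(bb84_sift(10000, eavesdrop=True))  # ~0.25: Eve reveals herself
```

The key point the sketch illustrates is that Eve cannot measure without choosing a basis, and a wrong choice randomises the state she passes on – so her presence shows up as a ~25% error rate in the sifted key.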

Quantum communication systems are commercially available today from firms such as Toshiba and ID Quantique. While the implementation is clunky and expensive now, it will become more streamlined and miniaturised, just as transistors have over the last 60 years.

Improvements to nanoscale fabrication techniques will greatly accelerate the development of quantum-based technologies. And while useful quantum computing still appears to be some way off, its future is very exciting indeed.

The Conversation

Monday, March 9, 2015

Quantum weirdness passes the atomic walk test

Confused? You will be. Inga Nielsen/Shutterstock

Quantum mechanics is a funny beast. On one hand it is a hugely successful theory, providing predictions of unrivalled accuracy across the natural world. On the other, it is a theory famed for its weirdness and failure to make sense in everyday terms: under quantum mechanics, light is both a particle and a wave, cats can be both dead and alive, and objects at the opposite ends of the galaxy exert a “spooky action-at-a-distance” on one another.


But is this weirdness really necessary? Could there be a theory more fundamental than quantum mechanics and yet in which only sensible, “non-weird” rules apply?


In particular, if we could find a theory that was “realistic”, that is, based on an intuitive reality in which objects have well-defined properties – for example dead or alive, but not both at the same time – we would solve a whole host of quantum-mechanical headaches in one swoop. But is a realistic description of nature possible? Can we have a theory which does the same job as quantum theory, and yet is free from all its weirdness?


Concrete tests


A key contribution to this discussion came from John Bell in the 1960s. In addition to realism, Bell considered theories with a second very natural constraint – locality. Locality is the notion that far-apart objects cannot influence one another instantaneously. Newton described its opposite, non-locality, as “so great an Absurdity that I believe no Man who has in philosophical Matters a competent Faculty of thinking can ever fall into it”.


Bell proposed an experiment involving two particles far removed from one another. He proved that, according to all local realistic theories, a certain sequence of measurements on the two particles always returns results that, when added together, give a number less than or equal to one. This relationship is known as a Bell inequality. Furthermore, Bell predicted that under quantum mechanics the exact same sequence of measurements can give a value that exceeds one.
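
The flavour of Bell’s argument can be sketched numerically. The snippet below uses the standard CHSH form of the inequality, normalised by a factor of two so that the local-realistic bound is one, as in the text; the measurement angles and the singlet-state correlation E = −cos(θ₁ − θ₂) are textbook values, not taken from the article:

```python
from itertools import product
from math import cos, pi

# CHSH test, normalised so the local-realistic bound is 1.
# Alice measures at angles a or a2; Bob at b or b2.
ANGLES = {'a': 0.0, 'a2': pi / 2, 'b': pi / 4, 'b2': 3 * pi / 4}

def chsh(E):
    """CHSH combination of four correlations, divided by 2."""
    return abs(E('a', 'b') - E('a', 'b2') + E('a2', 'b') + E('a2', 'b2')) / 2

def local_bound():
    """Local realism: each particle carries pre-set answers (+1/-1) for
    every measurement. Brute-force all 16 deterministic strategies."""
    best = 0.0
    for Aa, Aa2, Bb, Bb2 in product([-1, 1], repeat=4):
        vals = {('a', 'b'): Aa * Bb,   ('a', 'b2'): Aa * Bb2,
                ('a2', 'b'): Aa2 * Bb, ('a2', 'b2'): Aa2 * Bb2}
        best = max(best, chsh(lambda x, y: vals[(x, y)]))
    return best

def quantum_value():
    """Quantum mechanics: an entangled singlet pair gives the
    correlation E(x, y) = -cos(angle_x - angle_y)."""
    return chsh(lambda x, y: -cos(ANGLES[x] - ANGLES[y]))

print(local_bound())    # 1.0 -- no local realistic theory can exceed this
print(quantum_value())  # ~1.414 (sqrt 2), exceeding the bound
```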


Here then was a concrete, testable difference between quantum mechanics and the whole class of theories built on local realism.


Since the initial work of John Clauser and Alain Aspect, this difference has been experimentally tested – with increasingly large distances between the particles. The result is clear: actual measurements of Bell’s ideas give a value higher than one. As this is beyond the realm of possibilities with local realistic theories, such theories cannot be viable descriptions of nature.


Doing the quantum walk


Bell’s tests involve multiple, separated objects and rule out theories which are both realistic and local. But for a single object there is no distant partner to influence, so locality isn’t an issue. Could realistic theories still help us out of quantum paradoxes involving only single objects, such as Schrödinger’s cat?


This is the question addressed in new experiments by my colleagues at the University of Bonn. The team, led by Andrea Alberti, used lasers to first trap a single caesium atom and then drag it back and forth in a sequence of steps known as a “quantum walk”.


An atom takes a quantum walk through many paths at once. Robens et al. Phys. Rev. X 5, 011003, CC BY


Under a realistic description of its motion, the atom has a well-defined position at every point in time, even when we are not looking at it. Provided that we also assume it’s possible to measure the atom’s position without disturbing it, this property leads to another testable inequality – the Leggett-Garg inequality.
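
In its simplest three-time form, the Leggett-Garg inequality constrains correlations between measurements of a two-valued quantity Q(t) = ±1 made at successive times (this is the standard textbook statement, not taken from the article):

```latex
K = C_{21} + C_{32} - C_{31} \le 1,
\qquad C_{ij} = \langle Q(t_i)\, Q(t_j) \rangle
```

Any theory that is realistic and allows non-invasive measurement obeys K ≤ 1, while quantum mechanics permits K as large as 3/2 – and it is a violation of this kind of bound that experiments like the Bonn quantum walk look for.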


By looking for where the atom is not, rather than where it is, the Bonn team were able to perform the most faithful Leggett-Garg test to date. Their results show a clear victory for quantum mechanics – contrary to realistic theories and our own intuition, the quantum-walking atom has to be considered as being in two places at the same time.


A quantum future


Taken together, results like these make the hope of replacing quantum mechanics with something more “sensible” appear futile. By insisting that our new theory be realistic, we inevitably end up with weird results that include the ability to act on distant objects or measurements that disrupt what they are trying to record. And these make our new theory at least as weird, if not weirder, than the quantum theory we are trying to replace.


But rather than feel disappointed at losing our easy way out from quantum’s difficulties, we should feel emboldened in our confidence in quantum mechanics. The measured violations of Bell and Leggett-Garg inequalities stem from the quantum-mechanical properties of entanglement and superposition, respectively. By embracing these properties, we are learning to construct new technologies such as quantum computing, quantum imaging and quantum encryption.


More fundamentally, future research will extend these quantum tests to include ever larger objects. We are confident that quantum mechanics holds at the smallest of scales. Likewise, a realistic description fits the behaviour of objects in our everyday experience. But somewhere in the middle, something has to give, and the sharp lines drawn by these inequalities will be central to investigating the boundary between the everyday and quantum worlds.


The Conversation

Friday, March 13, 2015

Upgraded LHC pushes physics into the unknown

Look into my high-energy particle physics and what do you see? CERN

There’s a certain degree of anticipation and anxiety among scientists at CERN and beyond as the Large Hadron Collider prepares to roar back into life after a two-year break.


Upgraded with more powerful magnets to smash particles together with almost twice the previous energy, the collider will bring with it the opportunity to discover new, even more massive particles – just as with the Higgs boson – that will signpost the way beyond our current understanding of particle physics, the Standard Model. Why do we think this? Because Einstein’s equation of energy-mass equivalence – more familiar as E=mc² – tells us that in order to make more massive particles we need more energy, even more than the LHC has been capable of delivering so far.
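
As a rough illustration of that energy-mass budget (the ~125 GeV/c² Higgs mass is the published value; in practice only a fraction of a collision’s energy can go into creating any single new particle):

```python
# Back-of-envelope use of E = mc^2 in collider units: energies quoted in
# teraelectronvolts (TeV) translate directly into particle masses in TeV/c^2.
collision_energy_tev = 13      # LHC Run 2 collision energy
higgs_mass_tev = 0.125         # Higgs boson, ~125 GeV/c^2
print(collision_energy_tev / higgs_mass_tev)  # 104.0 Higgs masses of energy
```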


Listen harder, hear more


But energy is only part of the story; what’s also needed is greater precision: more sensitive detectors that allow for more nuanced data, revealing rare events or subtle effects not previously observed. To this end, the detectors have been upgraded too.


ATLAS, one of the four main experiments built around the 27km ring of the LHC complex, has gained extra capacity to measure the paths of the charged particles produced in the collisions. This has improved the accuracy with which we can measure the lifetimes of these ephemeral particles, which in some cases exist for only a tiny fraction of a second.


Filter more noise


The experiments have also increased the rate and selectivity with which they record collisions in the LHC. As a great deal of subatomic particle physics is already known, the more unusual, exciting events are hidden within a huge torrent of data representing more mundane particle interactions. The sheer volume of raw data from the experiments – about a petabyte, or around 210,000 DVDs, per second – requires algorithms to rapidly filter and select the new and unusual events for further study while discarding the rest.
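
A heavily simplified, hypothetical sketch of this kind of staged filtering – the event fields and thresholds below are invented for illustration, and bear no resemblance to CERN’s actual trigger software:

```python
# Hypothetical two-stage trigger: a cheap, coarse first-level cut
# followed by a slower, more selective high-level filter. Only events
# passing both stages are recorded for further study.
def level1(event):
    return event["total_energy_gev"] > 100       # fast hardware-style cut

def high_level(event):
    return event["n_high_pt_tracks"] >= 2        # slower software filter

events = [
    {"total_energy_gev": 40,  "n_high_pt_tracks": 0},   # mundane: rejected early
    {"total_energy_gev": 300, "n_high_pt_tracks": 1},   # passes L1, fails HLT
    {"total_energy_gev": 500, "n_high_pt_tracks": 3},   # unusual: recorded
]
kept = [e for e in events if level1(e) and high_level(e)]
print(len(kept))  # 1
```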


Particle collisions in the LHC have taken us to the edge of physics. CERN


Better selectivity is not the end of the problem, however. Coping with the volume of data produced by the more energetic collisions and more sensitive detectors means new software and storage procedures must be written. These will also transmit the data across the worldwide distributed computing system, allowing not just an accurate reconstruction of each collision from the traces recorded in the detectors, but also more rapid access for scientists to the records.


Unanswered questions


It’s a lot of hard work under tight budget constraints, but the effort is worth it. There are many open questions that the Standard Model simply cannot answer.


Is the recently discovered Higgs boson the particle the Standard Model predicts, or is it the first of a family of undiscovered, even more rare Higgs particles that are predicted by more complete but speculative theories such as Supersymmetry? What is the nature of the dark matter that astronomy tells us is far more abundant than the ordinary matter we’ve come to understand so well? How did a Big Bang that produced a balance of matter and antimatter result in the world of matter that we know today?


The Standard Model of quarks and other particles, including the Higgs boson. MissMJ, CC BY


My own principal interest in these questions is being addressed through studying the decays of particles containing quarks, the fundamental particles found inside the protons and neutrons that constitute an atomic nucleus. Of the six types of quark, it is the bottom quark (also known as the beauty quark) that is particularly interesting, as the way it decays displays a small bias for matter over antimatter – but not enough, so far, to explain the world we know.


However, through an odd but well understood quirk of quantum mechanics, new and massive particles – even bigger than those we can produce in the LHC – can affect these particles’ decays and leave a trail to the new physics we need to develop to better explain the universe. Some of these studies are already underway at dedicated experiments like LHCb, which has already probed several hypothesised supermassive particles, but for others general-purpose experiments like ATLAS can be better.


Unlike the first season of experiments with the LHC, once the first proton beam fires up on March 23 we will not have such a clear roadmap of what to expect, or what to aim for. The first run was led by the knowledge that we would either find the Higgs and add to the Standard Model, or not find it and break the Standard Model in an act of creative destruction pushing us on to find better theories.


This time, there is a clear programme of work around the Standard Model, including the Higgs, but we have many guides that point towards new physics. Most analyses will advance science through excluding possibilities, but the new discoveries will be all the more enlightening. In a sense, we have entered a mode of more pure scientific discovery – and I for one cannot wait.


The Conversation

Thursday, June 4, 2015

Is this the end of particle physics as we know it? Let's hope not

An artist's impression of the much-searched-for magnetic monopole. Heikka Valja/MoEDAL Collaboration

Physicists around the world (myself included) are hoping that this week will mark the beginning of a new era of discovery. And not, as some fear, the end of particle physics as we know it.

After 27 months of shutdown and re-commissioning, the Large Hadron Collider has begun its much-anticipated “Season 2”. Deep beneath the Franco-Swiss border, the first physics data is now being collected in CERN’s freshly upgraded detector-temples at the record-breaking collision energy of 13 teraelectronvolts (TeV).

Much has been written about the upgrade to the accelerator, the experiments, and the computing infrastructure required to handle the fresh deluge of data from the new energy frontier. There has also – quite rightly – been a lot of attention paid to the crowning achievement of Run 1: the discovery of the Higgs boson.

But the “elephant in the collider” is this: we knew that Run 1 had to find the Higgs boson – or something like it – and it did. With Run 2, we don’t know what we’re looking for.

OK, so maybe that’s a bit of an over-simplification. We certainly have a good few guesses as to what’s beyond the Standard Model of particle physics – our current best understanding of matter and forces at the fundamental level, essentially completed in July 2012.

One of the leading contenders is supersymmetry, a theory that provides a candidate for the dark matter that supposedly makes up some 23% of our universe. As it happens, my PhD was based on the first results from the LHC Run 1 that said we hadn’t found evidence for supersymmetry.

To date, I have not had to write an embarrassing addendum to my thesis. But, while there are many compelling arguments for supersymmetry, it is not required in the same way the Higgs boson was. The Higgs was a missing piece in our current physics jigsaw; supersymmetry would represent a new puzzle entirely.

Scientific wild-goose chase?

Does that make Run 2 a waste of time? Are we pouring money into an extra-dimensional wild-goose chase? Are we, in fact, staring down the barrel of the end of collider-based particle physics?

You’d be forgiven for thinking so, if you had no knowledge or understanding of the history of particle physics (or how science works, for that matter). After all, science is arguably at its most boring when you 1) know exactly what you’re looking for, and 2) find it.

It’s much more fun to consider physics in the middle of the 20th century. You could pretty much describe all of known physics, chemistry, materials science, and biology with electrons, protons, neutrons and photons. Yet advances in particle detector technology – Wilson’s cloud chamber, Blackett’s triggers, Powell’s photographic emulsions – led to the discovery of completely new particles outside of this comfortable model of nature.

Vehicle of discovery Daniel Dominguez, Maximilien Brice/CERN

At the time, cosmic rays – particles bombarding our atmosphere from outer space – had far greater energies than the particles laboratory-based accelerators could produce. They represented a new energy frontier for physics, explored by the heroic particle hunters of the 1930s and ‘40s who trekked up mountains, launched high-altitude balloons, and flew aeroplanes in search of their quantum quarry.

They were rewarded for their efforts with, among other things, strange particles, a completely new type of matter that defied the predictions of the time and opened the door to a veritable zoo of subatomic building blocks.

The second half of the 20th century saw a trans-Atlantic race to build bigger and bigger particle accelerators to artificially produce cosmic rays in the controlled conditions of the laboratory and tame the particle zoo. This race was, arguably, won by the LHC. As we approach the new, unknown energy frontier of Run 2, we are therefore once again in need of a new generation of particle hunters. We need experimental physicists who are able to painstakingly pore over every byte of data in search of “what’s next”.

Monopole mission

Personally, I have eschewed supersymmetric searches (been there, done that) and, along with the students of the Langton Star Centre, joined the MoEDAL Collaboration. This experiment is looking for Paul Dirac’s hypothesised magnetic monopole. Based in the LHCb cavern at Point 8, MoEDAL (Monopole and Exotics Detector at the LHC) will use a number of novel detector technologies to look for tracks generated by the heavy, highly-ionising magnetic monopoles that could, in theory, be produced in the proton-proton collisions.

Magnetic monopoles are the magnetic equivalent of single electric charges – like a magnet with only a north or south pole, and not both – and their discovery would shake physics to its electromagnetic core. It’s a high-risk, high-reward search – but by providing alternatives to the traditional detector methodologies of CMS and ATLAS, we’re ensuring that as many bases are covered as possible.

We don’t know what we will find in Run 2. It could be monopoles, dark matter, micro-black holes, extra-dimensional excitations, gravitons or something else entirely. What’s certain is this: if we are to find anything, we are going to have to be incredibly clever about how we go about it. We may even need your help. If we don’t find anything, it might be the beginning of the end of what terrestrial, collider-based physics can tell us about the universe. But even a null result from Run 2 would still be a result, and an important one at that.

So, it is the dawn of a new era for particle physics. It is time for the experimentalists to once again outshine their theoretical friends. It is open season for the particle hunters.
