The First Optical Microprocessing Chip Is A Breakthrough In Quantum Computing

Publishing this week in Science, one of the most prestigious journals, scientists at the University of Bristol have debuted a new optical chip that could be to quantum computers what traditional microprocessors are to current computer technology: invaluable. The chip has already tackled typically year-long experiments in just a matter of hours. What else will it achieve?

Optical chip: this novel chip allows previously complex and expensive experiments to be conducted cheaply and rapidly. Image from the University of Bristol.

Standard computers operate on a system of bits based on transistors, wherein each bit is either a “0” or a “1”. A simple system of 8 bits has 2⁸ = 256 possible states, and the computer exists in exactly one of them at a time. In a quantum computer, the fundamental unit is the qubit, which is based not on a transistor but on some two-state quantum mechanical phenomenon. A system of 8 qubits likewise has 2⁸ = 256 possible states, but the computer can exist in any arbitrary superposition of these states simultaneously. This will allow quantum computers, when they have truly come into existence, to tackle problems that are too computationally complex for traditional computation: sorting through vast chemical libraries to design new pharmaceuticals, performing superfast database searches, or solving complex mathematical problems that have so far eluded even our modern supercomputers.
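The counting above can be made concrete with a short sketch. This is a toy illustration in plain Python (not a model of the Bristol chip's photonic encoding): a classical 8-bit register occupies exactly one of its 256 states, while an 8-qubit register is described by 256 complex amplitudes that can all be nonzero at once.

```python
import math

n_bits = 8
n_states = 2 ** n_bits  # 2^8 = 256 possible states

# Classical register: exactly one of the 256 states, e.g. the bit pattern 00101101.
classical_state = 0b00101101  # a single integer in [0, 255]

# Toy qubit register: 256 amplitudes whose squared magnitudes sum to 1.
# An equal superposition assigns every state the same amplitude 1/sqrt(256).
amplitude = 1 / math.sqrt(n_states)
quantum_state = [amplitude] * n_states

# Measuring collapses the register to one state; each outcome here has
# probability |amplitude|^2 = 1/256.
prob_each = abs(quantum_state[0]) ** 2
total_prob = sum(abs(a) ** 2 for a in quantum_state)

print(n_states, prob_each, round(total_prob, 10))
```

The key contrast: the classical register is fully described by one integer, while the quantum register needs all 256 amplitudes, which is why simulating even modestly sized quantum systems overwhelms classical machines.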

Quantum computing chip: Canadian company D-Wave is one of the few working towards realizing an actual quantum computer.

A major requirement for the development of quantum systems is a more thorough understanding of the quantum mechanical nature of light and how it can be controlled. Until now, just conducting experiments at this level has been incredibly time-consuming due to the notoriously finicky behavior of light and matter at the quantum level. Designing and testing a single experiment could take years of a scientist’s time and might not even yield a positive result. Few research groups around the world have the time, resources, and ability to conduct such work, and the development of quantum computing has slowed as a result.

Enter the University of Bristol’s Dr. Anthony Laing, a research associate in the Department of Physics. He and his research team have produced the optical equivalent of a microprocessor, one that can be easily reprogrammed to manipulate photons in countless ways. Experiments that would once have taken years and a wealth of research dollars can now be conceived and executed in the time it takes to write a few lines of code. The team has already demonstrated the chip’s operation by performing a year’s worth of experiments in just a matter of hours, and its capabilities and design are such that it can realize future experiments we have yet to even conceive of. Says Laing, “A whole field of research has essentially been put onto a single optical chip that is easily controlled. The implications of the work go beyond the huge resource savings. Now anybody can run their own experiments with photons, much like they operate any other piece of software on a computer. They no longer need to convince a physicist to devote many months of their life to painstakingly build and conduct a new experiment.”

The University of Bristol is establishing itself as a hot spot for quantum computing, both with research like this and with its innovative Quantum in the Cloud project, which aims to make a real quantum processor available to any and all who have the wherewithal to use it. Bringing quantum technologies from the realm of esoteric science fiction to the mainstream scientific community and the public at large is the best way to ensure that this new era in computing is just around the corner.

Via Phys.org, the University of Bristol, and Science.