Serving Proudly As The Voice Of Valley County Since 1913

Demystifying Quantum Computing

Tech Space

Future technologies are cool, but they're daunting.

Physics is cool, but it's vast. So what happens when we combine the two into a revolutionary new way for computers to process vast amounts of data at speeds far beyond the possibility of our current architecture?

Enter: Quantum Computing.

Don't worry, this concept is much easier to grasp than the name suggests.

First let's skim over how a regular computer works to crunch numbers and solve problems.

The computer on your desk works exclusively in bits, or binary digits. These bits can either be a 1 or a 0 and everything your computer does is expressed in this way, from watching YouTube to rendering animation.
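To make that concrete, here's a quick Python sketch (the variable names are mine, purely for illustration) showing the bits hiding behind a single letter:

```python
# Every character on your screen is ultimately stored as bits.
# Here we peek at the 8 bits behind the letter "A".
letter = "A"
code = ord(letter)          # the number your computer stores for "A": 65
bits = format(code, "08b")  # that number written out as 8 binary digits

print(letter, "->", code, "->", bits)  # A -> 65 -> 01000001
```

The same idea scales up: a YouTube video or a rendered animation is just vastly more of these 1s and 0s.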

It runs this binary data through a series of logic gates billions of times per second, in cycles whose speed you may have seen expressed as gigahertz, or GHz.

An operation will take many of these cycles to complete, but once it has, the computer takes the resulting solution and stores it ready to use. Moving your mouse cursor, saving a file, playing solitaire; it's all broken down into bits and processed in this way.

The limiting factor of this conventional method is twofold.

First is the physical limitation of how fast you can flip an electrical signal from a 1 to a 0 in any given medium.

Anything conductive has a natural capacitance, meaning it temporarily stores a little of the electricity passing through it. This results in needing some "cool down" time between signals to give the conductor a chance to fully discharge before the next begins, else you end up with a 0 that looks way too much like a 1, and a horrible blue-screen crash.
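You can put rough numbers on that "cool down" idea with the classic RC time constant from basic electronics. The resistance and capacitance values below are illustrative assumptions, not real chip figures:

```python
# Rough sketch of why capacitance limits switching speed.
# (Assumed, illustrative numbers - not measurements from real hardware.)
R = 100        # resistance in ohms
C = 1e-12      # stray capacitance in farads (1 picofarad)

tau = R * C    # RC time constant: time for ~63% of the charge to drain

# Waiting roughly 5 time constants lets the line discharge ~99%,
# so back-to-back signals don't blur a 0 into a 1.
settle_time = 5 * tau
print(settle_time)  # 5e-10 seconds, i.e. half a nanosecond per signal
```

Half a nanosecond sounds tiny, but at billions of cycles per second those waits are exactly the ceiling the article describes.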

The solution to this is simply to expand the number of conductive "pipes" you push the binary data down, but doing so greatly increases the heat the system produces, to the point of being impractical in most environments.

So how does quantum computing differ?

Well, remember those binary digits? Quantum computers use a different basic unit of data called a qubit.

These guys use a principle of quantum mechanics called superposition, in that they can be thought of as both a 1 and a 0 simultaneously. When you measure to see which it is, the superposition is destroyed. Think of it like a coin toss: while the coin is quickly rotating in the air, it could be seen as both heads and tails at the same time. Once it lands, you see which side is up and the state of being "both" comes to an end.
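The coin-toss picture can be played with in a few lines of Python. This is a toy statistical simulation only (it captures measurement odds, not the phase information a real qubit carries), and the function name is my own:

```python
import random

def measure(prob_of_one):
    """A toy 'qubit' measurement: the coin lands, the superposition
    is destroyed, and you get a plain 0 or 1."""
    return 1 if random.random() < prob_of_one else 0

# An evenly balanced superposition, measured over and over,
# comes up 1 about half the time - just like a fair coin.
results = [measure(0.5) for _ in range(10000)]
print(sum(results) / len(results))  # roughly 0.5
```

Each individual measurement gives a definite answer; only the long-run statistics reveal the 50/50 state the qubit was in.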

Quantum entanglement makes up the other part of how this new breed of computer processes data.

In a nutshell, this means that the observation of one qubit will immediately give you the state of its twin. Quantum computers perform operations using these phenomena to calculate the probability of an outcome rather than doing the computational equation in its entirety. They're not simply a faster, better machine. The way that they actually "compute" is totally new, and totally different to our current ideas about what constitutes a computer. It's genre redefining.
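The "measure one, know its twin" idea can also be sketched as a toy model. This simulates only the perfect correlation between a pair; real entanglement is far richer, and the function name is mine:

```python
import random

def measure_entangled_pair():
    """Toy model of a perfectly correlated entangled pair:
    measuring one qubit immediately fixes its twin's result."""
    a = random.choice([0, 1])  # the first measurement comes up at random...
    b = a                      # ...and its twin's outcome is now determined
    return a, b

for _ in range(5):
    a, b = measure_entangled_pair()
    print(a, b)  # the twins always agree
```

Note what this toy model can't show: classically, the line `b = a` is just copying a value, whereas in a real entangled pair neither qubit has a definite value until one is measured.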

So when can we pop down to Best Buy and pick one up? Sadly, no time soon.

This technology isn't without its limitations in this early stage.

First of all, they're incredibly sensitive to temperature and vibration, with most being kept in isolated locations and cooled to a brisk -460ºF. Due to the way they function, they're also prone to errors at this stage. The technology is evolving, harnessing parallel operations as a form of error correction, but we're a long way from you unboxing your new Quantum iPad on quantum Instagram.

Richard Noble is the founder of Want For Tech, an IT company based in Glasgow.

 
