Many of us find computing a bit intriguing, perhaps intimidating, once it ventures beyond pencil and paper or a four-function calculator. There is a mystique to computing because numbers are abstracted into something represented by zeros and ones, and huge, complicated things are then calculated at that level in tiny fractions of a second.

This mystique is augmented and magnified when the word “programming” appears on the horizon. Now many of us get that queasy feeling of being overwhelmed by some geek, nerd, or worse – a 6-year-old kid! 🙂

But really, numbers are just abstractions of, say, cows you can see (and count) in a field, or piles of tomatoes in baskets. Sure, you can tell someone else “I have 50 baskets of tomatoes,” but you have already abstracted a picture of 50 baskets into the number “50”…

There are some people living in jungles who have had no need to abstract quantities beyond “one”, “two”, “three”, “many”. Theirs is a simple culture that needs no more detailed abstraction than that! Sure, they could not easily calculate 20% of 40 had they lived in our culture – but if we visited their world, we would be unable to survive more than 4 minutes after being bitten by some unusual beetle on the jungle floor, while they would simply reach for the nearest remedy, apply it to the bite, and we’d be as good as new!

So much for the competitive edge of 20% of 40!

Back to the intimidation of programming: We cook things. Say an egg: Take a pan, fill it with water. Gather some firewood and make a fire – or just light your stove. Take an egg and carefully place it in the water in the pan. Put the pan on the fire, let it boil for 3-5 minutes, remove it from the fire, add cold water – voila! A boiled egg!

You have done this many, many times. You already know several tricks to make the process faster, the eggs better boiled, and so on.

Congratulations! You have just finished a “programming job”. It consists of steps, each doing something; tests of certain results (say, the egg is too cold to drop straight into boiling water on a raging fire, so you reduce the heat, wait, test again, and proceed); and alternative paths of action depending on those tests.

Then you process certain inputs in some steps, and eventually produce a specific, desired result – a perfectly boiled egg.

There. This really is a great model for a program running in a computer.
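Just for fun, that egg recipe can be sketched as an actual little program – a playful sketch, with all the helper names, temperatures, and timings made up for illustration:

```python
# A playful sketch of the boiled-egg "program" above.
# All names, temperatures, and timings here are invented for illustration.

def boil_egg(water_temp_c: float, egg_temp_c: float) -> str:
    steps = []
    steps.append("fill pan with water")
    steps.append("light the stove")
    # Test a condition and take an alternative path, just like a real program:
    if egg_temp_c < 10:                      # egg straight from the fridge?
        steps.append("reduce the heat and let the egg warm up first")
    steps.append("place egg in pan, put pan on fire")
    minutes = 3 if water_temp_c > 90 else 5  # another branch on a test
    steps.append(f"boil for {minutes} minutes")
    steps.append("remove from fire, add cold water")
    return " -> ".join(steps)

print(boil_egg(water_temp_c=95, egg_temp_c=6))
```

Steps, tests, branches, inputs, and a desired output – the whole recipe, in a dozen lines.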

Let me take you back to the start of this blog: the fearsome abstraction of numbers into zeros and ones.

Nearly no programmer calculates numbers in their head using ones and zeros – except some fringe characters, whom you should definitely stay away from.

But it happens to be a simple thing for a computer circuit to represent a state with these two values – “0” or “1”, “no voltage”, “voltage” – that’s it. We do not work in voltages and currents, but computer circuits do, so it is their language.

Traditional computing since Boole and Babbage – and Countess Ada! – has worked numbers in this fashion. Since the 1940s, current/no current in a circuit has been the implementation medium for computing, inside computer circuits.
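For instance, an ordinary number like 42 lives inside a circuit as the bit pattern 101010 – six little “voltage / no voltage” states in a row. A quick sketch in Python:

```python
# The number 42 as the zeros and ones a circuit actually stores.
n = 42
bits = format(n, "b")   # render n in base 2
print(bits)             # -> 101010

# And back again: reading the bit pattern recovers the same number.
assert int(bits, 2) == n
```

No programmer does this conversion in their head for fun – the machine does it, billions of times a second.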

We have heard so much about the binary states in computing that most of us have some rudimentary understanding of the two states.

Enter quanta. Physics has been nice and fun for centuries, described by rules (“Laws of Physics”) set down by many folks who experimented and derived those rules – Newton to Einstein to Hawking. Some experimented in their heads using mathematical tools; others used their heads to intercept falling apples (Newton!). And so our physical world was pretty well described by various laws – gravity, speed, momentum, optics, electricity, and so on.

But all this held well in the physical world of non-atomic dimensions – big stuff, especially big compared to atomic particles. When it comes to those little entities that are smaller than small, the laws of physics simply do not work the way we are used to from High School Physics classes.

Quantum computing is one such “critter” that behaves differently in the small-scale subatomic world. While ordinary computer circuits settle on a state of zero or one, quantum computing has those states as well, but adds another – an in-between state that is a blend of both, neither simply on nor simply off.

Yes, it sounds weird to say – or read – that something can be neither on nor off, neither zero nor one. But there you have it – while the bits we use to represent states in a computing device can have only those two states, qubits, the representations of state in quantum computing, can have the on state, the off state, and the in-between state.

The only way to grasp this is to accept it as an “IT IS SO!”

Imagine the total befuddlement of people in Newton’s age when he suggested gravity as a law of nature – “Whoa!!! Wait a moment here, what do you mean Earth pulls stuff without even touching it?” But we do not find gravity a strange notion; we know it, have discussed it, performed experiments to measure it, and so on.

Qubits are to us what gravity was to Newton’s contemporaries. The third state also goes by the lovely technical term “coherent superposition.”
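To make “coherent superposition” a little less mystical, here is a toy model in Python – just a sketch of the math, not a real quantum device. A qubit is described by two amplitudes, and measuring it collapses the blend to a plain 0 or 1, with probabilities set by those amplitudes:

```python
import math
import random

# Toy model of one qubit: two amplitudes (a, b) with a^2 + b^2 = 1.
# "Coherent superposition" means both amplitudes can be nonzero at once.

def measure(a: float, b: float) -> int:
    """Collapse the qubit: return 0 with probability a^2, else 1."""
    return 0 if random.random() < a ** 2 else 1

# An equal superposition: not simply 0, not simply 1, but a blend of both.
a = b = 1 / math.sqrt(2)

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(a, b)] += 1

print(counts)  # roughly a 50/50 split between 0 and 1
```

Until you measure it, the qubit genuinely holds both possibilities at once; the measurement is what forces it to pick a side.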

But don’t get too happy yet! It gets better! So what is the point here, you may ask. Qubits, when joined in large numbers, can perform mathematical computations that are quite complex. That complexity opens up new possibilities: things that used to take a loooong time on traditional computers can be computed in mere fractions of a second.

Here is a practical example: You were told that a good password is a long one that uses numerals, letters – upper and lower case – and as many symbols as you can. Such passwords, you were told, CAN BE CRACKED by a computer using trial and error, but the number of trials will be so large that it is not practical to crack passwords that way – it might take the fastest computers 15,000 years or something like that.

But what about quantum computers? Yeah, exactly! They might take a few seconds to crack a password that a traditional computer would need 15,000 years to crack.
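The best-known quantum search result, Grover’s algorithm, cuts a brute-force search of N possibilities down to roughly √N steps. A back-of-the-envelope sketch – the alphabet size, password length, and guessing rate below are all assumptions for illustration, just like the “15,000 years” figure above:

```python
import math

# Back-of-the-envelope: password search space vs. Grover's quadratic speedup.
# All the sizes and rates below are illustrative assumptions.

alphabet = 26 + 26 + 10 + 32           # lower, upper, digits, common symbols
length = 10                            # a 10-character password
search_space = alphabet ** length      # worst-case classical trials

classical_rate = 1e9                   # assumed guesses/second for a fast machine
classical_years = search_space / classical_rate / (3600 * 24 * 365)

grover_trials = math.isqrt(search_space)  # Grover needs only ~sqrt(N) steps

print(f"search space:     {search_space:.3e}")
print(f"classical, years: {classical_years:.0f}")
print(f"grover trials:    {grover_trials:.3e}")
```

The square root is the whole trick: a search space of ~10^19 collapses to ~10^9-ish quantum steps – centuries of classical grinding shrunk to something a machine could plausibly chew through.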

Weird stuff.

Now, don’t panic, don’t run out to change your passwords – this is not quite ready for prime time, although who knows what the NSA is using ALREADY – maybe they have a fully functional quantum computer of significant power.

But realistically, it’ll be many, many years before quantum computers are ready to crack your passwords and lay bare to the world the cat pictures you have been sending in your emails…

There is more strangeness in quanta! One interesting – and widely popularized – aspect is the uncertainty principle entering their world (and ours!). Measure one aspect of a quantum system, and its other measurable aspects are no longer fixed and may change. Search for the term “Schrödinger’s cat” (here on Wikipedia) – it may melt your brain, but it is a fun piece of reading. Great gymnastics for the brain…

Another interesting effect of the quantum-sized world is action at a distance – a measurement on one quantum in one location is instantly reflected in its entangled partner, even when the distance between the two may be millions of miles. And the correlation appears instantaneously! So much for the speed of light!

So you could measure an entangled qubit in Chicago, and its partner on some spaceship 3 billion miles away snaps into the matching state instantly! Huh? Yep, these are the implications of the science – although, to spoil the fun a little, physicists have shown this trick cannot be used to send a usable message faster than light.

Enjoy that brain melt!