Are you ready for Imitation Game 3?

I’m sure some of you are wondering, “Wait, what happened to Imitation Game 2?” No, Hollywood has not released Imitation Game 2, but the real world of computing is rapidly moving to Imitation Game 3, and in this post I highlight some of the spectacular developments currently underway in computing technology. But first, a (very) brief history of computing is called for, to help put the new developments into context.

A brief history of computing technology

Like any major success, computing has many fathers (although some of these might more accurately be described as grandfathers, great grandfathers, uncles and grand-uncles).

One could start the story with the Babylonian abacus circa 2400 BC, but I prefer to start it with George Boole, an English mathematician born two hundred years ago, in 1815. He invented a system of logical thought (called Boolean algebra in his honour) that to this day serves as the mathematical basis for digital logic circuits, which work on binary digits, or bits. Around the same time, Joseph-Marie Jacquard used punched cards for the first time to control textile looms, not only revolutionizing the textile industry but also introducing programmability and automation into processes that had previously been done entirely by hand. Later in that century, Herman Hollerith invented data storage on punched cards and used it to great effect in the 1890 US census, which was famously processed several years faster and several times cheaper than previous (manual) runs. Hollerith’s company was largely the genesis of the tech titan IBM.
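To make the link from Boole’s algebra to the bits in today’s chips concrete, here is a minimal sketch in Python: the AND and XOR operations of Boolean logic combined into a “half adder”, the basic building block of digital arithmetic. The example (and the function name) is my own illustration, not anything Boole himself wrote down.

```python
# Boole's two-valued algebra in action: the AND and XOR operations on bits,
# combined into a "half adder", the basic building block of digital arithmetic.

def half_adder(a, b):
    """Add two bits and return (sum_bit, carry_bit) using only Boolean operations."""
    return a ^ b, a & b   # XOR gives the sum bit, AND gives the carry bit

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```

Chain enough of these together and you can add, multiply and ultimately compute anything a modern chip does, all of it resting on Boole’s two values.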

Charles Babbage is widely accepted as one of the fathers of modern computing, owing to his invention of the Analytical Engine, the first conceptual model of a general-purpose computer. The second “father”, of course, is Alan Turing, who invented the Turing machine, a mathematical model of a computing machine that applies a predefined set of rules to determine a result from a set of inputs. It is his life that forms the subject of the film The Imitation Game, in which we see Turing develop an electro-mechanical machine to crack the German Enigma code during World War II.

Imitation Game 2.0

The aforementioned fathers of computing created the conceptual and mathematical basis for modern computing and then managed to translate those ideas into electro-mechanical computing machines. The current generation of computers is, of course, electronic and digital. Among the fathers of this generation (and there are very many), a few are worth mentioning. John von Neumann built on the early electronic computer ENIAC to help design the EDVAC, one of the first computers to store its instructions in memory as programs (or software, as we now call it). Gordon Moore co-founded Intel but is more famous for his eponymous Moore’s law, the observation that, over the history of computing hardware, the number of transistors on a chip has doubled roughly every two years. Later, of course, came today’s household names like Bill Gates, Steve Jobs and their many contemporaries, collaborators and competitors.
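To get a feel for what that doubling rule implies, here is a quick back-of-the-envelope sketch in Python. The starting point (the Intel 4004 of 1971, with roughly 2,300 transistors) and the strict two-year doubling period are illustrative assumptions on my part, not a precise history.

```python
# Moore's law as a simple doubling rule: N(year) = N0 * 2 ** ((year - start) / period).
# Illustrative assumptions: the Intel 4004 of 1971 (~2,300 transistors) as the
# starting point and a strict two-year doubling period.

def transistors(year, n0=2300, start_year=1971, doubling_period=2):
    """Project the transistor count per chip under an idealized Moore's-law doubling."""
    return n0 * 2 ** ((year - start_year) / doubling_period)

for year in (1971, 1991, 2011, 2015):
    print(year, f"{transistors(year):,.0f}")

# By 2015 this idealized curve reaches roughly ten billion transistors,
# the right order of magnitude for the largest chips shipping today.
```

That, incidentally, is all the “law” ever claimed to be: an order-of-magnitude trend line, not a law of nature.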

Together, this generation achieved results that are at least as spectacular as Turing’s World War II triumph with the electro-mechanical computer, and their cumulative efforts have brought us to today’s computing miracles, ranging from supercomputers on one hand to smartphones on the other.

Welcome to Imitation Game 3.0

In the next decade or two, everything we know about computing technology is set to change (again) as computing transcends Moore’s Law. Yes, Moore’s Law is dying, as Gordon Moore himself acknowledged in a recent interview. Speaking to the media a few months ago on the 50th anniversary of his world-famous eponymous law, Mr. Moore said, “We won’t have the rate of progress that we’ve had over the last few decades. I think that’s inevitable with any technology; it eventually saturates out. I guess I see Moore’s law dying here in the next decade or so, but that’s not surprising.”

The first computers, built in the 1940s, were about the size of a room. At the time, IBM engineers estimated that a more powerful computer would weigh about 1.5 tons. This is probably what led then-IBM chairman Thomas Watson to his infamous (and possibly apocryphal) remark, “I think there is a world market for maybe five computers.” But computers – and the world – changed when vacuum tube technology was replaced by transistors, and then again by the semiconductor chip technology that is ubiquitous today.

Today’s computer chips are incredibly small – the smallest ones are built with fabrication processes on the scale of 14 nanometers, and 10 nm processes are on the way. How small is that? Well, a human hair is roughly 2,000 times thicker, which means that if a 14 nm feature were blown up to the thickness of a human hair, the hair itself would have to be as thick as a cucumber to keep the same proportion. Making the chips any smaller is approaching the point of impossibility without a radical change in how we make (or indeed, how we think about) computers.
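Here is that analogy checked with a quick calculation in Python; the hair diameter of 28 micrometres is an assumption on my part (human hair varies a lot), chosen to match the “roughly 2,000 times thicker” figure.

```python
# Back-of-the-envelope check of the hair-versus-cucumber analogy.
# Assumption: a fine human hair about 28 micrometres across.

feature_size_m = 14e-9        # a 14 nm chip feature
hair_diameter_m = 28e-6       # assumed diameter of a fine human hair

scale_factor = hair_diameter_m / feature_size_m
print(f"A hair is ~{scale_factor:,.0f}x thicker than a 14 nm feature")

# Blow the 14 nm feature up until it is as thick as the hair;
# the hair, scaled by the same factor, becomes cucumber-sized.
scaled_hair_m = hair_diameter_m * scale_factor
print(f"The hair would then be ~{scaled_hair_m * 100:.0f} cm across")
```

The scaled-up hair comes out at about six centimetres across, which is indeed cucumber territory.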

It turns out that we are actually poised to make a big leap into the next and even more exciting phase of computing technology. While I don’t know exactly what the technology roadmap will look like, here are some of the mega-tech trends that will surely shape the future of computers in the post-Moore’s-law world; in other words, these are the trends that will define Imitation Game 3.0.

Nanotechnology: from Silicon Valley to Planet Graphene?

The need for more processing power per unit of area and of energy will soon necessitate a change in technology. One of the nearest-term changes being worked on is to replace silicon, partly or entirely, as the material on which the electronic circuitry of a computer is written.

Carbon is emerging as the most likely candidate. Carbon nanotubes (CNTs) and phase-change materials (PCMs) show relatively near-term promise for taking processing speed to the next level, albeit in different ways. However, the most excitement – and the most skepticism – surrounds the wonder material graphene, a sheet of carbon that is literally only one atom thick yet some 200 times stronger than steel. If we are able to tame graphene, processing speeds could increase by anywhere from 10 to 100 times current levels, depending on which specific technology triumphs, if any.

Optical computing: from electrons to photons?

Changing the material from silicon to graphene could make computers 100 times faster, but that is not quite the same as changing the game, eh? Not cool enough, I say. Enter optical computing, where the idea is to use photons of light instead of electrons to carry information and perform calculations. The general idea is that photons travel faster and can carry more information than anything else in nature, so optical computers could, in theory, be several orders of magnitude faster than today’s machines. In their purest and most ambitious avatar, optical computers would literally work at the speed of light, completely changing the world in which we live. And if this sounds far-fetched, know that MIT published a report on the topic in 2010, and that Intel and IBM are both working on photonic chips as we speak. Not only that, upstart researchers such as this one claim to have all but built an optical computer that can perform billions of calculations in parallel at the speed of light, which would give a home desktop the processing power of a current-generation supercomputer! (I do not have the technical expertise to comment on the validity of this claim.)

Biological computers: is it in our DNA?

Ok, so swapping electrons for photons is not cool enough for you? Try DNA computing, where the frontiers of physics, chemistry and biology collide to create a potential technological breakthrough. Sometimes also referred to as bio-molecular computers, these machines use strands of DNA and the information-coding techniques of life itself to transmit data and perform calculations.

If it works, we could get three trillion computers, working in parallel, in a space the size of a water droplet, according to researchers. But the attraction of bio-molecular computing is not just the size and speed it brings us, incredible as those improvements would be (the speed, once again, coming from massively parallel calculations rather than the sequential ones in today’s microelectronic chips). Rather, the attraction is that it fundamentally changes what a computer is and where it can exist. These inherently biological computers can interact directly with living cells. For example, one could program DNA chips to work inside the body to combat a disease like cancer by identifying and killing individual cancerous cells. DNA chips could, in theory, even be programmed to support human brain functions that are diminished by age or injury.
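To make the “parallel rather than sequential” point concrete, here is a purely conceptual sketch in Python. It illustrates the idea of searching a combinatorial space, not any actual DNA chemistry, and the toy subset-sum puzzle is my own invented example.

```python
from itertools import product

# Toy combinatorial problem: which subset of these numbers sums to 21?
numbers = [3, 5, 8, 13]
target = 21

# A conventional, sequential computer checks candidate subsets one after another.
checked = 0
for bits in product([0, 1], repeat=len(numbers)):   # all 2**n candidate subsets
    checked += 1
    subset = [n for n, b in zip(numbers, bits) if b]
    if sum(subset) == target:
        print(f"Found {subset} after {checked} sequential checks")
        break

# The conceptual promise of DNA computing: encode every candidate as a DNA
# strand and let the chemistry "test" all 2**n of them simultaneously, so the
# number of candidates can explode without the running time exploding with it.
```

With four numbers the sequential search is trivial; with a few hundred, the number of candidates outgrows any conventional machine, which is exactly the kind of problem massive molecular parallelism is meant to attack.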

Quantum computing: freedom from binary, digital logic?

Perhaps this, at last, is the technological revolution that truly deserves to be called Imitation Game 3.0. Let me explain why, but first a quick modern-physics refresher: as stuff gets smaller and smaller, the classical physics of old genius Newton no longer works and has to be replaced by the modern physics of less old geniuses like Einstein and his contemporaries. Modern physics at the micro level, called quantum physics, involves some pretty weird stuff that totally defies our conventional notions of what is possible or reasonable, because our conventional logic – including the digital, binary logic on which computers run – is based on classical notions of matter, space and time. For instance, de Broglie showed that very small things can be both waves and particles at the same time, Heisenberg told us that the very act of observing them can alter their state, and Schrödinger showed that they can exist in more than one place at the same time (well, sort of). Suffice it to say, quantum logic is completely different, governed by a set of equations that are alien to our everyday experience and logic, including our current computer logic.

As computer chips became smaller and smaller over the last few decades, quantum effects, especially quantum tunneling, started interfering with their design and operation, and the Intels of the world now go to great lengths to keep these effects from causing trouble. This made some scientists wonder whether, instead of being avoided, those quantum effects could be harnessed to build a totally new kind of computer: the quantum computer. The physics of how such a machine operates would take too long to explain in this blog post; to understand the key features of a quantum computer, see this infographic from Intel, which, together with Microsoft and Google, is investing in quantum computing research.

The most important thing to understand is this: quantum computers work with qubits instead of the bits of binary logic, and a qubit can be in a superposition of 0 and 1 at the same time. This, along with a few other quantum properties, allows quantum computers to attack certain problems not just exponentially faster, but with a different set of algorithms that are practically impossible to run on a conventional computer built on binary, digital logic.
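Here is a minimal sketch, in Python with NumPy, of the state-vector picture behind that claim: a single qubit put into an equal superposition by a Hadamard gate and then “measured”. It is a toy simulation of the mathematics, of course, not of any real quantum hardware.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit is a two-component state vector:
# |psi> = a|0> + b|1>, with |a|**2 + |b|**2 = 1.
ket0 = np.array([1, 0], dtype=complex)          # the definite state |0>

# The Hadamard gate turns a definite 0 into an equal superposition of 0 and 1.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0                                  # "both 0 and 1 at the same time"

# Measurement collapses the superposition: we see 0 or 1 with these probabilities.
probs = np.abs(psi) ** 2
print(probs)                                    # [0.5 0.5]

# Simulate ten measurements of freshly prepared qubits.
rng = np.random.default_rng(0)
print(rng.choice([0, 1], size=10, p=probs))
```

Two qubits span four basis states at once, three span eight, and so on; that exponential head-room, together with interference between the amplitudes, is what quantum algorithms exploit.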

So while graphene chips replace silicon as the material, optical computers replace the electrons with photons, and DNA computers replace the rest of the hardware with, well, DNA, quantum computers replace the fundamental logic of computing itself – replacing binary, digital logic with quantum logic.

Interestingly, whenever quantum computers do see the light of day, computing will have evolved beyond George Boole and his logic, which was the starting point of this story. In this sense, quantum computers represent a truly new species of computer and not just an evolution of the current ones.

Even more interesting, the fact that quantum computers could crack problems that are currently intractable makes them of great interest to security experts, both code-makers and code-breakers. Which brings us right back to how Alan Turing used electro-mechanical computing machines to defeat Enigma during World War II. As I said, this is Imitation Game 3.0!

So what?

You might legitimately ask why anyone who is not a computer engineer (disclaimer: neither am I!) should bother with all of this. After all, most of us are not going to contribute actively to this technological journey. I believe it is important to stay on top of these trends because they will change the world just as digital computers did.

At one level, unless these evolutions or revolutions in computing hardware occur at least to some degree, all the buzzwords the world is excited about today – Big Data, the Internet of Things, Artificial Intelligence, Machine Learning – will fall flat for lack of the processing power needed to make them real.

At another level, we will need even more processing power to predict and manage climate change, to decode our genes fully, to develop advanced cures for ailments, to unravel the mysteries of the universe and its origins or, more pragmatically, to safely manage fleets of millions of driverless cars in a country like the US! None of this will be possible unless some of the miracle technologies above take computers beyond the plastic-sheathed, boxy, binary, digital, silicon-based devices they are today.

So, what do I think is the future of computing machines? They will not exist – not as we know them today. They will be nowhere, because they will be everywhere, even possibly inside of us and in outer space! They will be integrated with us and with each other seamlessly, and will not need artificial interfaces like keyboards and screens to communicate with us or with each other. They will be distributed widely, but with an ability to aggregate as needed. Working with us and for us, they will help us not only improve life on earth but also help us transcend this planet. Or at least, I hope so!
