I’ll answer my own question. Of course it will.
Now let me set the stage.
In my last blog, I recounted having attended an event in Seattle last month put on by the United States Chamber of Commerce. It was jointly presented by the Institute for Legal Reform and the Center for Emerging Technologies.
One of the speakers said something like “Big Data is the new oil.” I made a connection then with AI in the form of Deep Learning, and I’m repeating it here:
“If Big Data is the new oil, deep learning is the new refinery.”
I also summarized my overview of the ingredients of the New Refinery as follows, asserting that they consisted of Central Processing Units (CPUs) + Graphics Processing Units (GPUs) + Deep Learning algorithms (DL) + Applications revealing useful insights (A); i.e., CPUs + GPUs + DLs + As.
I ended my last blog by suggesting that another Revolution is on the horizon and would give you an “early warning” about it.
Here’s what I see.
On the hardware side, I think we must take note of the efforts to develop better hardware, the so-called Quantum Computers.
A Quantum Computer (QC) is a computer that operates in an entirely different way from the computers with which we’re familiar. In the world we experience as humans, the physics is a classical approximation of Nature, which is quantum mechanical.
Our computers are digital and operate in two specific states of either one (1) or zero (0), a binary “yes” or “no” choice. The data encoded in this way are called “bits.”
A QC uses a quantum processing unit (QPU) and encodes data in “quantum bits,” or “qubits” for short, which can exist in a “superposition” of the 0 and 1 states at the same time.
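To make the contrast concrete, here is a minimal sketch (not real quantum hardware, just linear algebra in NumPy) of what a superposition is: a qubit's state is a unit vector of two complex amplitudes, and a Hadamard gate turns a definite |0⟩ into an equal mix of |0⟩ and |1⟩.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit's state is a unit vector
# in C^2: alpha*|0> + beta*|1>, with |alpha|^2 + |beta|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)  # the definite state |0>

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# On measurement, each basis state is observed with probability
# equal to the squared magnitude of its amplitude.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -- a fair coin, until measured
```

A classical bit would have to be in one row or the other; the qubit carries both amplitudes at once, which is where a QC's parallelism comes from.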
I won’t delve into the development of the QC except to alert you to a few details:
The idea was first proposed in 1980. See the Timeline section of the Wikipedia article for Quantum Computing at https://en.wikipedia.org/wiki/Quantum_computing.
In 1981, the late (and great) Richard Feynman is reported to have said: "Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical, and by golly it's a wonderful problem, because it doesn't look so easy."
For Feynman’s quote, see this IBM blog post from May of 2016: https://www.ibm.com/blogs/think/2016/05/the-quantum-age-of-computing-is-here/.
Scroll through the Wikipedia timeline for Quantum Computing, and you’ll see that QC has been developing for a very long time, but also that the pace of innovation is accelerating.
More recently, here are some key developments by some of our tech giants:
In March 2017, IBM announced a commercially available quantum computing system with an open Application Programming Interface (API) called (not surprisingly) IBM Q.
In December 2017, Microsoft announced a preview version of its Quantum Development Kit, which includes a programming language called Q#. This language is for writing programs that run on a simulated quantum computer.
In March 2018, Google’s Quantum AI Lab announced a 72 qubit processor called Bristlecone.
(All of these developments are covered in the Wikipedia article for Quantum Computing, linked above.)
So now what about software?
Yes, that’s beginning to appear too.
Quantum Walk Neural Networks
In January of 2018, a paper by Dernbach, Mohseni-Kabir, Towsley and Pal was published, titled Quantum Walk Inspired Neural Networks for Graph-Structured Data. We now have yet another abbreviation: QWNNs.
The first author is Stefan Dernbach, who is currently a PhD student in the Computer Networks Research Group at the University of Massachusetts College of Information and Computer Sciences. There are three co-authors: Arman Mohseni-Kabir, Don Towsley and Siddarth Pal. Towsley is Dernbach’s PhD advisor. Mohseni-Kabir is a graduate student in the physics department at UMass Amherst. Pal is a scientist with Raytheon BBN Technologies.
The link to this two-page article is here:
The abstract reads in part:
“We propose quantum walk neural networks (QWNN), a new graph neural network architecture based on quantum random walks, the quantum parallel to classical random walks. A QWNN learns a quantum walk on a graph to construct a diffusion operator which can be applied to a signal on a graph. We demonstrate the use of the network for prediction tasks for graph structured signals.”
Note the phrase, “prediction tasks.”
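For readers who want a feel for what a “quantum walk” is, here is an illustrative sketch, not the QWNN architecture from the paper, just the textbook coined quantum walk on a small cycle graph, compared with its classical cousin. The graph size and step count are arbitrary choices for the demo.

```python
import numpy as np

N = 8  # nodes on a cycle graph (illustrative choice)

# Classical random walk: at each step, hop left or right with probability 1/2.
P = np.zeros((N, N))
for i in range(N):
    P[(i - 1) % N, i] = 0.5
    P[(i + 1) % N, i] = 0.5
p = np.zeros(N)
p[0] = 1.0  # start at node 0
for _ in range(5):
    p = P @ p  # classical diffusion of probability mass

# Quantum walk: the state lives in position (x) coin space, 2N amplitudes.
# Index 2*i + c holds the amplitude for node i with coin value c.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard "coin flip"
C = np.kron(np.eye(N), H)                     # flip the coin at every node
S = np.zeros((2 * N, 2 * N))                  # coin-conditioned shift
for i in range(N):
    S[2 * ((i + 1) % N), 2 * i] = 1           # coin |0>: step right
    S[2 * ((i - 1) % N) + 1, 2 * i + 1] = 1   # coin |1>: step left
U = S @ C                                     # one walk step (unitary)

psi = np.zeros(2 * N, dtype=complex)
psi[0] = 1.0  # start at node 0 with coin |0>
for _ in range(5):
    psi = U @ psi

# Probability of finding the walker at each node: sum over coin values.
q = np.abs(psi[0::2]) ** 2 + np.abs(psi[1::2]) ** 2
print("classical:", np.round(p, 3))
print("quantum:  ", np.round(q, 3))
```

The classical walk smears probability out symmetrically; the quantum walk’s amplitudes interfere, producing a lopsided, faster-spreading distribution. A QWNN, per the abstract, learns such a walk on a graph to build a diffusion operator it can apply to graph signals.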
That’s what’s so promising. Like it or not, we want (and need) AI to help us do in the future what we alone cannot do now. As I will discuss in my next blog, devices and software systems using AI, especially in the form of Deep Learning, are (to quote the title of a book I will review in my next post) "prediction machines."
But humans are also prediction machines, and I will make this prediction now: We’ll be hearing a lot more about the combination of QCs with some variation of QWNNs.