What's Next for Signal Processing? Part 2

28 April 2015
Quantum Computing

In the first part of this post, I looked at where computing is headed with more cores, wider bandwidth and special application engines. That’s what we can see today. But the future is not necessarily a straight line…

The really interesting stuff lies further out in time and, for now at least, on the fringes of technology. These are the devices that will use fundamentally different approaches to solving computational problems—the “DARPA-hard” things that lie well beyond the possible today. The need is here now. We all read about how systems and operators are swamped with orders of magnitude more data than they can possibly analyze. Potentially useful data is dropped on the floor. Cross-correlation and pattern matching between different sensors and modalities is but a dream in many cases. This potentially puts lives at risk.

The weird stuff

A couple of areas pique my interest. Quantum computers have been discussed for decades and recently became real (or not—the validity of some examples remains the subject of debate, which is kind of ironic to those of us who had our minds blown by college lectures on Schrödinger's Cat). Just as the cat can be considered as simultaneously alive and dead, quantum computers may be in existence or may not be. Only by observing do we collapse the duality to its reality. 

Anyway: quantum computers offer the promise of several orders of magnitude of speedup for certain problem sets. Cryptography is expected to be a good fit and is one of the big driving factors. You can readily understand why some Three Letter Agencies have an interest, but companies like Google are dabbling too—fast classification of images seems to be of interest to them. “Why?” we will leave as an exercise for the reader…
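
As a rough, textbook-level illustration of that promise (a complexity comparison, not a benchmark of any actual machine): factoring a large integer N, the problem underpinning RSA, costs the best known classical algorithm (the general number field sieve) a sub-exponential number of operations, whereas Shor's algorithm on an idealized, error-corrected quantum computer needs only a polynomial number:

$$
T_{\text{classical}}(N) \approx \exp\!\left(\left(\tfrac{64}{9}\right)^{1/3}(\ln N)^{1/3}(\ln\ln N)^{2/3}\right),
\qquad
T_{\text{Shor}}(N) = O\!\left((\log N)^{3}\right)
$$

For the 2048-bit keys in common use today, that gap is the difference between "effectively never" and "feasible", which goes a long way toward explaining the interest from the agencies mentioned above.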

Commercially available quantum machines have been on sale for a few years and have already been through several iterations in the number of qubits (units of quantum data). Whether these truly exhibit quantum entanglement (if you thought the cat thing was weird, wrap your mind around this stuff—even Einstein called it “spooky action at a distance”) seems to be the subject of some debate. Intriguing as it is, it seems unlikely we are going to see a COTS 3U VPX board with quantum computing any time soon, and if we did, I’m not sure who would buy it and what they would use it for. I haven’t seen many RFIs for Infinite Improbability Drives recently.

Also, it needs to be acknowledged that many of our compute problems will not run faster on quantum computers. The problem at hand must map well to the method of convergence—otherwise, we run afoul of Kaplan's law of the instrument: Give a small boy a hammer, and he will find that everything he encounters needs pounding.

Our industry is cyclical

[Image: GE analog computer]

The other interesting area lies in analog computing. Yes, I said analog. I have maintained for some time that our industry is cyclical: old technologies that were long since abandoned pop up again with renewed relevance more often than you would think (a bit like my old suits from the 70s). Once again, military applications drove rapid development—specifically, fire control systems. The example usually cited is the Norden bombsight (also an early example of anti-tamper technology—it had an explosive device built in so the design could be protected in the event of exposure to adversaries). Even GE produced an analog computer, the EF-140, in the early 1960s.

However, by the 1970s the analog era was pretty much over, with new-fangled digital computers taking over. The downside of quantization noise was finally worth dealing with to gain flexibility and programmability—not to mention how inaccurate and hard to maintain those potentiometers and other mechanical components can be. 

Today, analog computers are all but relegated to being museum pieces. But…recently DARPA released a Request for Information on the subject of Analog and Continuous-variable Co-processors for Efficient Scientific Simulation (ACCESS). Why would such a forward-looking organization look to old and abandoned technology? 

The answer lies in two things: the difficulty conventional digital devices have in efficiently solving partial differential equations, which sit at the root of many heavy computational loads, and the speed with which analog solvers can settle on an approximate solution whose accuracy is good enough for many image analysis and classification needs. DARPA is willing to spend money in the hope that a new breed of analog computers might just be part of the solution. The expectation is that the probabilistic inference of such analog systems will be combined with digital systems to dramatically shorten the time needed to extract actionable data from a firehose of information.
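
To make the digital side of that argument concrete, here is a minimal sketch (my own illustration, nothing to do with the ACCESS RFI itself) of how a conventional processor attacks even the simplest PDE, the one-dimensional heat equation, by explicit finite differences. Every timestep means a sweep over the whole grid, and stability caps the timestep at dt <= dx*dx/(2*alpha), so finer grids demand disproportionately more sweeps:

```cpp
#include <vector>
#include <cstdio>

// Illustration only: explicit finite-difference solver for the 1D heat
// equation u_t = alpha * u_xx on a grid of n points with fixed zero
// boundaries. Grid size and step count are arbitrary; the point is the
// shape of the work, not the numbers.
int main() {
    const int n = 1000;                       // grid points
    const double alpha = 1.0;                 // diffusivity
    const double dx = 1.0 / (n - 1);
    const double dt = 0.4 * dx * dx / alpha;  // inside the stability limit

    std::vector<double> u(n, 0.0), next(n, 0.0);
    u[n / 2] = 1.0;                           // a spike of heat in the middle

    for (int step = 0; step < 100000; ++step) {
        // One full sweep of the grid per timestep.
        for (int i = 1; i < n - 1; ++i) {
            next[i] = u[i] + alpha * dt / (dx * dx)
                             * (u[i - 1] - 2.0 * u[i] + u[i + 1]);
        }
        u.swap(next);
    }
    std::printf("u at centre after relaxation: %f\n", u[n / 2]);
    return 0;
}
```

Order-n arithmetic per sweep, tens of thousands of sweeps, and that is for a one-dimensional toy problem. An analog network built from the right resistors and capacitors relaxes toward essentially the same solution in a single continuous transient, which is precisely the kind of shortcut DARPA appears to be after.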

Even more exotic

Later in our exchange, my colleague asked about biological computing. Here, biological pathways are built to emulate computational elements using proteins or DNA molecules. The attraction of this approach lies in the potentially low cost of production—large-scale systems can literally be grown. No multi-billion-dollar fabrication lines, just a petri dish (well, you get the idea anyway). Given that researchers are really only at the equivalent stage of producing the first transistors, it seems like it will be a while before we see anything really useful here—but certainly an area to watch.

So what was the answer to the question again?

Given that the likelihood of seeing quantum, analog or biological COTS boards in the next few years is low, the conclusion seems to be that tomorrow’s signal processors will look pretty similar to today’s. Sure, we will see more cores, wider vector pipelines, and faster versions of PCIe, InfiniBand and Ethernet. We will probably still be programming them in C++, OpenCL, OpenMP, MPI and so on. Maybe we will finally have moved from bandwidth-restricted copper backplanes to optical ones.
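
Which means the inner loops probably won’t change much either. As a trivial sketch (the function name and buffers are made up purely for illustration), tomorrow’s code may well still look like a standard OpenMP loop, just spread across more cores and wider vector lanes:

```cpp
#include <vector>
#include <cstddef>

// The kind of loop we write today and will likely still be writing tomorrow:
// a scaled accumulate over a sample buffer, parallelized across cores and
// vectorized across SIMD lanes with plain OpenMP directives.
void scale_and_accumulate(const std::vector<float>& in,
                          std::vector<float>& acc,
                          float gain) {
    #pragma omp parallel for simd
    for (std::size_t i = 0; i < in.size(); ++i) {
        acc[i] += gain * in[i];
    }
}
```

The hardware underneath will be faster and wider, but the programming model, and the skills needed to exploit it, carry straight over.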

Even so, tomorrow’s signal processing system will look very familiar to today’s engineer. What is important is to select a partner who understands product lifecycles and technology insertion, and who knows how to support them within open architectures. Given that GE has been in the computer business since the 1960s (with not just the earlier-cited EF-140 analog computer, but also digital mainframes like the GE-200), you can believe that we have the knowledge and heritage to be worthy of consideration for your next project.

And: by the time quantum, analog and biological COTS boards are viable, I’m sure we’ll be there to help you integrate those too.

Peter Thompson

Peter Thompson is Vice President, Product Management at Abaco Systems. He first started working on High Performance Embedded Computing systems when a 1 MFLOP machine was enough to give him a hernia while carrying it from the parking lot to a customer’s lab. He is now very happy to have 27,000 times more compute power in his phone, which weighs considerably less.