AUVSI Day 2: Unmanned to Autonomous - You're on Your Own
Critical to the future of unmanned aerial systems (UAS) is the ability to fly within the regulations issued by the FAA. Past FAA regulations imposed restrictions defined by vehicle mass, speed, altitude and line-of-sight requirements. Flying beyond line of sight (BLOS), however, will be essential to evolving this industry toward the numerous applications now in development and to keeping the U.S. in a competitive market position. Safety within the airspace remains paramount. At the AUVSI conference today, FAA Administrator Michael Huerta announced the latest FAA initiative to mature small UAS BLOS operations (FAA press release).
Most existing commercial unmanned systems still operate with a man in the loop for control and situational awareness. It's not particularly difficult to build and fly a variety of UAS platforms with a man in the loop and, typically, within line of sight. Autonomy is the interesting next step in the unmanned systems evolution, and it's an entirely different ballgame when it comes to computing: computing from the perspective of safety, and computing from the perspective of awareness.
Autonomy requires brains and perception. For an unmanned system to be truly autonomous, it needs a human-like ability to see, hear, touch, understand, judge and act. In many cases, the on-board sensor suite and computing requirements to achieve autonomy can eclipse the sophistication of the payload itself, and the level of autonomy can vary considerably depending on the unmanned platform and its desired BLOS capabilities.
Computers, and the applications they run, are the brains, of course. IBM's high-performance Watson computer was an impressive feat against Jeopardy contestants a few short years ago. It was also very large and needed lots of power. It once filled a room the size of a typical bedroom, though I understand they've since shrunk it to the size of a large rackmount server (perhaps it reaches back into the cloud?). But in the world of unmanned systems, size matters, at least for platforms that need to fly, and high-performance compute capacity is still required to perceive and process the environment and execute tasks autonomously.
Perception ideally needs a high-fidelity 3D sensed environment. Again, the degree depends on, among other factors, the platform and the application, together with environment and safety requirements. Multi-sensor data fusion provides a more complete and accurate representation, and combining different sensing modalities offers more reliable perception, particularly in challenging environments. The computational requirements for image and radar processing are demanding enough as it is for SWaP-constrained platforms. Understanding and acting add further demands, typically leveraging Bayesian statistical frameworks and other fusion methodologies that require high computational capacity. This is particularly challenging in real-time environments.
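To make the Bayesian fusion idea concrete, here is a minimal sketch (not from any particular flight stack) of the simplest building block: fusing two independent, noisy estimates of the same quantity, each modeled as a Gaussian. The sensor names and numbers are purely illustrative; real perception pipelines apply this recursively (e.g., in a Kalman filter) across many sensors and states.

```python
def fuse_gaussian(mean_a, var_a, mean_b, var_b):
    """Fuse two independent Gaussian estimates of the same quantity
    using inverse-variance weighting (the Bayesian optimum for
    Gaussian noise)."""
    fused_var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    fused_mean = fused_var * (mean_a / var_a + mean_b / var_b)
    return fused_mean, fused_var

# Hypothetical example: a radar return estimates range at 102 m
# (variance 9 m^2); a stereo camera estimates 98 m (variance 4 m^2).
mean, var = fuse_gaussian(102.0, 9.0, 98.0, 4.0)
# The fused estimate leans toward the lower-noise camera, and the
# fused variance is smaller than either input's: that shrinking
# uncertainty is the payoff of multi-sensor fusion.
```

The arithmetic is trivial, but doing it at camera and radar frame rates, across full 3D state vectors, is what drives the high computational demands described above.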
Processor technology continues to make significant leaps in density, power and performance. GE offers compelling computing solutions for unmanned systems payload and perception processing that leverage GPU co-processing in power-efficient SoCs (game-changing on-board processing). It's the same basic technology that drives the highest-performance desktop graphics cards, powers some of the world's most powerful supercomputers, and supports the piloted-driving initiatives of some luxury car manufacturers. These GE solutions are engineered onto industry-standard embedded computing form factors, enabling quick development and deployment onto UAS platforms.
TeraFLOPS performance was once the privileged domain of high-end supercomputers. Now it fits in the palm of your hand, permitting unmanned systems to be on their own.