Can you remember the first computer adventure games? I’m not talking about Super Mario Brothers or Halo… I’m referring to text-based games, like The Hitchhiker’s Guide to the Galaxy. The game displayed some text describing the scene, and you typed in text to do things like walk around or pick up items. It was neat, but it quickly became frustrating after you typed a hundred different phrases just to open a door or turn on a device.
Eventually, we got great visual gaming platforms like Atari, Nintendo, PlayStation and Xbox. Games like Halo are just amazing and immersive. When playing those games, you’ll notice there is little to no text at all—and they are loads of fun!
Can you imagine taking a game as fun and complex as Halo and redesigning it to be text-based? A level that might have taken 15 minutes could now take weeks or months. You would get screens of text describing the constantly changing 3D scene, and you would have to react before more text popped up announcing that the scene had changed and more monsters were rushing at you. Wow! That just sounds like the most annoying game ever.
Begging for visualization
Do you develop embedded software that isn’t visualized until it’s basically deployed and hooked up to a physical display? Military applications are filled with software that just begs for visualization: radar, sonar, signal intelligence and so on. I’ve been a software developer in the military field for over 20 years, and something that has always been noticeable is that developers still mainly use text to set application parameters and verify resulting output. This just seems crazy when we have games like Halo with amazing displays.
Unfortunately, adding GUIs (graphical user interfaces) to an embedded application is very difficult, for many reasons. The embedded system you’re developing on may not have a graphics card or graphics driver. Embedded applications are sequential while GUI applications are event-driven—a whole different programming model that can take months to learn. There are hundreds of toolkits to choose from, and none stands out. These are just some of the barriers to visualization.
With years of experience creating GUIs for embedded systems, Abaco has invented a new way to add GUIs to an embedded application effortlessly. The product is called DataView, and it is part of the AXIS development tool suite. It removes the traditional barriers that most GUI packages impose. DataView connects to the application remotely via TCP, so even if the embedded system has no graphics card or driver, you can still visualize it remotely. The GUI can be connected and disconnected while the application is running, and there is zero overhead when the GUI is not connected.

DataView uses sequential programming instead of event-driven programming, so there’s no major learning curve. Its widget set is specifically designed for controlling an application’s input values and displaying application data at any stage of processing. Finally, it has only five APIs—yes, only five—where most other GUI toolkits have hundreds or thousands. All input parameters are stored in a single C data structure; if any control is changed in the GUI, the C data structure in the application is updated. The output data is likewise stored in a single data structure and sent to the GUI at appropriate times. It’s such an amazing innovation that it reduces GUI code by a factor of 50X from traditional GUI toolkits; just in case you missed that, I did say 50X.
Sounds like it’s time to get DataView. You can add visualization to your embedded application in about a day. Find out more here.