Better isn't always better…
I spent a long part of my working life with Texas Instruments – a fantastic company for whom I have the greatest respect and affection. I learned a huge amount there – mostly from what the company got right, but, very occasionally, from what it got wrong.
Cast your mind back to the early 1980s – more specifically, August 12, 1981. Does that ring a bell with you? Let me help you: it was the launch date of one of the most revolutionary computer systems ever – one that went a long way towards transforming the world into what we know today. I’m talking, of course, about the IBM PC.
It wasn’t its powerful 4.77MHz 8088 processor, its memory that was expandable to a huge 256kB or its 320 x 200 CGA graphics that made it remarkable. First, it was remarkable because the world’s largest computer company had given legitimacy to a category that had existed for some time, if mainly for hobbyists. (This against a backdrop of IBM’s president, Thomas Watson, having predicted in 1943 that “There is a world market for maybe five computers” and DEC founder Ken Olsen saying, in 1977, that he could see “no reason anyone would want a computer in their home”.)
Second, it was remarkable because it featured an open architecture – an amazing departure for a company that was at the time well versed in the dark arts of keeping its customers loyal. Open architecture meant that, in no time at all, you could get any number of peripherals for it – greatly expanding its utility.
TI saw an opportunity, and hatched a plan to launch a competitor. TI has always – rightly – prided itself on its technology, and TI engineers knew they could build something better. (It had already launched the 99/4A, the world’s first 16-bit home computer.) Specifically, they wanted it to support even higher resolution graphics than CGA.
There was a downside to what TI planned to do, however. Building a better computer would make it slightly incompatible with IBM’s version. The company reasoned that customers would happily forgo that compatibility in order to get something superior.
However, a number of TI engineers disagreed with the design approach, believing that compatibility was essential for success. They were outvoted – and, because they believed so strongly in their case, they left to set up their own company.
Thus it was that the TI Professional Computer (TIPC) with its 720 x 300 resolution color screen and its portable (28 lbs was portable back then…) equivalent, the TIPPC, were launched in January 1983.
To cut a long story short: it bombed. Why? Because companies were buying IBM PCs to take advantage of Lotus 1-2-3, Ashton-Tate’s database software and the EasyWriter word processor. While the TIPC launched with all three, it was uneconomical for those companies to port subsequent releases of their software to it – and for TI to pay them to do so. Had the TIPC been 100% compatible with the IBM PC, each port would have been straightforward and cost-free. Lack of software was largely what killed TI’s product.
TI persisted with the TIPC for a while, and had limited success. The engineers who had left, however, enjoyed far more than limited success: the company they founded was Compaq.
All of which goes to show that the latest and greatest technology is often of secondary importance to customers if they end up forfeiting other things they really care about – like compatibility, interoperability and upgradability. At Abaco, we like to think we have that balance right.
In part two, I’ll look at another example of how misunderstanding why customers buy what they do can lead to suboptimal product development decisions.