Book Review: Apple's Dilemma

From aiCIO Magazine's Winter 2011 Issue: What can Clay Christensen's The Innovator's Dilemma tell us about Walter Isaacson's Steve Jobs?


“Unlike other product developers,” biographer-king Walter Isaacson writes in his timely tome on Steve Jobs, “Jobs did not believe the customer was always right; if they wanted to resist using a mouse, they were wrong.” And if they were wrong, Isaacson goes on to detail, Jobs was going to show them they were wrong—and they would love him for it.

Such thinking is at odds with the simplistic adage that “the customer is always right.” Instead, it closely mirrors more recent academic work on the sustainability of corporate dominance—epitomized by the work of Harvard academic Clay Christensen and his theory of how innovation is successfully navigated. At its root, Christensen says this: Established companies are unable to deliver true innovation (of the kind Jobs produced so prolifically) because their incentives and processes are all designed to encourage better delivery of the status quo, rather than risk tearing that status quo—and their entrenched position—to pieces. Jobs himself adored this business school must-read—The Innovator’s Dilemma—but should Apple and its founder really be considered the template for the book’s prescriptions on how best to avoid being overtaken by rival, more innovative companies?

The answer is as complex as Steve Jobs’ personality. Jobs, Isaacson shows, was an intensely conflicted and flawed human being. “There are parts of his life and personality that are extremely messy, and that’s the truth,” Jobs’ wife Laurene Powell told Isaacson. Powell does not explicitly list her husband’s faults, but it is safe to assume the list of “messy” parts includes the denial of paternity and abandonment of his first daughter, his fits of unmitigated rage and seeming eagerness to denigrate friend and foe, and his accusations of intellectual theft against Microsoft’s Bill Gates (to which Gates replied: “Well Steve, I think there’s more than one way of looking at it. I think it’s more like we both had this rich neighbor named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it.”). Like many a historic figure, of course, what made him difficult also made him successful. His demand for perfection caused him to belittle those who did not meet his standards—but it also catalyzed the creation of the Mac, iMac, iPod, iPhone, and iPad. His heavy youthful drug use—LSD being his weapon of choice—cannot have aided his health, yet he credited it as “one of the most important things in my life” and a key to his creations. His immaturity caused Apple’s board to kick him out of the company in 1985, but without that exile he would never have altered the animation business as CEO of Pixar.

Besides pronouns, perhaps the most used word in Isaacson’s biography is shit. This is because Jobs, in his quest for perfection, saw things as either world-changing or fecal matter. There was nothing in between. This ruthlessness applied beyond products; he was equally binary about business strategy. “One of Jobs’s business rules was to never be afraid of cannibalizing yourself,” Isaacson writes. “‘If you don’t cannibalize yourself, someone else will,’ he said. So even though an iPhone might cannibalize sales of an iPod, or an iPad might cannibalize sales of a laptop, that did not deter him.” Christensen, who has undoubtedly already read Isaacson’s book, will have smiled at that, for it echoes exactly what he has written. “Hewlett-Packard’s experience in the personal computer business illustrates how a company’s pursuit of a disruptive technology…might entail, in the end, killing another of its business units,” he wrote in The Innovator’s Dilemma, and, indeed, much of what he prescribed was implemented at Apple. “Markets that do not exist cannot be analyzed,” Christensen writes. “Customers don’t know what they want until we’ve shown them,” said Jobs in Isaacson’s biography. Embrace disruptive advances by fostering autonomous teams, often in a different physical location, Christensen writes. New product teams at Apple, intentionally or not, were often housed in separate buildings and given relative freedom to develop their ideas, according to Isaacson. The disk-drive industry showed that disruptive products will be made before there is any obvious demand for them, Christensen writes. “At the end of a routine meeting with Toshiba, the engineers mentioned a… 1.8-inch drive, and they were not sure what to do with it,” according to Isaacson. Jobs immediately cornered the market, understanding the disruptive and essential nature of the technology for his iPod. Such alignment should not surprise: “‘It’s important that we make this transformation [from a computer-as-hub theory of technology], because of what Clayton Christensen calls the ‘innovator’s dilemma,’ where people who invent something are usually the last ones to see past it, and we certainly don’t want to be left behind,’” Jobs is quoted as saying. He was unabashedly a believer in Christensen’s mantra.

The irony is that within Christensen’s work lurks danger for Apple. Controlling the entire user experience was one of Jobs’ founding principles—his battle to keep pornography off the iPhone being just one lighthearted example of his desire to control physical design, production, software development, marketing, and an array of other inputs. According to The Innovator’s Dilemma, this is hazardous. Christensen himself is quoted by Isaacson in the book: “‘If Apple continues to rely on a proprietary architecture,’ the Harvard Business School professor Clayton Christensen told Wired, ‘the iPod will likely become a niche product.’”

It is here that Isaacson potentially crosses the line from neutral observer to partisan lobbyist. He quickly dismisses Christensen’s claim. Of course, the iPod is not a niche product—yet. In the academic’s theory, Apple has played the role of innovator, of the outsider who brings disruptive technology to an industry (computing, music, telephones, books) that more established players are unable to embrace. The market is either perceived to be too small, or their existing customers tell them they don’t need such a product. At its root, there is no one within a company who, on the margin, is rewarded for the enormous gamble they would have to take. Indeed, it is “good” management—via efficient resource allocation—that actually stops these businesses from successfully navigating disruptive change. The problem for Apple is that it is no longer an outsider. The company—largely through Jobs’ effective creation and marketing since his return to Apple in the late 1990s—is now the ultimate technology insider.

The interesting part is that Apple has been here before. The company was a pioneer in the personal computer market of the 1980s, but it insisted on a closed architecture. Bill Gates, on the other hand, cared not for end-to-end integration. “Jobs…decree[d] that the Macintosh operating system would not be available for any other company’s hardware,” Isaacson writes. “Microsoft pursued the opposite strategy, allowing its Windows operating system to be promiscuously licensed. That did not produce the most elegant computers, but it did lead to Microsoft dominating the world of operating systems. After Apple’s market share shrank to less than 5%, Microsoft’s approach was declared the winner in the personal computer realm.”


The finale of Isaacson’s book is ragged, the result, I suspect, of pressure from the publisher following Jobs’ death. What had been an intricately woven story of the personal and the technical reverts to a laundry list of remembrances from family and famous friends as Jobs wastes away.

One such sometimes-friend: Microsoft’s Gates, who takes the time to comment on life-after-Jobs for Apple—and how, perhaps, it was Jobs himself, through sheer force of will, who was able to overcome the limitations of the closed operating system. “I used to believe that the open, horizontal model would prevail,” Isaacson quotes Gates as saying. “But you proved that the integrated, vertical model could also be great.” He adds an essential, and cautionary, caveat, however: “The integrated approach works well when Steve is at the helm. But it doesn’t mean it will win many rounds in the future.” Even a dying Jobs struggled to argue with this. When Isaacson relayed Gates’ comments, he asked Jobs to think of “another company that made great products by insisting on end-to-end integration. He thought for a while, trying to come up with an example. ‘The car companies,’ he finally said, but then he added, ‘Or at least they used to.’” Not a ringing endorsement of the model he championed.

What Gates was saying—and, perhaps, what Christensen meant when he predicted that the iPod would become a niche product—is twofold: First, Jobs was so personally extraordinary, and so early to markets, that he could overcome the problem that arises when companies become established; and second, time may not be kind to Apple. Jobs did many things right: He gave small teams autonomy; he didn’t listen to his customers about what they wanted in the future; he did not dismiss initially small markets. But he got one thing wrong: He failed to learn from Microsoft’s dominance of the 1980s and 1990s, the result of its willingness to license software and be “open.” Apple has largely avoided the negative aspects of the innovator’s dilemma thus far, but with products like Google’s Android operating system grabbing market share, Christensen’s warnings seem to be coming to the fore as Apple mourns.

The applications of this lesson are widespread—perhaps it shows asset managers that the boutique model, with discrete strategies housed in distinct units, offers the most robust path to future success. In Apple’s case, however, it means one thing: Your leader is gone. Be wary. —Kip McDaniel
