Chip wars


Project Trinity quickly became Xenon because someone else at Microsoft was using the Trinity code name. One of the hardware leads, Mike Abrash, resigned a month after the project began, so Nick Baker and Jeff Andrews in the Silicon Valley hardware team had to take more responsibility.

This time, the whole hardware team knew they needed to get their costs under control. Instead of using off-the-shelf parts designed by different chip vendors for the PC, they needed to control the design much more closely and get the rights to consolidate multiple chips into one over time in order to have a path for cost reduction. That would help accomplish a major business goal: to cut the console’s price over time.

Sony knew how to ride Moore’s Law, the 1965 observation by Intel co-founder Gordon Moore that the number of components on a state-of-the-art chip doubled every two years. With each new generation, chip makers could either double the number of components on a chip of the same size, making it more powerful, or keep the number of components the same and halve the size of the chip, effectively cutting its cost in half. Following this law, Sony brought the cost of the original PlayStation from $450 at its 1994 launch to about $80 five years later. Consequently, it could slash the price of the box, raising sales volume while continuing to make a profit on the hardware.
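The arithmetic behind that cost curve is simple exponential decay, and it can be sketched in a few lines of code. This is purely illustrative, assuming cost halves cleanly every two years; the dollar figures are the article’s rough numbers, not Sony’s actual bill of materials.

```python
def chip_cost(initial_cost: float, years: float, halving_period: float = 2.0) -> float:
    """Cost of manufacturing a fixed chip design after `years`, assuming the
    cost halves every `halving_period` years (a Moore's Law idealization)."""
    return initial_cost / (2 ** (years / halving_period))

# The original PlayStation: roughly $450 at its 1994 launch.
# Five years of halving every two years lands near the $80 figure cited.
print(round(chip_cost(450, 5)))  # ~80
```

In practice the curve was lumpier than this, since cost drops arrived in discrete steps with each process shrink and chip consolidation, but the idealized model shows why controlling the chip roadmap mattered so much to a console maker.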

Sony had its own chip factories and could, if it chose, control the complete process of chip design through production. So while the original Xbox sold at a loss, Sony was able to sell the PlayStation 2 hardware for a profit because it controlled the cost reductions for its chips (and it didn’t include an expensive hard drive).

But with proper planning, Microsoft felt it could compete with Sony.

This time, the WebTV team of 100 or so chip engineers in Silicon Valley would design one of the chips entirely by itself. And it would work closely with its major chip vendors, ATI Technologies and IBM. Those vendors won the deals in part because a chip-pricing dispute had left bad blood between Microsoft and Nvidia, and because Intel wasn’t as willing as IBM to customize a design.

The tricky part was that IBM was a full partner with Sony on the Cell microprocessor. But, as told in the 2009 book The Race for a New Game Machine, Sony made a critical mistake in negotiating with IBM. IBM supplied a PowerPC microprocessor core for use as a sub-processor in the Cell chip for Sony’s PlayStation 3, but it retained the right to use the core, modified in some ways, in other products.

IBM turned around and supplied the core to Microsoft. In that sense, Microsoft got lucky, since it was two years behind Sony in designing its next-generation machine. It was risky for the relationship with Sony, but IBM couldn’t refuse Microsoft, since a billion dollars in chip purchases was at stake.

Jim Kahle, an IBM fellow on the Cell project, felt it was reckless of IBM to work with both rivals so closely, but IBM had processes in place for separating such projects, as it was also working with Nintendo on its next-generation chip. The fact that it was able to balance these partners was a great victory for IBM, as it shut Intel out of the game console business.

But the process was rocky. In the spring of 2003, I wrote a story for the San Jose Mercury News reporting that Sony had obtained patents for something called a Cell microprocessor. Privately, IBM was stunned at the disclosure and threatened a lawsuit, because Sony had patented work done by IBM’s engineers. Sony sheepishly amended the filing to include the IBM engineers.

Ironically, one of the IBM engineers later had his name on a patent for the Cell processor as well as a patent for the Xbox 360 processor.

Dave Shippy, a co-author of The Race for a New Game Machine and an IBM engineer, confessed he felt “contaminated” because he had worked with Sony for two years and was now in meetings with Microsoft over a similar chip design. In one hilarious moment, Shippy had to quickly move a meeting from a room near a cafeteria to a more remote part of the IBM offices in Austin, Texas, so that the Sony and Microsoft engineers wouldn’t run into each other.

“We all felt that IBM had violated many of its core business practices in jockeying both horses in this particular race,” Shippy said.

Still, Shippy was a true IBM patriot. He did as he was told.

Microsoft’s planned microprocessor carried the code name Waternoose, after the crab-like, five-eyed creature in the movie Monsters, Inc. The Microsoft engineers asked for so many modifications to the IBM chip that it turned into a very different kind of custom microprocessor. Ken Kutaragi, head of Sony’s game business, had insisted on eight cores in the Cell, instead of the six that IBM recommended, because “eight is beautiful.” As a result, manufacturing the Cell turned out to be harder and more expensive than it should have been. But in contrast to IBM’s server chips, the Cell was far more power efficient for the processing work it did.

While the Cell chip had one PowerPC subprocessor and eight other processing cores, the Microsoft chip had three PowerPC cores and a very different graphics chip. The Microsoft engineers wanted something that would be simpler for game creators to program. Their final design wasn’t nearly as complicated as Sony’s, though it did require game developers to learn multicore programming.

Chris Prince, a member of the Advanced Technology Group, said the mandate from above was to break even on hardware costs, so the chips couldn’t be too expensive. He and the rest of the group eschewed “theoretical power” in favor of “usable power,” or capabilities that game developers could most easily exploit. That kind of thinking paid off later and enabled a quicker launch of the Xbox 360. Once again, Microsoft had one of the easiest devices to program.

Microsoft and IBM had also figured out a way to bake encryption into the microprocessor itself, making it much harder for hackers to reverse engineer and pirate the system.

In a way, IBM was betraying Sony. On the other hand, it was exercising its legal right to resell a core that could be customized into something else. The balancing act that IBM played in Austin was masterful and the Sony-Microsoft-IBM relationship will go down in history as one of the greatest corporate love triangles of all time.

Kutaragi and his lawyers could have barred IBM from working with Microsoft in their contract, but failed to do so. It was as bad a fumble as IBM’s own, years earlier, when it chose Microsoft and Intel to provide key parts of the personal computer and failed to lock them down under contract so they couldn’t supply those vital pieces to others. That is how the PC clone business was born, and how IBM lost control of the PC industry.

For the PS 2, Sony was able to drive cost reductions by fusing its graphics chip and microprocessor together by the end of the generation. But with the PS 3, Sony stumbled repeatedly.

At first, it experimented with a graphics processor called the RS, designed by the same team that made the graphics chip for the PS 2. It was a very different kind of graphics chip, and it was incredibly hard to program; the effort failed because the chip was too big to be manufactured. Then Sony shifted to a plan with two Cell chips in the system. That didn’t work either. At the last minute, Sony hired Nvidia to design the graphics chip as a companion to a single Cell microprocessor.

One of the trade-offs for IBM: it lost Apple, which had been using PowerPC chips, as a customer. As IBM fell behind, Apple made a fateful switch to Intel that helped the Mac regain its position in the market. It was funny how one decision cascaded into another and changed the landscape of both the game and chip industries.

Launching Xbox Live

At the Consumer Electronics Show in 2002, Microsoft showed a funny video that illustrated Xbox Live. In it, gamers play a football game. One player with a particularly menacing voice turns out to be a little kid holding a game controller. The punchline was that you could be virtually anyone online.

It was an appealing pitch, and one that Microsoft was better at making than its rivals, since it had the system that was ideal for online gaming.

By June 2002, Microsoft had sold 3.5 million Xboxes. In the fall of 2002, Xbox Live was ready for launch, and it was hyped much like the debut of the Xbox a year earlier. There were some glitches, but Microsoft managed to launch Xbox Live to a great deal of fan adoration.

Cameron Ferroni said recently that Xbox Live had all of the features needed to revolutionize online play: achievements for hitting goals in games, single gamer tags (or identities), consistent sets of friends, a single connected environment where you didn’t have to log in to a new service just to play another online game, and a digital distribution marketplace. For that, Microsoft charged $50 a year. Charging the fee was a big risk, especially since Sony countered by making its own online gaming free.

Eventually, games like Halo 2, Halo 3, Call of Duty: Modern Warfare and other giant games came to depend on Xbox Live to hold on to gamers for months at a time. But early on, people were as skeptical about Xbox Live as they were about the original Xbox.

Electronic Arts balked at making online games that worked with Xbox Live because the service allowed Microsoft to disintermediate EA. EA’s Larry Probst didn’t like that Microsoft would charge EA’s customers a fee to play EA’s games, and he hated the idea that Microsoft would know who EA’s customers were, even as Microsoft was making rival sports games. The impasse was a sore point that led EA to hold back from full support of the Xbox 360. EA later announced that it would make online versions of its games exclusively for Sony’s PlayStation 2 console.

Every year, Microsoft launched new updates for Xbox Live that continuously transformed the service and gave a new look or functionality to the aging Xbox 360 hardware. It was a live service that changed with a simple update of its software. By 2007, Microsoft had more than 8 million subscribers to Xbox Live. By 2011, that number had climbed past 35 million. Indie games could debut on a regular basis on Xbox Live alongside giant console releases. It remains a strategic asset for Microsoft.

The view from the top

After the Xbox Live launch, Microsoft was able to move full-bore into Xenon. A new scouting team (called Xe30) was assembled to figure out what to do for the next Xbox. It included A.J. Redmer, one of the seasoned game studio chiefs; operating system expert Jon Thomason; and Chip Wood, a business planner.

They latched on to ideas such as making sure the next console could run high-resolution games on the high-definition flat-screen TVs that were becoming so popular. Redmer didn’t want the system to be “Dreamcasted,” which is what happened when Sega launched the mid-level Dreamcast, Sony then announced the higher-end PlayStation 2, and demand for the Dreamcast dried up.

The early work forced different Microsoft divisions to cooperate, even though executives were still fiercely territorial.

“We are playing our game this time,” Allard said.

Based on Sony’s progress on the Cell, Robbie Bach had calculated that Sony would be able to launch in 2005. The lesson of the first Xbox had been clear: Sony launched 20 months earlier than Microsoft and sold 20 million PS 2 units before Microsoft sold one Xbox. To close the gap, Microsoft had to launch at the same time.

In a stroke of luck, Ed Fries seized an opportunity to steal a big developer away from Nintendo. Rare, the maker of a string of Nintendo’s big hits, kept proposing to make $20 million games, and Nintendo kept coming back and asking it to make them for $2 million. In September 2002, Fries got board approval for the biggest deal of all: the $375 million acquisition of England’s Rare.

But it was a mixed blessing. Fries was under pressure from Bach: if he spent the money on Rare, Bach wanted him to cut back on other game development. Though he beefed up his game studios, Fries still got hammered by the hardware side, represented by J Allard, to line up a slate of games for the next Xbox launch.

“Spend less, sell more games,” Fries recalled later. “I don’t know how to do that.”

After all, he had to treat his teams like the artists they were, or they would walk to Sony or Nintendo. Allard wanted Fries to crank out new versions of Halo just about every year, since that would ensure a Halo game would arrive for the Xenon launch. But Fries wasn’t about to force the Bungie team to rush its game-making process. A hot game like Halo could make up for a dozen bad games. But if Bungie were forced to do nothing but Halo, it wouldn’t have the resources to do any original games.

Bach favored Fries in this internecine fight, and he gave Fries control of third-party publishing as well. But Bach didn’t want the team to retreat into silos, for that would doom the integrated planning of the next console launch.

Fries wanted a hard drive in the system again to support higher game quality and better online games. But the first system’s hard drive cost $50 and could barely be brought down to $30 over four years. It contributed to considerable losses, and Ballmer said it hadn’t been a good business decision because it didn’t let Microsoft charge a higher price for the console. So business-minded executives fought against the hard drive, leaving Fries to wonder, “What is exciting about this box?”