There is so much wrong and yet a lot that’s right in this chapter, which was the last one in the original hardcover edition. I don’t know whether to be embarrassed by it or proud. How does computing today compare with my predictions from 1992?
ACCIDENTAL EMPIRES
CHAPTER FIFTEEN
FUTURE COMPUTING
Remember Pogo? Pogo was Doonesbury in a swamp, the first political cartoon good enough to make it off the editorial page and into the high-rent district next to the horoscope. Pogo was a ‘possum who looked as if he was dressed for a Harvard class reunion and who acted as the moral conscience for the first generation of Americans who knew how to read but had decided not to.
The Pogo strip remembered by everyone who knows what the heck I am even talking about is the one in which the little ‘possum says, “We have met the enemy and he is us.” But today’s sermon is based on the line that follows in the next panel of that strip—a line that hardly anyone remembers. He said, “We are surrounded by insurmountable opportunity.”
We are surrounded by insurmountable opportunity.
Fifteen years ago, a few clever young people invented a type of computer that was so small you could put it on a desk and so useful and cheap to own that America found places for more than 60 million of them. These same young people also invented games to play on those computers and business applications that were so powerful and so useful that we nearly all became computer literate, whether we wanted to or not.
Remember computer literacy? We were all supposed to become computer literate, or something terrible was going to happen to America. Computer literacy meant knowing how to program a computer, but that was before we really had an idea what personal computers could be used for. Once people had a reason for using computers other than to learn how to use computers, we stopped worrying about computer literacy and got on with our spreadsheets.
And that’s where we pretty much stopped.
There is no real difference between an Apple II running VisiCalc and an IBM PS/2 Model 70 running Lotus 1-2-3 version 3.0. Sure, the IBM has 100 times the speed and 1,000 times the storage of the Apple, but they are both just spreadsheet machines. Put the same formulas in the same cells, and both machines will give the same answer.
In 1984, marketing folks at Lotus tried to contact the people who bought the first ten copies of VisiCalc in 1979. Two users could not be reached, two were no longer using computers at all, three were using Lotus 1-2-3, and three were still using VisiCalc on their old Apple IIs. Those last three people were still having their needs met by a five-year-old product.
Marketing is the stimulation of long-term demand by solving customer problems. In the personal computer business, we’ve been solving more or less the same problem for at least ten years. Hardware is faster and software is more sophisticated, but the only real technical advances in software in the last ten years have been the Lisa’s multitasking operating system and graphical user interface, Adobe’s PostScript printing technology, and the ability to link users together in local area networks.
Ken Okin, who was in charge of hardware engineering for the Lisa and now heads the group designing Sun Microsystems’ newest workstations, keeps a Lisa in his office at Sun just to help his people put their work in perspective. “We still have a multitasking operating system with a graphical user interface and bit-mapped screen, but back then we did it with half a mip [one mip equals one million computer instructions per second] in 1 megabyte of RAM,” he said. “Today on my desk I have basically the same system, but this time I have 16 mips and an editor that doesn’t seem to run in anything less than 20 megabytes of RAM. It runs faster, sure, but what will it do that is different from the Lisa? It can do round windows; that’s all I can find that’s new. Round windows, great!”
There hasn’t been much progress in software for two reasons. The bigger reason is that companies like Microsoft and Lotus have been making plenty of money introducing more and more people to essentially the same old software, so they saw little reason to take risks on radical new technologies. The second reason is that radical new software technologies seem to require equally radical increases in hardware performance, something that is only now starting to take place as 80386- and 68030-based computers become the norm.
Fortunately for users and unfortunately for many companies in the PC business, we are about to break out of the doldrums of personal computing. There is a major shift happening right now that is forcing change on the business. Four major trends are about to shift PC users into warp speed: standards-based computing, RISC processors, advanced semiconductors, and the death of the mainframe. Hold on!
**********
In the early days of railroading in America, there was no rule that said how far apart the rails were supposed to be, so at first every railroad set its rails a different distance apart, with the result that while a load of grain could be sent from one part of the country to another, the car it was loaded in couldn’t be. It took about thirty years for the railroad industry to standardize on just a couple of gauges of track. As happens in this business, one type of track, called standard gauge, took about 85 percent of the market.
A standard gauge is coming to computing, because no one company—not even IBM—is powerful enough to impose its way of doing things on all the other companies. From now on, successful computers and software will come from companies that build them from scratch with the idea of working with computers and software made by their competitors. This heretical idea was foisted on us all by a company called Sun Microsystems, which invented the whole concept of open systems computing and has grown into a $4 billion company literally by giving software away.
Like nearly every other venture in this business, Sun got its start because of a Xerox mistake. The Defense Advanced Research Projects Agency wanted to buy Alto workstations, but the Special Programs Group at Xerox, seeing a chance to stick the feds for the entire Alto development budget, marked up the price too high even for DARPA. So DARPA went down the street to Stanford University, where they found a generic workstation based on the Motorola 68000 processor. Designed originally to run on the Stanford University Network, it was called the S.U.N. workstation.
Andy Bechtolscheim, a Stanford graduate student from Germany, had designed the S.U.N. workstation, and since Stanford was not in the business of building computers for sale any more than Xerox was, he tried to interest established computer companies in filling the DARPA order. Bob Metcalfe at 3Com had a chance to build the S.U.N. workstation but turned it down. Bechtolscheim even approached IBM, borrowing a tuxedo from the Stanford drama department to wear for his presentation because his friends told him Big Blue was a very formal operation.
He appeared at IBM wearing the tux, along with a tastefully contrasting pair of white tennis shoes. For some reason, IBM decided not to build the S.U.N. workstation either.
Since all the real computer companies were uninterested in building S.U.N. workstations, Bechtolscheim started his own company, Sun Microsystems. His partners were Vinod Khosla and Scott McNealy, also Stanford grad students, and Bill Joy, who came from Berkeley. The Stanford contingent came up with the hardware design and a business plan, while Joy, who had played a major role in writing a version of the Unix operating system at Berkeley, was Mr. Software.
Sun couldn’t afford to develop proprietary technology, so it didn’t develop any. The workstation design itself was so bland that Stanford University couldn’t find any basis for demanding royalties from the start-up. For networking they embraced Bob Metcalfe’s Ethernet, and for storage they used off-the-shelf hard disk drives built around the Small Computer System Interface (SCSI) specification. For software, they used Bill Joy’s Berkeley Unix. Berkeley Unix worked well on a VAX, so Bechtolscheim and friends just threw away the VAX and replaced it with cheaper hardware. The languages, operating system, networking, and windowing systems were all standard.
Sun learned to establish de facto standards by giving source code away. It was a novel idea, born of the Berkeley Unix community, and rather in keeping with the idea that for some boys, a girl’s attractiveness is directly proportional to her availability. For example, Sun virtually gave away licenses for its Network File System (NFS) networking scheme, which had lots of bugs and some severe security problems, but it was free and so became a de facto standard virtually overnight. Even IBM licensed NFS. This giving away of source code allowed Sun to succeed, first by setting the standard and then by following up with the first hardware to support that standard.
By 1985, Sun had defined a new category of computer, the engineering workstation, but competitors were starting to catch on and catch up to Sun. The way to remain ahead of the industry, they decided, was to increase performance steadily, which they could do by using a RISC processor—except that there weren’t any RISC processors for sale in 1985.
RISC is an old IBM idea called Reduced Instruction Set Computing. RISC processors are incredibly fast devices that gain their speed from a simple internal architecture implementing only a few computer instructions. Where a Complex Instruction Set Computer (CISC) might have a special “walk across the room but don’t step on the dog” instruction, a RISC processor can usually get faster performance by using several simpler instructions: walk-walk-step over-walk-walk.
RISC processors are cheaper to build because they are smaller, so more of them fit on one piece of silicon. And because they have fewer transistors (often under 100,000), yields are higher too. It’s easier to increase the clock speed of RISC chips, making them faster. It’s easier to move RISC designs from one semiconductor technology to a faster one. And because RISC forces both hardware and software designers to keep it simple, stupid, RISC systems tend to be more robust.
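To make that trade-off concrete, here is a toy sketch in Python (obviously not real machine code, and the “instructions” are hypothetical) of the walk-across-the-room example above: one do-everything CISC operation versus a string of simple RISC-style steps that a compiler would emit instead.

```python
# Toy illustration of the CISC-versus-RISC trade-off described above.
# These "instructions" are hypothetical, not real opcodes from any chip.

def cisc_walk_across_room(pos, room_length, dog_pos):
    """One complex instruction: walk the room and step over the dog,
    all handled by elaborate internal logic on the chip."""
    while pos < room_length:
        pos += 2 if pos == dog_pos else 1
    return pos

# RISC-style: only two simple operations exist...
def walk(pos):       return pos + 1   # take one step
def step_over(pos):  return pos + 2   # step over an obstacle

# ...and the compiler strings them together: walk-walk-step over-walk-walk.
def risc_walk_across_room(pos, room_length, dog_pos):
    while pos < room_length:
        pos = step_over(pos) if pos == dog_pos else walk(pos)
    return pos

# Same result either way; the RISC version just does it in simpler steps.
assert cisc_walk_across_room(0, 10, 4) == risc_walk_across_room(0, 10, 4)
```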
Sun couldn’t interest Intel or Motorola in building a RISC processor. Neither company wanted to endanger its lucrative CISC processor business. So Bill Joy and Dave Patterson designed a processor of their own in 1985, called SPARC. By this time, both Intel and Motorola had stopped allowing other semiconductor companies to license their processor designs, thus keeping all the high-margin sales in Santa Clara and Schaumberg, Illinois. This, of course, pissed off the traditional second-source manufacturers, so Sun signed up those companies to do SPARC.
Since Sun designed the SPARC processor, it could buy the chips more cheaply than any other computer maker could. Sun engineers knew, too, when higher-performance versions of the SPARC were going to be introduced. These facts of life have allowed Sun to dominate the engineering workstation market and to make important inroads into other markets formerly dominated by IBM and DEC.
Sun scares hardware and software competitors alike. The company practically gives away system software, which scares companies like Microsoft and Adobe that prefer to sell it. The industry is abuzz with software consortia set up with the intention of producing better standards-based software than Sun’s, but selling it rather than giving it away.
Sun also scares entrenched hardware competitors like DEC and IBM by actually encouraging cloning of its hardware architecture, relying on a balls-to-the-wall attitude that says Sun will stay in the high-margin leading edge of the product wave simply by bringing newer, more powerful SPARC systems to market sooner than any of its competitors can.
DEC has tried, and so far failed, to compete with Sun, using a RISC processor built by MIPS Computer Systems. Figuring if you can’t beat them, join them, HP has actually allied with Sun to do software. IBM reacted to Sun by building a RISC processor of its own too. Big Blue spent more on developing its Sun killer, the RS/6000, than it would have cost to buy Sun Microsystems outright. The RS/6000, too, is a relative failure.
**********
Why did Bill Gates, in his fourth consecutive hour of sitting in a hotel bar in Boston, sinking ever deeper into his chair, tell the marketing kids from Lotus Development that IBM would be out of business in seven years? What does Bill Gates know that we don’t know?
Bill Gates knows that the future of computing will unfold on desktops, not in mainframe computer rooms. He knows that IBM has not had a very good handle on the desktop software market. He thinks that without the assistance of Microsoft, IBM will eventually forfeit what advantage it currently has in personal computers.
Bill Gates is a smart guy.
But you and I can go even further. We can predict the date by which the old IBM—IBM the mainframe computing giant—will be dead. We can predict the very day that the mainframe computer era will end.
Mainframe computing will die with the coming of the millennium. On December 31, 1999, right at midnight, when the big ball drops and people are kissing in New York’s Times Square, the era of mainframe computing will be over.
Mainframe computing will end that night because a lot of people a long time ago made a simple mistake. Beginning in the 1950s, they wrote inventory programs and payroll programs for mainframe computers, programs that process income tax returns and send out welfare checks—programs that today run most of this country. In many ways those programs have become our country. And sometime during those thirty-odd years of being moved from one mainframe computer to another, larger mainframe computer, the original program listings, the source code for thousands of mainframe applications, were just thrown away. We have the object code—the part of the program that machines can read—which is enough to move the software from one type of computer to another. But the source code—the original program listing that people can read, that has details of how these programs actually work—is often long gone, fallen through a paper shredder back in 1967. There is mainframe software in this country that cost at least $50 billion to develop for which no source code exists today.
This lack of commented source code would be no big deal if more of those original programmers had expected their programs to outlive them. But hardly any programmer in 1959 expected his payroll application to be still cutting checks in 1999, so nobody thought to teach many of these computer programs what to do when the calendar finally says it’s the year 2000. Any program that prints a date on a check or an invoice, and that doesn’t have an algorithm for dealing with a change from the twentieth to the twenty-first century, is going to stop working. I know this doesn’t sound like a big problem, but it is. It’s a very big problem.
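For anyone who wants to see the failure rather than take it on faith, here is a minimal sketch in Python (the function names are mine, not anything from a real payroll system) of the two-digit-year arithmetic described above, along with the “windowing” repair that many Y2K projects eventually settled on.

```python
# Old programs often stored only the last two digits of the year ("59" for
# 1959) and quietly assumed the century was always 19xx.

def years_overdue(two_digit_due, two_digit_today):
    """1950s-style date math: fine as long as both years are in the 1900s."""
    return two_digit_today - two_digit_due

print(years_overdue(95, 99))   #   4 -- an invoice from 1995, viewed in 1999
print(years_overdue(99, 0))    # -99 -- viewed in 2000, which the program
                               #        reads as 1900: the math falls apart

# One common repair, "windowing": pick a pivot year and guess the century.
def expand_year(two_digit_year, pivot=50):
    return (1900 if two_digit_year >= pivot else 2000) + two_digit_year

print(expand_year(59))  # 1959
print(expand_year(0))   # 2000
```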
Looking for a growth industry in which to invest? Between now and the end of the decade, every large company in America either will have to find a way to update its mainframe software or will have to write new software from scratch. New firms will appear dedicated to the digital archaeology needed to update old software. Smart corporations will trash their old software altogether and start over. Either solution is going to cost lots more than it did to write the software in the first place.

And all this new mainframe software will have one thing in common: it won’t run on a mainframe. Mainframe computers are artifacts of the 1960s and 1970s. They are kept around mainly to run old software and to gladden the hearts of MIS directors who like to think of themselves as mainframe gods. Get rid of the old software, and there is no good reason to own a mainframe computer. The new software will run faster, more reliably, and at one-tenth the cost on a desktop workstation, which is why the old IBM is doomed.
“But workstations will never run as reliably as mainframes,” argue the old-line corporate computer types, who don’t know what they are talking about. Workstations today can have as much computing power and as much data storage as mainframes. Ten years from now, they’ll have even more. And by storing copies of the same corporate data on duplicated machines in separate cities or countries and connecting them by high-speed networks, banks, airlines, and all the other big transaction processors that still think they’d die without their mainframe computers will find their data are safer than they are now, trapped inside one or several mainframes sitting in the same refrigerated room in Tulsa, Oklahoma.
Mainframes are old news, and the $40 billion that IBM brings in each year for selling, leasing, and servicing mainframes will be old news too by the end of the decade.
There is going to be a new IBM, I suppose, but it probably won’t be the company we think of today. The new IBM should be a quarter the size of the current model, but I doubt that current management has the guts to make those cuts in time. The new IBM is already at a disadvantage, and it may not survive, with or without Bill Gates.
So much for mainframes. What about personal computers? PCs, at least as we know them today, are doomed too. That’s because the chips are coming.
While you and I were investing decades alternately destroying brain cells and then regretting their loss, Moore’s Law was enforcing itself up and down Silicon Valley, relentlessly demanding that the number of transistors on a piece of silicon double every eighteen months, while the price stayed the same. Thirty-five years of doubling and redoubling, thrown together with what the lady at the bank described to me as “the miracle of compound interest,” means that semiconductor performance gains are starting to take off. Get ready for yet another paradigm shift in computing.
Intel’s current top-of-the-line 80486 processor has 1.2 million transistors, and the 80586, coming in 1992, will have 3 million transistors. Moore’s Law has never let us down, and my sources in the chip business can think of no technical reason why it should be repealed before the end of the decade, so that means we can expect to see processors with the equivalent of 96 million transistors by the year 2000. Alternatively, we’ll be able to buy a dowdy old 80486 processor for $11.
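As a sanity check on those figures, the back-of-the-envelope arithmetic can be reproduced in a few lines of Python, using the chapter’s own assumptions: 3 million transistors in 1992 and a doubling every eighteen months.

```python
# Reproducing the projection above: start from the 3-million-transistor chip
# expected in 1992 and double every 18 months until the year 2000.

transistors = 3_000_000
year = 1992.0
while year + 1.5 <= 2000.0:       # how many full doublings fit before 2000?
    year += 1.5
    transistors *= 2
    print(f"{year}: {transistors:,}")

# 1993.5: 6,000,000
# 1995.0: 12,000,000
# 1996.5: 24,000,000
# 1998.0: 48,000,000
# 1999.5: 96,000,000  (five doublings: 3 million * 2**5 = 96 million)
```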
No single processor that can be imagined today needs 96 million transistors. The reality of the millennium processor is that it will be a lot smaller than the processors of today, and smaller means faster, since electrical signals don’t have to travel as far inside the chip. In keeping with the semiconductor makers’ need to add value continually to keep the unit price constant, lots of extra circuits will be included in the millennium processor—circuits that have previously been on separate plug-in cards. Floppy disk controllers, hard disk controllers, Ethernet adapters, and video adapters are already leaving their separate circuit cards and moving as individual chips onto PC motherboards. Soon they will leave the motherboard and move directly into the microprocessor chip itself.
Hard disk drives will be replaced by memory chips, and then those chips too will be incorporated in the processor. And there will still be space and transistors left over—space enough eventually to gang dozens of processors together on a single chip.
Apple’s Macintosh, which used to have more than seventy separate computer chips, is now down to fewer than thirty. In two years, a Macintosh will have seven chips. Two years after that, the Mac will be two chips, and Apple won’t be a computer company anymore. By then Apple will be a software company that sells operating systems and applications for single-chip computers made by Motorola. The MacMotorola chips themselves may be installed in desktops, in notebooks, in television sets, in cars, in the wiring of houses, even in wristwatches. Getting the PC out of its box will fuel the next stage of growth in computing. Your 1998 Macintosh may be built by Nissan and parked in the driveway, or maybe it will be a Swatch.
Forget about keyboards and mice and video displays, too, for the smallest computers, because they’ll talk to you. Real-time, speaker-independent voice recognition takes a processor that can perform 100 million computer instructions per second. That kind of performance, which was impossible at any cost in 1980, will be on your desktop in 1992 and on your wrist in 1999, when the hardware will cost $625. That’s for the Casio version; the Rolex will cost considerably more.
That’s the good news. The bad news comes for companies that today build PC clones. When the chip literally becomes the computer, there will be no role left for computer manufacturers who by then would be slapping a chip or two inside a box with a battery and a couple of connectors. Today’s hardware companies will be squeezed out long before then, unable to compete with the economies of scale enjoyed by the semiconductor makers. Microcomputer companies will survive only by becoming resellers, which means accepting lower profit margins and lower expectations, or by going into the software business.
**********
On Thursday night, April 12, 1991, eight top technical people from IBM had a secret meeting in Cupertino, California, with John Sculley, chairman of Apple Computer. Sculley showed them an IBM PS/2 Model 70 computer running what appeared to be Apple’s System 7.0 software. What the computer was actually running was yet another Apple operating system code-named Pink, intended to be run on a number of different types of microprocessors. The eight techies were there to help decide whether to hitch IBM’s future to Apple’s software.
Sculley explained to the IBMers that he had realized Apple could never succeed as a hardware company. Following the model of Novell, the network operating system company, Apple would have to live or die by its software. And living, to a software company, means getting as many hardware companies as possible to use your operating system. IBM is a very big hardware company.
Pink wasn’t really finished yet, so the demo was crude; the software was slow and the graphics were especially bad, but it worked. The IBM experts reported back to Boca Raton that Apple was onto something.
The talks with Apple resumed several weeks later, taking place sometimes on the East Coast and sometimes on the West. Even the Apple negotiators scooted around the country on IBM jets and registered in hotels under assumed names so the talks could remain completely secret.
Pink turned out to be more than an operating system. It was also an object-oriented development environment that had been in the works at Apple for three years, staffed with a hundred programmers. Object orientation was a concept invented in Norway but perfected at Xerox PARC to allow large programs to be built as chunks of code called objects that could be mixed and matched to create many different types of applications. Pink would allow the same objects to be used on a PC or a mainframe, creating programs that could be scaled up or down as needed. Combining objects would take no time at all either, allowing applications to be written faster than ever. Writing Pink programs could be as easy as using a mouse to move object icons around on a video screen and then linking them together with lines and arrows.
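Pink never shipped in this form, but the mix-and-match idea itself is easy to sketch. Here is a toy Python example (the class names are invented for illustration, not Pink’s actual object frameworks) of small, reusable objects being snapped together into an application.

```python
# A toy illustration of "mix and match" objects: small, reusable pieces of
# code combined into an application. Class names here are hypothetical.

class TextField:
    def __init__(self, contents=""):
        self.contents = contents

    def render(self):
        return f"[ {self.contents} ]"

class SpellChecker:
    MISSPELLINGS = {"teh", "recieve"}   # toy dictionary

    def check(self, text):
        return [w for w in text.split() if w.lower() in self.MISSPELLINGS]

class Document:
    """An application assembled from prebuilt objects rather than written
    from scratch; the same TextField and SpellChecker could just as easily
    be wired into a spreadsheet or a mail program."""

    def __init__(self):
        self.field = TextField("teh quick brown fox")
        self.checker = SpellChecker()

    def proofread(self):
        return self.checker.check(self.field.contents)

print(Document().proofread())   # ['teh']
```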
IBM had already started its own project in partnership with Metaphor Computer Systems to create an object-oriented development environment called Patriot. Patriot, which was barely begun when Apple revealed the existence of Pink to IBM, was expected to take 500 man-years to write. What IBM would be buying in Pink, then, was a 300 man-year head start.
In late June, the two sides reached an impasse, and talks broke down. Jim Cannavino, head of IBM’s PC operation, reported to IBM chairman John Akers that Apple was asking for too many concessions. “Get back in there, and do whatever it takes to make a deal,” Akers ordered, sounding unlike any previous chairman of IBM. Akers knew that the long-term survival of IBM was at stake.
On July 3, the two companies signed a letter of intent to form a jointly owned software company that would continue development of Pink for computers of all sizes. To make the deal appear as if it went two ways, Apple also agreed to license the RISC processor from IBM’s RS/6000 workstation, which would be shrunk from five chips down to two by Motorola, Apple’s longtime supplier of microprocessors. Within three years, Apple and IBM would be building computers using the same processor and running the same software—software that would look like Apple’s Macintosh, without even a hint of IBM’s Common User Access interface or its Systems Application Architecture programming guidelines. Those sacred standards of IBM were effectively dead because Apple rightly refused to be bound by them. Even IBM had come to realize that market share makes standards; companies don’t. The only way to succeed in the future will be by working seamlessly with all types of computers, even if they are made by competitors.
This deal with Apple wasn’t the first time that IBM had tried to make a quantum leap in system software. In 1988, Akers had met Steve Jobs at a birthday party for Katharine Graham, owner of Newsweek and the Washington Post. Jobs took a chance and offered Akers a demo of NeXTStep, the object-oriented interface development system used in his NeXT Computer System. Blown away by the demo, Akers cut the deal with NeXT himself and paid $10 million for a NeXTStep license.
Nothing ever came of NeXTStep at IBM because it could produce only graphical user interfaces, not entire applications, and because the programmers at IBM couldn’t figure out how to fit it into their raison d’être—SAA. But even more important, the technical people of IBM were offended that Akers had imposed outside technology on them from above. They resented NeXTStep and made little effort to use it. Bill Gates, too, had argued against NeXTStep because it threatened Microsoft. (When InfoWorld’s Peggy Watt asked Gates if Microsoft would develop applications for the NeXT computer, he said, “Develop for it? I’ll piss on it.”)
Alas, I’m not giving very good odds that Steve Jobs will be the leader of the next generation of personal computing.
The Pink deal was different for IBM, though, in part because NeXTStep had failed and the technical people at IBM realized they’d thrown away a three-year head start. By 1991, too, IBM was a battered company, suffering from depressed earnings and looking at its first decline in sales since 1946. A string of homegrown software fiascos had IBM so unsure of what direction to move in that the company had sunk to licensing nearly every type of software and literally throwing it at customers, who could mix and match as they liked. “Want an imaging model? Well, we’ve got PostScript, GPI, and X-Windows—take your pick.” Microsoft and Bill Gates were out of the picture, too, and IBM was desperate for new software partnerships.
IBM has 33,000 programmers on its payroll but is so far from leading the software business (and knows it) that it is betting the company on the work of 100 Apple programmers wearing T-shirts in Mountain View, California.
Apple and IBM, caught between the end of the mainframe and the ultimate victory of the semiconductor makers, had little choice but to work together. Apple would become a software company, while IBM would become a software and high-performance semiconductor company. Neither company was willing to risk on its own the full cost of bringing to market the next-generation computing environment ($5 billion, according to Cringely’s Second Law). Besides, there weren’t any other available allies, since nearly every other computer company of note had already joined either the ACE or SPARC alliances that were Apple and IBM’s competitors for domination of future computing.
ACE, the Advanced Computing Environment consortium, is Microsoft’s effort to control the future of computing and Compaq’s effort to have a future in computing. Like Apple-IBM, ACE is a hardware-software development project based on linking Microsoft’s NT (New Technology) operating system to a RISC processor, primarily the R-4000, from MIPS Computer Systems. In fact, ACE was invented as a response to IBM’s Patriot project before Apple became involved with IBM.
ACE has the usual bunch of thirty to forty Microsoft licensees signed up, though only time will tell how many of these companies will actually offer products that work with the MIPS/Microsoft combination.
But remember that there is only room for two standards; one of these efforts is bound to fail.
**********
In early 1970, my brother and I were reluctant participants in the first draft lottery. I was hitchhiking in Europe at the time and can remember checking nearly every day in the International Herald Tribune for word of whether I was going to Vietnam. I finally had to call home for the news. My brother and I are three years apart in age, but we were in the same lottery because it was the first one, meant to make Richard Nixon look like an okay guy. For that year only, every man from 18 to 26 years old had his birthday thrown in the same hopper. The next year, and every year after, only the 18-year-olds would have their numbers chosen. My number was 308. My brother’s number was 6.
Something very similar to what happened to my brother and me with the draft also happened to nearly everyone in the personal computer business during the late 1970s. Then, there were thousands of engineers and programmers and would-be entrepreneurs who had just been waiting for something like the personal computer to come along. They quit their jobs, quit their schools, and started new hardware and software companies all over the place. Their exuberance, sheer numbers, and willingness to die in human wave technology attacks built the PC business, making it what it is today.
But today, everyone who wants to be in the PC business is already in it. Except for a new batch of kids who appear out of school each year, the only new blood in this business is due to immigration. And the old blood is getting tired—tired of failing in some cases or just tired of working so hard and now ready to enjoy life. The business is slowing down, and this loss of energy is the greatest threat to our computing future as a nation. Forget about the Japanese; their threat is nothing compared to this loss of intellectual vigor.
Look at Ken Okin. Ken Okin is a great hardware engineer. He worked at DEC for five years, at Apple for four years, and has been at Sun for the last five years. Ken Okin is the best-qualified computer hardware designer in the world, but Ken Okin is typical of his generation. Ken Okin is tired.
“I can remember working fifteen years ago at DEC,” Okin said. “I was just out of school, it was 1:00 in the morning, and there we were, testing the hardware with all these logic analyzers and scopes, having a ball. ‘Can you believe they are paying for us to play?’ we asked each other. Now it’s different. If I were vested now, I don’t know if I would go or stay. But I’m not vested—that will take another four years—and I want my fuck you money.”
Staying in this business for fuck you money is staying for the wrong reason.
Soon, all that is going to remain of the American computer industry will be high-performance semiconductors and software, but I’ve just predicted that we won’t even have the energy to stay ahead in software. Bummer. I guess this means it’s finally my turn to add some value and come up with a way out of this impending mess.
The answer is an increase in efficiency. The era of start-ups built this business, but we don’t have the excess manpower or brainpower anymore to allow nineteen out of twenty companies to fail. We have to find a new business model that will provide the same level of reward without the old level of risk, a model that can produce blockbuster new applications without having to create hundreds or thousands of tiny technical bureaucracies run by unhappy and clumsy administrators as we have now. We have to find a model that will allow entrepreneurs to cash out without having to take their companies public and pretend that they ever meant more than working hard for five years and then retiring. We started out, years ago, with Dan Fylstra’s adaptation of the author-publisher model, but that is not a flexible or rich enough model to support the complex software projects of the next decade. Fortunately, there is already a business model that has been perfected and fine-tuned over the past seventy years, a business model that will serve us just fine. Welcome to Hollywood.
The world eats dinner to U.S. television. The world watches U.S. movies. It’s all just software, and what works in Hollywood will work in Silicon Valley too. Call it the software studio.
Today’s major software companies are like movie studios of the 1930s. They finance, produce, and distribute their own products. Unfortunately, it’s hard to do all those things well, which is why Microsoft reminds me of Disney from around the time of The Love Bug. But the movie studio of the 1990s is different; it is just a place where directors, producers, and talent come and go—only the infrastructure stays. In the computer business, too, we’ve held to the idea that every product is going to live forever. We should be like the movies and only do sequels of hits. And you don’t have to keep the original team together to do a sequel. All you have to do is make sure that the new version can read all the old product files and that it feels familiar.
The software studio acknowledges that these start-up guys don’t really want to have to create a large organization. What happens is that they reinvent the wheel and end up functioning in roles they think they are supposed to like, but most of them really don’t. And because they are performing these roles—pretending to be CEOs—they aren’t getting any programming done. Instead, let’s follow a movie studio model, where there is central finance, administration, manufacturing, and distribution, but nearly everything else is done under contract. Nearly everyone—the authors, the directors, the producers—works under contract. And most of them take a piece of the action and a small advance.
There are many advantages to the software studio. Like a movie studio, it has established relationships with certain crafts, which makes it very easy to get a contract programmer, writer, marketer, etc. Not all smart people work at Apple or Sun or Microsoft. In fact, most smart people don’t work at any of those companies. The software studio would allow program managers to find the very best person for a particular job. A lot of the scrounging is eliminated. The programmers can program. The would-be moguls can either start a studio of their own or package ideas and talent together just like independent movie producers do today. They can become minimoguls and make a lot of money, but be responsible for at most a few dozen people. They can be Steven Spielberg or George Lucas to Microsoft’s MGM or Lotus’s Paramount.
We’re facing a paradigm shift in computing, which can be viewed either as a catastrophe or an opportunity. Mainframes are due to die, and PCs and workstations are colliding. Processing power is about to go off the scale, though we don’t seem to know what to do with it. The hardware business is about to go to hell, and the people who made all this possible are fading in the stretch.
What a wonderful time to make money!
Here’s my prescription for future computing happiness. The United States is losing ground in nearly every area of computer technology except software and microprocessors. And guess what? About the only computer technologies that are likely to show substantial growth in the next decade are—software and microprocessors! The rest of the computer industry is destined to shrink.
Japan has no advantage in software, and nothing short of a total change of national character on their part is going to change that significantly. One really remarkable thing about Japan is the achievement of its craftsmen, who are really artists, trying to produce perfect goods without concern for time or expense. This effect shows, too, in many large-scale Japanese computer programming projects, like their work on fifth-generation knowledge processing. The team becomes so involved in the grandeur of their concept that they never finish the program. That’s why Japanese companies buy American movie studios: they can’t build competitive operations of their own. And Americans sell their movie studios because the real wealth stays right here, with the creative people who invent the software.
The hardware business is dying. Let it. The Japanese and Koreans are so eager to take over the PC hardware business that they are literally trying to buy the future. But they’re only buying the past.
Thank you so much for publishing this. I have been reading your articles for most of the past 10 years and have learned a lot. You are not always right (nobody is), but you are always thoughtful and careful in your analysis, and I don’t think any reader could ask for more. Thank you for what you do.
Well, while you weren’t willing to give odds on Jobs being the leader of the next generation of personal computing, your call on RISC wasn’t too far off, given that the ARM architecture has dominated the mobile space since the turn of the century and probably will for some time to come.

As for IBM, who knows why they are still around? Somewhere between outsourcing their workforce and the inertia of their clients, they have managed to survive.
So, how much of this came true? And are those mainframes still running?!?
“Alas, I’m not giving very good odds that Steve Jobs will be the leader of the next generation of personal computing.”
In retrospect this looks very funny, but no one in 1992, except maybe Steve Jobs himself, thought Jobs’ best days were ahead of him.
Bob,
You know what they say, “Hindsight is 20/20.” In my case, it’s 20/30, due to my eye operation.
Cringe predicted outsourcing! “let’s follow a movie studio model, where there is central finance, administration, manufacturing, and distribution, but nearly everything else is done under contract. Nearly everyone—the authors, the directors, the producers—works under contract.”
except for the part where they get advances and a piece of the pie. it’s more like McJobs.
“except for the part where they get advances and a piece of the pie” I suspect that is part of the “contract”. The point being there are no employee “benefits” since there are no employees.
Well, that’s how the games industry worked until recently.
“There is going to be a new IBM, I suppose, but it probably won’t be the company we think of today. The new IBM should be a quarter the size of the current model, but I doubt that current management has the guts to make those cuts in time.”
There is a new IBM, and indeed, it seems to make its billions without creating software or hardware. The management back in ’92 may not have had the intestinal fortitude, but from what we’ve seen in your postings about IBM in the past, the current management certainly does. Who knew that Palmisano, Rometty et al. were writing their playbook using a dog-eared, paperback copy of Accidental Empires?
Indeed, we have met the enemy and he is us.
In terms of editing for future editions, it probably would be good to name some of the things that came out of the alliances you discuss (like the PowerPC chip, which would lead to later Motorola developments in phone-quality chips, as mentioned by others above).
I guess the interesting thing missing is the fate of 64-bit computing. At the time of all of this, DEC was less than a year away from releasing the Alpha, which Microsoft would successfully port NT to, thus giving Microsoft and DEC a major head start on the 64-bit platform, out of which would come…nothing. I would be curious how marketing failures (on DEC’s part) and successes (on Intel’s) took that collective 64-bit lead and threw it away, leaving us waiting almost 15 years before 64-bit computing became ubiquitous, far past when an interpretation of Moore’s Law said it should have.
Another interesting point to that story parallels Compaq waiting for IBM to bring out the 386 computer and eventually just doing it themselves. Same thing with AMD and Intel.
64-bit computing didn’t take off back then because of the expense.
With 16-bit chips, you needed complicated banked addressing to support more than 64 kilobytes of memory. 8-bit computing was even worse. But with 32-bit addresses you could use a flat memory space to address 4GB of memory. There was no need for more memory until recently. Even my AlphaStation had no more than 1GB of RAM.
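The arithmetic behind those limits is easy to check: a 16-bit address reaches 2^16 bytes (the 64 KB limit), a 32-bit address reaches 2^32 bytes (the 4 GB limit), and a 64-bit address reaches 2^64 bytes, far more than any machine of that era could use. A couple of lines of Python make the point:

```python
# Flat address space reachable with a given pointer width: 2**bits bytes.
for bits in (16, 32, 64):
    print(f"{bits}-bit: {2**bits:,} bytes addressable")

# 16-bit: 65,536 bytes addressable             (the old 64 KB limit)
# 32-bit: 4,294,967,296 bytes addressable      (the 4 GB limit)
# 64-bit: 18,446,744,073,709,551,616 bytes addressable
```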
Even now, 32-bit programs are common. The operating system has to be 64-bit (or support PAE) to use all the RAM, but most programs don’t need it. I recently encountered a program that was 64-bit for no good reason, and a memory leak caused it to crash the entire computer. And then there are other reasons for 32-bit, especially the binary plugins to web browsers and Office, that keep those programs 32-bit.
Well, let’s see:
1. Even as you predicted the death of IBM, you went right on giving credence to the notion (which you had already conclusively proven, in earlier chapters, to be false, but which everyone else in the business still believed at the time) that what happened to and with IBM still mattered to anybody but IBM. The de facto definition of “PC” was already set by Microsoft as you were writing this, as would be clear to everybody by the time of Win95, but which at that point in history was clear only to you, and you chickened out of saying so.
2. I’ll give you a pass on Apple. Apple did indeed decide to move to being primarily a seller of bits and merely incidentally a seller of computers on which to consume those bits. And then it decided not to. And then it decided to do that again. It seems to be working out better for them this time than it did the first.
3. The word “internet” doesn’t appear. It obviously wasn’t on your radar, which would be more forgivable if it weren’t for what your _next_ book was about. It wasn’t on anybody else’s radar either. But the absence of any consideration of the biggest thing to hit the computer industry since (depending on how you measure things) the Apple II, the Altair, or the semiconductor…well, it does make the whole story rather amusingly quaint-sounding.
4. Japan. Yeah…well, like with the internet, you’re not the only person who blew that one.
5. Voice interfaces? Really? No thought at all given to the fact that trying to get work done in an office full of voice-interface computers would be kind of like trying to have a deep philosophical conversation in a crowded discotheque or on the floor of the stock exchange?
But overall? Not as far off as one might expect.
Well, IBM is not entirely irrelevant in the PC space. Even though Taligent turned out to be a flop, the PowerPC processors ran in Macs until 2006, and all 3 major gaming consoles of the last generation use IBM processors.
In 1991, IBM was becoming irrelevant, but they had so much presence in business that it was prudent to keep up with what they were doing.
Ugh, that picture of a most odious actor destroying a very good Philip K. Dick story in a mediocre movie adaptation almost ruined this whole chapter for me. I give 8 out of 10 on the predictions, and I liked the Y2K talk before anybody even had it on their radar screen.
I have no idea how many readers near my age (65) you have, but I suspect every one of us appreciates the virtual memory (and memories) you have provided through this reprinting. Many thanks for doing so. Good luck with your new startup site.
Liberal Cartoons == Moral Vanity
High-handed, holier-than-thou. Bob, I love you, but I can’t abide the idea of Doonesbury being a moral arbiter, because it’s just a self-justifying high horse.
What is so hard about those German words …
It is Andy von Bechtolsheim (no need for that second “c” in his name), and Motorola headquarters are in Schaumburg (foam castle), not Schaumberg (foam mountain, for those of you who like to have a giggle over literal translations).
Replace “Japanese” with “Taiwanese” in the last paragraph, and you are spot-on.
Although by your original prediction they’re very late (14 years), wristwatch computing has started to appear. (Pebble / iWatch)
Just a failure to foresee the death of the wristwatch.
The phone didn’t move into the watch, the watch moved into the phone.
Great insight from a tech prophet. Spot on on multiprocessors, SystemOnChips, the PC in devices such as watches (mobile today), content against hardware, etc.
And on Open Source. Not long ago it was considered a cancer and not the American Way.
I think the issue there was computers still had a lot of speeding up to do.
In the early years, computers were bought to do only one or two things. It was easy to imagine computer speed stalling, while Moore’s Law would put the same power into a wristwatch.
Instead, software has become more demanding. Processor makers have continued to use the smaller transistors to make faster processors. More transistors allow them to make ever-deeper pipelines and wider dispatch units, and larger amounts of cache to handle the processor being so much faster than the RAM. Cringely thought of using a 486 in 2000; I tried, and it was not a nice experience.
I guess now they’re finally moving back into integration. The latest CPUs from AMD and Intel both have memory controllers and graphic processors supporting OpenCL, what used to be separate chips. Cell phones and a lot of small devices use Systems on Chip, though usually still with separate memory. Like the death of IBM, it’s just taking a bit longer than expected.
Obviously predictions for the future in the new book would start from where we are now. Some candidates include google glass and or a watch, gestures in the air, some sort of interface that looks at brain waves or electrical fields from muscle twitches that don’t move enough to cause gestures in the air. Those all seem better than voice to me, but I’m not much good at seeing the future.
The cloud and privacy deserve coverage.
More and more, I think the 3d printers and the ‘makers’ interfacing simple programs with unusual combinations of hardware are actually going to be important. The cottage industry of the present or near future.
I think we underestimated how much software would bloat.
For example, back in the day, Emacs was the ultimate bloated text editor. (EMACS unofficially stood for “Eight Megabytes And Constantly Swapping.”) Sure, you could edit text in 1MB of RAM on the Lisa, but Emacs is not just text. Emacs includes its own LISP interpreter that can run literally any program. It can be programmed with syntax for any source code language, and it comes with its own copy of Eliza. A couple of people even wrote web servers for Emacs, so you can edit and publish online using a text editor.
Programmers make programs that take all available resources until they run into some restriction. Now we’re finally having real restrictions: The power wall that keeps CPU clock speeds from rising fast, leading to multicore and heterogeneous computing; and mobile computing, that punishes programmers for killing devices’ battery lives.
A lot of the bloat is there to make programmers’ lives easier. It started with structured programming (Go To Statement Considered Harmful), but it also includes object-oriented programming, garbage collection, and standard libraries. That’s not to mention programmers being sloppy and using inefficient data structures, or efficient data structures in inefficient ways.
Wikipedia also includes this: “Some programmers, such as Linux Kernel designer and coder Linus Torvalds or software engineer and book author Steve McConnell, also object to Dijkstra’s point of view, stating that GOTOs can be a useful language feature, improving program speed, size and code clearness, but only when used in a sensible way by a comparably sensible programmer.” http://en.wikipedia.org/wiki/Goto
This is not a comment on the book.
Bob, are you aware that your site loads significantly slower since the web lamb icon appeared on it? It can sometimes take up to 10 sec to load, and this happens on every page. Even posting a comment takes much, much longer than before.
I don’t know when the icon first appeared since I rarely scroll all the way down, but I associate the slow loading and not-immediately-visible-comment problems with the new website that dates back at least a couple months to the start of the AE book review. Let’s bring back the old Christmas Card website…800 responses in a few weeks and never a hiccup.
Do you have a source for the alleged “insurmountable opportunity” Pogo comic in the intro? I have looked for it for years and have never actually found it. I wonder if it is apocryphal. It’s a cute story, but I wonder if you or a fact-checker could verify it.
X you have excelled yourself!
Prescient, predictive prophetable but profitable?
I wonder if S Jobs read this when it was published to get his vision?
Pogo was the best political strip ever — Pogo in Oz for the ’56 Olympics and he got the Ozzie character 100% correct.
I’ve grown to love natural evolution, and the government should play a part — storing the unsuccessful computer code in an archive of the damned. A repository for future “year 2000” problems solved.
ARM was like human predecessors wandering amongst the dinosaurs. No one saw it coming, not even you, X!
What I’ve noticed is the specter of Apple in all the great advancements.
iOS App store is exactly what you wanted 20 years ago.
I don’t think he was necessarily wrong about Steve Jobs being the leader of the next generation of personal computing. PCs were only beginning to catch on with the mainstream in 1992, in my opinion. I think from that reference, Bill Gates could have been considered the next leader, and thus making Cringely’s statement correct. Also, I think you could make an argument for widespread internet use being a generation of personal computing. Jobs was not the leader of this, either. Though, he definitely was the leader of the generation after that (mobile devices).
From what I can see, most of these predictions were spot on in the broad strokes and incorrect in the details. Turns out NextSTEP was **far** more viable than Pink, but only because Jobs could later leverage Apple again to turn NextSTEP into MacOS X, and now, iOS. Turns out CISC was the future of workstations and PCs, but RISC is surging from below like x86 once did and ARM could well become the basis of low-end server hardware running something like Android Server within a decade and might well be the basis of high-end server hardware within two decades, slowly cooking Intel and AMD’s goose (but perhaps very slowly). The mainframe has proven stubbornly persistent, partly due to the Unix Wars and partly due to the failure of any RISC standard before ARM to become predominant, but may yet fade as corporate software gets rewritten on distributed commodity systems (whether they turn out to be x86 based or ARM based). Much of what’s predicted here will take much longer than was thought in ’92, but it will probably come to pass eventually, in some fashion.