The founders of the microcomputer industry were groups of boys who banded together to give themselves power. For the most part, they came from middle-class and upper-middle-class homes in upscale West Coast communities. They weren’t rebels; they resented their parents and society very little. Their only alienation was the usual hassle of the adolescent—a feeling of being prodded into adulthood on somebody else’s terms. So they split off and started their own culture, based on the completely artificial but totally understandable rules of computer architecture. They defined, built, and controlled (and still control) an entire universe in a box—an electronic universe of ideas rather than people—where they made all the rules, and could at last be comfortable. They didn’t resent the older people around them—you and me, the would-be customers—but came to pity us because we couldn’t understand the new order inside the box—the microcomputer.
And turning this culture into a business? That was just a happy accident that allowed these boys to put off forever the horror age—that dividing line to adulthood that they would otherwise have been forced to cross after college.
The 1980s were not kind to America. Sitting at the end of the longest period of economic expansion in history, what have we gained? Budget deficits are bigger. Trade deficits are bigger. What property we haven’t sold we’ve mortgaged. Our basic industries are being moved overseas at an alarming rate. We pretended for a time that junk bond traders and corporate disassemblers create wealth, but they don’t. America is turning into a service economy and telling itself that’s good. But it isn’t.
America was built on the concept of the frontier. We carved a nation out of the wilderness, using as tools enthusiasm, adolescent energy, and an unwillingness to recognize limitations. But we are running out of recognized frontiers. We are getting older and stodgier and losing our historic advantage in the process. In contrast, the PC business is its own frontier, created inside the box by inward-looking nerds who could find no acceptable challenge in the adult world. Like any other true pioneers, they don’t care about what is possible or not possible; they are dissatisfied with the present and excited about the future. They are anti-establishment and rightly see this as a prerequisite for success.
Time after time, Japanese companies have aimed at dominating the PC industry in the same way that similar programs have led to Japanese success in automobiles, steel, and consumer electronics. After all, what is a personal computer but a more expensive television, calculator, or VCR? With the recent exception of laptop computers, though, Japan’s luck has been poor in the PC business. Korea, Taiwan, and Singapore have fared similarly and are still mainly sources of cheap commodity components that go into American-designed and -built PCs.
As for the Europeans, they are obsessed with style, thinking that the external design of a computer is as important as its raw performance. They are wrong: horsepower sells. The results are high-tech toys that look pretty, cost a lot, and have such low performance that they suggest Europe hasn’t quite figured out what PCs are even used for.
It’s not that the Japanese and others can’t build personal computers as well as we can; manufacturing is what they do best. What puts foreigners at such a disadvantage is that they usually don’t know what to build because the market is changing so quickly; a new generation of machines and software appears every eighteen months.
The Japanese have grown rich in other industries by moving into established markets with products that are a little better and a little cheaper, but in the PC business the continual question that needs asking is, “Better than what?” Last year’s model? This year’s? Next year’s? By the time the Asian manufacturers think they have a sense of what to aim for, the state of the art has usually changed.
In the PC business, constant change is the only norm, and adolescent energy is the source of that change.
The Japanese can’t take over because they are too grown up. They are too businesslike, too deliberate, too slow. They keep trying, with little success, to find some level at which it all makes sense. But that level does not exist in this business, which has grown primarily without adult supervision.
Smokestacks, skyscrapers, half-acre mahogany desks, corporate jets, gray hair, the building of things in enormous factories by crowds of faceless, time-card-punching workers: these are traditional images of corporate success, even at old-line computer companies like IBM.
Volleyball, junk food, hundred-hour weeks, cubicles instead of offices, T-shirts, factories that either have no workers or run, unseen, in Asia: these are images of corporate success in the personal computer industry today.
The differences in corporate culture are so profound that IBM has as much in common with Tehran or with one of the newly discovered moons of Neptune as it does with a typical personal computer software company. On August 25, 1989, for example, all 280 employees of Adobe Systems Inc., a personal computer software company, armed themselves with wastebaskets and garden hoses for a company-wide water fight to celebrate the shipping of a new product. Water fights don’t happen at General Motors, Citicorp, or IBM, but then those companies don’t have Adobe’s gross profit margins of 43 percent either.
We got from boardrooms to water balloons led not by a Tom Watson, a Bill Hewlett, or even a Ross Perot but by a motley group of hobbyist/opportunists who saw a niche that needed to be filled. Mainly academics and nerds, they had no idea how businesses were supposed to be run, no sense of what was impossible, so they faked it, making their own ways of doing business—ways that are institutionalized today but not generally documented or formally taught. It’s the triumph of the nerds.
Here’s the important part: they are our nerds. And having, by their conspicuous success, helped create this mess we’re in, they had better have a lot to teach us about how to recreate the business spirit we seem to have lost.
Bob,
Wow! Where is everybody?
Nerds finally found a place to excel and make tons of money, if they had the drive to get the needed things done. Normally nerds are overshadowed by either physically focused people or socially focused people. In other words, the popular people ruled the roost. The triumph of the nerds was a major change in the natural order. It is yet to be determined how long the new era will last.
“As for the Europeans, they are obsessed with style, thinking that the external design of a computer is as important as its raw performance. They are wrong: horsepower sells. The results are high-tech toys that look pretty, cost a lot, and have such low performance that they suggest Europe hasn’t quite figured out what PCs are even used for.”
Shh… Don’t tell Steve Jobs.
Turns out the Europeans were not wrong, just ahead of their time. Same for Mr. Jobs.
Once computer performance became adequate for what it’s used for most (today, mostly accessing the internet, where the connection is the bottleneck anyway), other factors increased in importance: design, size, portability, user interface, usability.
The common misconception is to equate “design” with only “how it looks”.
Yeah, anyone else remember the NeXT? It was a great machine — but it was too expensive and, more importantly, too far ahead of its time.
Looks like Steve learned from his mistakes, but didn’t give up the Vision.
(Notice how Nerds call him “Steve” and not “Mr. Jobs”?)
Yet computer performance is still not adequate (and with tablets takes a step backwards) for ever more complex 3D worlds (MMO games – not only fantasy games, but also large-scale simulations – that can be used for gaming [FPSs, flight sims, etc.] but also for modeling our next steps into new frontiers, and for creative expression [machinima] at a fraction of the cost of live film on location). Each successive advancement in PC capability led to more complex and realistic simulations; the fidelity was on an upward climb – but without PCs to create or consume on, will that continue?
For this activity to continue, it will require continued development of more advanced and cheaper PCs – otherwise these capabilities may be priced out of reach of the average young (and young-at-heart) hobbyist, both developers and users of the technology, and, as Robert asserts here, the adolescent energy needed will find it harder to make headway on anything more complicated than a website or tablet application.
If the age of high-performance PCs is passing, then what will replace it? We already know that the thin client will not be able to support the rich capabilities of dedicated high-performance systems. Without significant advancements in technology, I only see a proliferation of ‘Angry Birds’-quality applications, and no real progress in performance and the resultant rich environments that allow the quality of the experience to increase.
Is anyone thinking about what we are losing by moving in this direction today?
Good point. That’s progress though. Are there any young hobbyists today who can take apart and rebuild a modern car engine as could be done in the 60s?
Dutch company ASML, a Philips spin-off, is the largest chip machine manufacturer in the world. I think they know something about performance.
In no real hurry to dust off the beige towers of yore.
I think Steve Jobs knew what PCs were used for. Early on I’d say his eyes were bigger than his stomach, which is a lot different than being clueless.
What has changed is that horsepower no longer sells as much as it did back then.
Europeans were bit players in the home computer revolution, even if you count Sinclair computers like the ZX81 (http://en.wikipedia.org/wiki/ZX81). The main action, though, was in the United States.
I wouldn’t say the Europeans were obsessed with style. I certainly can’t think of any “stylish” home or microcomputer by anyone until the Macintosh, which was designed to look user friendly.
The US has more of a tradition of inventors, risk takers, big winners and unsung losers. Europe is more cautious, satisfied to play second fiddle if the big risks can be avoided. Many firms made solid IBM-compatible PCs (Olivetti, Tandem spring to mind). Philips had its own home computer OS but that failed miserably. So other than Clive Sinclair, Europeans evolved the PC after the PC revolution took place in the US.
Regarding horsepower BTW, the Mini Cooper apparently ran rings around US muscle cars in the 70s and 80s (Source: Top Gear).
The Atari 800 had an appealing design. But a less appealing price tag.
But like you I thought the UK computers were either built to the lowest common denominator or only appealed to the educational market.
The BBC Model B was more expensive than any other ‘so-called’ home computer. A really utilitarian design. Dull as ditchwater. But the hardware/software integration was truly astonishing.
I bought the Atari. 😀
I have one in my garage. 🙂
The big change was not that computer users became interested in style, but that Jobs reinvented the computer itself as a fashion device. In the fashion space there is not only a primary emphasis on aesthetic issues, there is also an expectation that things will be frequently replaced, and that (what we would call) early adopters are very happy to pay a premium in order to be at the head of the fashion curve.
“Shh.. Don’t tell Steve Jobs.”
Whose designer is a European, of course.
New phase, new times.
Clearly it’s a generalisation, but a reasonable one at the time.
Nobody could claim that the Dragon 32, Sinclair Spectrum +2, or anything that Amstrad ever made were pretty, but it’s a fair description of the Oric Atmos, ZX80/81 & BBC Micro.
“Korea, Taiwan, and Singapore have fared similarly and are still mainly sources of cheap commodity components that go into American-designed and -built PCs.”
No mention of China in 1991, even as a source of “cheap commodity components.” In a generation they’ve gone from being not listed to designing and producing systems, not to mention hacking into our lists.
History – gotta love it. Or else.
“But we are running out of recognized frontiers.”
There is one frontier that you did quite well with once upon a time, and that would be difficult to exhaust – space – but America seems to have withdrawn from any particularly serious assault on it.
I once watched a space documentary that showed Russian technicians and engineers spending an afternoon just mounting their escape tower. The problem they had was that the tower wasn’t seating properly. The narrator indicated that it was off by only the width of a few sheets of paper, but that wasn’t good enough. Had it been deployed like that, its sideways component of thrust would have compromised the capsule, resulting in tragedy.
That’s when I realized that getting into space by rockets would never be cheap! They are the size of buildings, but require the precision of a watchmaker.
As expensive as it is likely to remain, though, I do support space exploration as the benefits are likely to be even greater.
Hopefully the next endeavour will be to safely dispose of some of the orbiting junk up there, before it causes a catastrophe in space or down here.
So, now that computer chips are too small for the average nerd to dumpster dive and build their own and we’re outsourcing the American “expertise” to India, are we looking at the “maturation” of the industry? Is the computer industry “grown up”?
“We got from boardrooms to water balloons led not by a Tom Watson, a Bill Hewlett, or even a Ross Perot”
Is this entirely accurate? How much of Microsoft’s early success can be attributed to an IBM contract?
What’s really depressing is that most of the players have left the stage. The nerds have grown up or died. Tablets and phones are just shrunken versions of the PC, and the network has become the computer. What comes next? Neural implants so that we are on the net all of the time? Matrix-like virtual reality? Maybe I’ve read too much sci-fi?
You find now to be depressing? I find it exciting.
Okay, so, PCs have stagnated. Asian companies are catching up so handily that Lenovo will probably become the number one seller of traditional PCs, and is moving up in the top-10 smartphone list. Samsung, though, still doesn’t seem to have gotten the hang of it, with their terrible Exynos security bugs and UEFI bricking bugs. But with the long-awaited demise of Windows XP, I’m becoming a lot more optimistic.
The pioneers are going away, and it’s no longer within an adolescent’s ability to create a computer from scratch that is as useful, personally, as a computer from a major vendor. Actually, it long ago ceased to be economically viable to make an entire computer from parts, even on as crude a level as the parts you could buy from Newegg. But with smartphones and tablets, even that level of self-sufficiency is going away. We could wallow in sadness, but it’s better for your health to be optimistic about the new frontiers, such as the cloud.
Certain incremental improvements may technically bring nothing new, but after a certain point an improvement in degree enables a major change in the kind of work you can do. Analogy: instant compile times enable rapid write-compile-test cycles. That’s what I think happened with the improvements in computing and networking that have led to the cloud. There are advances in management systems and programs that you can execute using the cloud. There are advances in computing power, with deployments of CPUs and GPUs that you could potentially use. SDN and virtualization are replacing traditional notions of networks and servers. Netflix is becoming a major force on the Internet using practically none of its own hardware, instead using capacity rented from Amazon and Akamai and Level3. I’d love to see what comes next.
Oh, certain people are excited about mobile apps, too. Just computers, but they’re finally powerful enough to do a lot of different things, and small enough to have with you all the time, and power-efficient enough to have running all the time.
I wouldn’t worry much about the future. Even though you and I can’t think of what comes next, I’m sure someone will.
I guess it’s down to the individual. But I like the ability to incrementally upgrade parts of my PC and keep it useful. Average processor speeds topped out 6 or 7 years ago; they’ve just gone multi-core since then.
I bought a copy of XP 64-bit for an MSI motherboard, and today could easily locate a replacement on eBay for about $40. Ditto the CPU.
The rest of the hardware is replaced as and when, usually for less than spec’ing out a completely new build.
“They didn’t resent the older people around them—you and me, the would-be customers…” I think you can no longer presume that your readers will be older than The Nerds.
“… newly discovered moons of Neptune…” Not so new anymore.
Tehran? Do the American masses know what/where Tehran is?
For all that the world has changed since the ’80s, these are truths that are even more true today, Bob.
“The 1980s were not kind to America. Sitting at the end of the longest period of economic expansion in history, what have we gained? Budget deficits are bigger. Trade deficits are bigger. What property we haven’t sold we’ve mortgaged. Our basic industries are being moved overseas at an alarming rate. We pretended for a time that junk bond traders and corporate disassemblers create wealth, but they don’t. America is turning into a service economy and telling itself that’s good. But it isn’t.”
Bob, I am kinda surprised you did not set up a wiki for this project. Make us register in order to edit. You can roll back unacceptable changes. Wikis (especially MediaWiki) are a fantastic value.
OK, I’ve read along for a week now and something stands out: lots of outdated references.
If you really plan to bring out an updated Accidental Empires, it will need footnotes. Many, many footnotes.
I’m beginning to wonder if it is even possible to “update” history. Reminds me of the title of my high school Latin book “New Latin”.
The Japanese have fallen silent since the console wars ended – around the turn of the millennium.
No more Sega or Dreamcast. Just more boring episodes of Microsoft vs. Sony.
I for one really miss the Japanese game titles.
In the time you are talking about, the Japanese didn’t only build a little better and a little cheaper; their quality was so much higher than the US’s that the US chip industry almost went out of business. I was visiting Fairchild one day when a tech dropped a wafer on the floor; unfortunately it didn’t break, so she put it back in the boat to contaminate the fab for endless time, until someone got desperate and cleaned everything. Japanese wafers, on the other hand, were absolutely pristine: no visual defects, for one thing, though that wasn’t even the most important difference.
When I read these pages, I’m struck that the tense seems so wrong, and nearly every verb needs to be changed.
The trends that were important then have morphed significantly. Manufacturing was taken abroad by foreign competitors then; now it is sent abroad by US firms. There are still new frontiers, which weren’t even possible ideas until the PC and its mobile descendants were ubiquitous – think social networking, Twitter, etc. Nerds are still triumphing, and there are a lot of parallels with the 80s that should be drawn for the rewrite to be relevant to today.
A real keyboard on a PC beats the iPad touch keyboard with autocomplete, eh?
LOL – Modern stuff eh?
I can’t speak for the Apple world, but generally autocomplete and physical keyboards are separate concepts. One doesn’t add or remove the other. Personally I like physical keyboards and large displays, but having both means a thicker device (e.g., UMPCs and laptops have both, but they’re thick since the keyboard and screen are on top of each other while being carried).
“They didn’t resent the older people around them—you and me, the would-be customers—but came to pity us because we couldn’t understand the new order inside the box—the microcomputer.”
In those days it really was possible for a nerd to fully understand everything that made up a microcomputer, including the software. These days it’s impossible, and nerds have to specialise in their particular area of expertise. Things were a bit more exciting in those days for hackers and tinkerers.
I guess that’s down to the individual and how early in the micro-era they got started.
But it’s not as hard to pick up as it was 20-25 years ago. Back then we had to faff about with multiple options via jumpers on motherboards and disk drives, plus manually setting IRQs and DMA channels. These days you’d be lucky to find more than 2 or 3 hardware options on a modern PC board.
It’s pretty much [auto]set and forget nowadays.
There’s more to learn, in the sense of ‘historical’ changes. But most of it *is just that*, history. Not really important.
Sure you could learn about stuff right down at the electronic component level, but that’s more to do with wanting to get into the hardware design or perhaps writing device drivers. I don’t believe it’s required knowledge for overclocking either.
Today’s overclockers are more akin to heating engineers.
In Adobe’s current incarnation, do they still have water fights? I’ll bet not!
Though this was written before it happened, the dot-com bust took a lot of the play out of us. We had fun, as a rule, up to that point. Worked hard and played hard. The dot-com boom had us dreaming. Then came the bust, and we got blamed for the economic troubles. And we bought into it.
There is still some of that “fun” popping up lately – say, the last 3 years. A few start-ups make a point of being a fun place to work, but it’s sometimes more calculated, to entice the “resources” to get more work done.
Brings back a stray memory from the early 1990s of calling Adobe to unlock some fonts and chatting with the fun-sounding woman in the service department: “Great weather here, all my co-workers are outside playing volleyball…”