We stand today near the beginning of the post-PC era. Tablets and smartphones are replacing desktops and notebooks. Clouds are replacing clusters. We’re more dependent than ever on big computer rooms, only this time we not only don’t own them, we don’t even know where they are. Three years from now we’ll barely recognize the computing landscape that was built on personal computers. So if we’re going to keep an accurate chronicle of that era, we’d better get to work right now, before we forget how it really happened.
Oddly enough, I predicted all of this almost 25 years ago as you’ll see if you choose to share this journey and read on. But it almost didn’t happen. In fact I wish it had never happened at all…
The story of Accidental Empires began in the spring of 1989. I was in New York covering a computer trade show called PC Expo (now long gone) for InfoWorld, my employer at the time. I was at the Marriott Marquis hotel when the phone rang; it was my wife telling me that she had just been fired from her Silicon Valley marketing job. She had never been fired before and was devastated. I, on the other hand, had been fired from every job I ever held, so professional oblivion seemed part of the package. But she was crushed. Crushed and in denial. They’d given her two months to find another job inside the company.
“They don’t mean it,” I said. “That’s two months’ severance. There is no job. Look outside the company.”
But she wouldn’t listen to me. There had to be a mistake. For two months she interviewed for every open position but there were no offers. Of course there weren’t. Two months to the day later she was home for good. And a week after that she learned she had breast cancer.
Facing a year or more of surgery, chemotherapy and radiation that would keep my wife from working for at least that long, I had to find a way to make up the income (she made twice what I did at the time). What’s a hack writer to do?
Write a book, of course.
If my wife hadn’t been fired and hadn’t become ill, Accidental Empires would never have happened. As it was, I was the right guy in the right place at the right time and so what I was able to create in the months that followed was something quite new — an insider view of the personal computer industry written by a guy who was fired from every job he ever held, a guy with no expectation of longevity, no inner censor, nothing to lose and no reason not to tell the truth.
And so it was a sensation, especially in places like Japan where you just don’t write that Bill Gates needed to take more showers (he was pretty ripe most of the time).
Microsoft tried to keep the book from being published at all. They got a copy of the galleys (from the Wall Street Journal, I was told) and threatened the publisher, Addison-Wesley, with being cut off from publishing books about upcoming Microsoft products. This was a huge threat at the time and it was to Addison-Wesley’s credit that they stood by the book.
Bullies tend to be cowards at heart, so I told the publisher that Microsoft wouldn’t follow through, and they didn’t. This presaged Redmond’s “we only threatened and never really intended to do it” antitrust defense.
The book was eventually published in 18 languages. “For Pammy, who knows we need the money,” read the dedication that for some reason nobody ever questioned. The German edition, which was particularly bad, having been split between two different translators with a decided shift in tone in the middle, read “Für Pammy, weiß, wer ich für Geld zu schreiben.”
“For Pammy, who knows I write for money.”
Doesn’t have the same ring, does it?
The book only happened because my boss at InfoWorld, the amazing Jonathan Sacks (who later ran AOL), fought for me. It happened because InfoWorld publisher Eric Hippeau signed the contract almost on his way out the door to becoming publisher at arch-rival PC Magazine.
Maybe the book was Hippeau’s joke on his old employer, but it made my career and I haven’t had a vacation since as a result. That’s almost 24 years with no more than three days off which probably in itself explains much of my behavior.
Accidental Empires is very important to me and I don’t serialize it here lightly. My point is to update it and I trust that my readers of many years will help me do that.
Join me for the next two months as we relive the early history of the personal computer industry. If you remember the events described here, share your memories. If I made a mistake, correct me. If there’s something I missed (Commodore, Atari, etc.) then throw it in and explain its importance. I’ll be with you every step, commenting and responding in turn, and together we’ll improve the book, making it into something even more special.
And what became of Pammy? She’s gone.
Change is the only constant in this — or any other — story.
Funny, Bob. Very funny.
Sometimes I’ve wished I was born a little bit later, in order to have computers introduced into my life earlier. Yet now I see my kids playing with their tablets and being practically oblivious to the filesystems and working components in them. And I realize now I have been extremely lucky to have matured with the industry.
Agree, it’s almost like having been alive during the industrial revolution, to have seen the automobile and airplane transform civilization by breaking the rules of physical human limits. In our lifetimes we have seen (some) limits of the human brain superseded, as well as the notion of physical information transfer and storage utterly blown away. I just wish I’d kept that old 240 baud modem my dad brought home in 1982, the one that actually cradled the phone receiver.
They were called acoustic couplers. I used a 110 baud unit to timeshare; 240 would have been pure luxury. I actually don’t remember 240, though; the next step up was to 300, as I recall.
You don’t know how lucky you were. In the late ’70s my school had an acoustic coupler which ran at 0.2 baud and was connected to a mainframe at a nearby university. I tried to download a clip of Sylvia Kristel taking her clothes off but they ripped out the whole system and replaced it with a PC before Sylvia had unbuttoned her shirt.
My timing was just about right. I was in 11th grade when Popular Electronics published an article called ‘Build Your Own Microcomputer for $80’. I had to read that article six times before it all sank in, but within a few weeks, I had built my very own RCA 1802 based computer.
http://en.wikipedia.org/wiki/COSMAC_ELF
It only had 256 bytes of memory, eight toggle switches for binary input, an LED, and a two-digit hexadecimal display. My first program was a reflex tester, written in machine code.
By today’s standards, that computer wasn’t much, but to me it was one of the most amazing machines I have ever owned. Back in 1976, everything to do with microcomputers was new and exciting.
Sometimes, I miss those days.
Lost my wife to breast cancer at that same time.
Been reading you the entire time.
I think I’ll have to dig out the book, time for a re-read. Looking forward to seeing more, Bob.
Yep, you sure did miss Commodore! Your book feels incomplete without the story of the Commodore PET, the C64, and the Amiga. The Amiga in particular was groundbreaking, and you hardly mentioned it.
Need to fix this.
So fix it! When we get to the parts where it belongs, please tell us your stories.
I’ll do my best! 🙂
There’s a really interesting interview with Dave Haynie (it’ll be on YouTube for sure). It will fit right in with this adventure.
If you really want to know the Commodore story, I recommend that you pick up Brian Bagnall’s excellent book “Commodore: A Company on the Edge”. Bagnall does a fantastic job placing the company in its historical context while providing a comprehensive account of the people and the incredible products they invented. There is just so much to be told that it would be an injustice to try to edit it down and shoehorn that company’s story into Accidental Empires.
ROBERT X. CRINGLEY – Archiver of the PC Age.
It has a nice ring to it, and I’m glad someone with skillz is doing it right. History tends to be written by the winners, and forgotten by everyone else. It’ll be nice to see an unbiased book that records the details and how they actually occurred.
Not to quibble, but I’d like to offer another constant in addition to change – that there were always alternatives to what happened. As Pulitzer Prize winning biographer Martin J. Sherwin stated in 2008: “No historical event ever had to happen the way it happened. The counter-factual, the “could have been” in history is a unique and essential aspect of human intelligence. And it should always be recognized that options existed. Alternatives always have been available, and they always will be available.”
One of the great lessons you’ve documented with AE and Triumph of the Nerds is that the “woulda, coulda, shouldas” do indeed historically matter. Your telling of the Xerox PARC story is Exhibit A to that point.
I used to hang out at a computer store back in the mid-1970s. It had all the big names of the day on display. Imsai. Northstar. A Sol 20. Maybe even an Altair. I was born in 1966 and this was absolute heaven. Later, after the TRS-80 came out, I hung out at the much closer Radio Shack after school, often until closing.
Like you I lived this history.
I programmed the 4004 and 8008 with hand-assembled code, hand-entered into an EPROM programmer from ProLog.
I didn’t jump on the Altair bandwagon. I bought a Sphere, which was 6800 based.
I can give you the timings for the standard quick benchmark of the day:
For i=1 to 1000
Next i
Altair 680b BASIC (on the 6800) was actually a little faster than the Intel 8080 version when the CPUs were running at their maximum clock rates of the time: 3.2 seconds vs. 4.0 seconds.
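For anyone who never typed one of these in, here’s a minimal sketch of how that benchmark was usually run, assuming a typical period BASIC; the line numbers and PRINT markers are illustrative additions, not the commenter’s original listing, and the timing was done with a stopwatch:

10 PRINT "START"
20 FOR I=1 TO 1000
30 NEXT I
40 PRINT "DONE"

The elapsed wall-clock time between the two messages gives figures like the 3.2 and 4.0 seconds quoted above.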
The first Microsoft BASIC was really tight code. It would run in 4K of memory. The BASIC interpreter took up about 3200 bytes of that, leaving about 800 bytes for the user’s program. At the time a floating-point subroutine package with transcendentals took up about 1600 bytes. The rest of the BASIC interpreter was done in the remaining 1600 bytes!!!
Later I used 6809 stuff from Southwest Technical Products. They had Unix-like operating systems called UniFLEX and Microware’s OS-9.
Lots of history here.
I look forward to helping with your project.
The 6800-based Altairs, which came out near the end of the company’s run, were very quick. When we visited MITS founder Ed Roberts in 1995 he had standardized on those almost 10-year-old machines to run his medical office.
On the other hand, I just yesterday bought a 1.2-GHz four-core ARM-based Ubuntu computer in a USB stick with one GB of RAM and 8 GB of non-rotating storage for $93! The good old days are, in many ways, right now.
[…] Update, Feb. 5: Due to the time-zone difference between Brazil and the USA the first chapter went up a bit late for us, but it is now online. […]
Regarding the German translation: “Für Pammy, weiß, wer ich für Geld zu schreiben.” Was this really what they wrote in the book? Because it isn’t even proper German.
This roughly translates to “For Pammy, knows, whom I for money write” (the German sentence doesn’t really make any sense, and the commas are wrong too).
“For Pammy, who knows I write for money.” would be something like: “Für Pammie, die weiss dass ich für Geld schreibe”
“For Pammy, who knows we need the money,” would be something like : “Für Pammie, die weiss, dass wir das Geld brauchen”.
It’s hard to believe that the German edition was so bad that it contained a sentence that isn’t even correct German.
Well, I had a deep look in the basement, and there, near Brooks’ mythical man-month, I found my 1993 German edition of AE – but it seems to be correct:
“Für Pammy, die weiss, dass ich für Geld schreibe”
A piece of advice: just leave out that “dass” if you type it into Google Translate.
Bob! It is so exciting – a continuation of AE! YES 🙂 *happy*
Oh folks, I envy you; most of you took part in that computer revolution at a time when I was not even born (1979) 🙁 *sigh*
With or without the “dass,” Google says he writes for money instead of making the point he wanted to make: that Pammy knows “we need the money.”
In the Brazilian edition, our local translator rewrote the beginning of chapter three (Role Models), adapting the 1913 football game between Notre Dame and Army into a 1958 FIFA World Cup match between Brazil and the Soviet Union.
He did the same thing in chapter thirteen (Economics of Scale), transforming the baseball scenario into a soccer one and replacing Rick Miller’s role with two players: Romário and Roberto Baggio.
The result was quite a mess but — you know — soccer is quite popular here in Brazil. :->
I’m surprised that we’re on the topic of translations but there’s an explanation for this. The better translations (better in terms of book sales) tended to do this. The book sold better in Japan than it did in the USA in large part because ASCII, the Japanese publisher, used a successful novelist as translator. Another edition that did quite well was the French where they turned it into a large format illustrated paperback. The French translator’s day job was working for the government office charged with protecting the French language from foreign influence! In all these cases the translators tried to localize the more arcane analogies and, while odd from an American perspective, it generally helped those foreign editions.
Recently resurrected my TRS-80 Model 1 (circa 1977) for nostalgic reasons and wanted to do some Z80 code in TBug… it was an amazing process to witness.
I love Accidental Empires; a fellow worker of the time (1994) sent it to me from a store in The Netherlands.
I read it many, many times. A precious portrayal of an extraordinary time, and written with style. I had thought about the need to continue the book, since it ends right before the Internet. I’ll write something about the ’90s and send it for you to consider; maybe it will help somehow.
And thanks for writing it !
So excited to see this update develop. Thanks, Robert X!
In Ireland in the early 80s, the thing to have (for geeks anyway) was the Sinclair Spectrum, 48K memory! The US had it later under the Timex brand.
I was going to say “If it’s for Pammy, of course we’ll help!” but then I read this:
“And what became of Pammy? She’s gone.”
I thought you said your wife is “Pammy”? Sorry, some geeks are just too dense to trust … ;(
It’s funny how life turns out. If you were an InfoWorld reader you knew Pammy as a spunky woman 15 years younger than me. The original Pammy died of breast cancer many years ago, though after I was fired from InfoWorld in 1995. We never had children because the estrogen spikes of pregnancy are deadly for those suffering from breast cancer. My present wife is also a spunky woman 15 years younger than me, but we have three young sons. Now you know why I got such a late start as a father.
Ah, okay — so it’s for both Pammys. My condolences (a wee bit late).
I started in the computing business as a 14-year-old kid working on Dad’s US Navy IBM 1130, writing FORTRAN II code on 80-column cards on a beautiful island in the Caribbean, waiting for WW3. Then I became a nuclear engineer writing weapons and reactor design code on punched cards and teletypes. When I joined IBM, I was sent to unit record schools and was the last systems engineer trained on card sorters and unit record accounting, before moving to System/360, System/3, and much of the other “iron” of the time. Today, I’m still writing code for Android, QNX and iOS.
It’s been quite a ride, I agree. Along the way I met you at Three Mile Island, I wrote code for the Space Shuttle, turned down Bob Metcalfe’s offer to join him at 3Com, turned down the chance to help Bob Young start Red Hat, and just couldn’t bear the thought of going West to work for those small firms called Apple and Microsoft. I cherish the memory of the sales call when I was told by Armonk to go meet Jobs and try to save the IBM S/38 that ran the Apple internal accounting systems. I was there when some big mistakes were hatched (like SAA) at the Blue Pig and loved it when Lou Gerstner called me “the most dangerous man in the room” in a room full of IBM executives and engineers. It’s been fascinating sitting on the sidelines and watching egos rise, crash and burn, and sometimes be resurrected to crash and burn again.
I know history needs to be written and not forgotten when a PhD in Computer Science recently came by my home office and said “That’s this?” to an 80 column that was next to a 96 column (20% more data and smaller) punch card as he admired my 1970’s bottle of “chad”…….Carry on Bob…..
“What’s This?” not “That’s This?”…..finger check!
Remember the battles between oval and rectangular holes? I’ll be counting on you to speak up when we get to the IBM chapters.
Oh God, yes! Now I won’t be able to sleep tonight because I’ll have nightmares of squares versus holes and optical card readers versus readers that used “real reader technology” like brushes.
I was a tainted Beamer with no chance to move up to the top early in my career. Besides not being tall and being Hispanic, I was a “Basic Systems” SE, with a competency in something nobody understood or wanted for career advancement….sensor systems and boxes like the IBM 51xx…..and inconsequential boxes like the System/3X and the PC…..
I guess I’ll be buried “Nine Edge Up”……
I have a copy of the book and plan to keep it until I am dead so that others might find it, just in case people start to think Bill Gates invented personal computers. I also think I have a VHS tape of the television program somewhere. You might have heard that the guy who wrote the Amiga operating system (multi-tasking in 256K) is still around, talking about the increasing complexity of computer systems, and has released his programming language REBOL (www.rebol.com) into the open source world.
Who is Pammy?
Any chance this will be available on Audible Unabridged sometime? Read by the author maybe?
When this process is complete we’ll definitely have an audio edition. There was a cassette edition many years ago but it was only 40,000 words. We’ll get the whole thing online by this summer with some way to allow listeners to choose which version they want to hear (1996 or 2013).
I played pool with Eric Hippeau at a MacWEEK party. He seemed like a squared-up guy.
The penultimate line in your story…. like a kick in the guts. I’m so sorry.
As to the computer history story, my Australian computer journey started in 1982 with the Z80 powered Microbee.
16 KB of RAM… using the family TV as a monitor. Storage = cassette tape.
http://en.wikipedia.org/wiki/MicroBee
Maybe this wonderful little piece of Aussie tech could rate mention in your book.
Are you aware the Microbee has been resurrected?
https://www.microbeetechnology.com.au/
A tad pricey compared to current technology, but a proper piece of history you can build yourself.
What about the old Heathkits? That was something missed in your book. I remember always wanting their HERO 2000 robot, their last, late product…
I’ve been programming since the late 70s and lived through much of what you wrote. I’m sure as I re-read it here it will bring back many memories and if any are useful I’ll post comments.
I work with a lot of students now and they’re always curious about what it was like to program computers “back in the day”. The notion of 16K, 32K or 64K of memory baffles them. They have no idea how any of us got anything to work. My attitude remains the opposite: why do we need so much memory to get anything done today? It’s baffling. I guess incompetent programming fills all available memory.
I enjoyed those early, heady days in the 70s and 80s when everything in the micro world was new. When you could build something by yourself and hold the entire program in your head. When you had to comprehend the hardware to get the most out of the machine. When lazy programming was punished. When you could talk to anyone at a trade show and know that they were pretty much as geeky as you were, and as interested in the nuts and bolts of computers. And when the media was more in tune with the programmers, understanding what they were doing and cheering them along the way.
It will be an interesting and hopefully enjoyable trip down memory lane. Reminiscing tends to be bitter-sweet I find.
And I’m so sorry about your loss. I had no idea when I read that dedication years ago. Cancer sucks.
It doesn’t seem like the desktop’s golden age is waning at the workplace. Each office or cubicle I glance into here (I walk around a lot) has somebody sitting in front of a big screen. I wouldn’t want to try to do real development or analysis work on a small-format display.
I always told people: read this series and you will understand what has come to be called “the computer industry”. I first read it when it came out and was so amazed by the revelations in it that I used to give this book to D.o.D. “procurement pukes” and tell them if they wanted to keep from being ripped off by the vendors they should memorize it. I was only on the ‘fringe’ of the PC world, having started off with Cromemco and their Unix derivative “Cromix” and then moved on to Sun Microsystems (may they rest in peace despite Oracle’s destruction of the OS and hardware). Wisely, Cringe steered clear of the Unix story in the book. jccampb
… and I worked at Cromemco as a member of a three-person team (Egon Zakrajsek, Boris Krtolica and myself) that “inherited” the original Z-80 based Cromix (designed by Roy Harrington and written by Roy and Ed Hall) and evolved it into a real 68000-based OS. I always felt that Cromemco was not adequately represented in Bob’s book.
I think the best part of that era is that as a high school graduate with no prospects for higher education I was able to get a job with decent pay right away working for a semiconductor company. The demand was so great they would hire anyone with a pulse back then. With some effort I advanced myself to the point I was making pretty good money the same time my friends were getting out of college with degrees and starting at the bottom. Even now the skills I was able to learn back then have kept me employed for 30 years and hopefully another 10 so I can retire at 62. I just don’t see this kind of opportunity for today’s youth who now need a four year degree just to be an administrative assistant.
Funny, didn’t know Google Translate existed back in the day…
Really excited about this project, Bob. I’ve always wondered what the story was with your ability to get interviews with all the top dogs of Silicon Valley. Did you cultivate relationships with them before writing the book? Or after? How did you get guys like Jobs, Gates, Ballmer, and Metcalfe to talk to you on camera? If MS was against the book why did their founders agree to do the filmed interviews? You must have some serious social engineering chops.
Gravity attracts. So does gravitas. Cringe got gravitas with the nerd readers, first with the rumor mill as Robert X. Cringely, and then AE. At this point, Dvorak is a B-lister, and as the “wise of the empire” need to shape discussions, they become available for interviews and perhaps even TV.
The only time I ever saw Bill Gates was when he was doing roadshows at computer clubs in the late ’80s, and I needed to bring a Zenith flat-screen monitor over to the auditorium so he could see what was going up on the projection screen. Four handlers in the green room, door open, and this guy is, as always, hunched over a laptop. The rest of the world didn’t exist. On stage, the presentation was all that there was, and then billg was gone with the staff. The local MS office guy handed out door prizes, and I wheeled my monitor back to the lab.
I think Bob Facebook-friended them, then did a bunch of the interviews over Skype. I heard he hit the jackpot and got the Steve Jobs interview after Steve popped up on Bob’s Chatroulette.
Great timing! …for me, at least. I just recently finally got around to reading Jobs’ biography, which led me to Steve Jobs: The Lost Interview, which caused me to dig out and watch my old copy of Triumph of the Nerds, which is what made me go to the bookshelf, find my old copy of Accidental Empires, and give it its first re-read since I bought it. Somehow nothing pushes my nostalgia buttons like computer industry history does.
Bob, I hope that as the AE corpus expands we do hear a little more about the interesting stories of companies and products that failed for one reason or another, or were consumed by others, or were too isolated from the rest of Silicon Valley. These stories are like the appendix in the body – curious dead ends.
Borland, Monolithic Memories, Ashton-Tate, Motorola Computers, etc. etc. are all stories that I would like to know more about.
Really looking forward to this, Bob. Thanks for doing it!
I read this volume years ago and it’s now proudly displayed in a box in the garage. Just kiddin’, Bob… I was a faithful Cringely reader who had to beg my town librarian to let me have first dibs on his copy of InfoWorld before it got to the public-access rack. What I didn’t know then was this:
1) My jazz buddy, poet-laureate Brad Leithauser, would finish Cranbrook (which we both attended along with a recent presidential candidate) in Bloomfield Hills, MI, and go on to play poker with Bill Gates and Steve Ballmer instead of attending class at Harvard.
2) Like me, Ballmer was a Ford brat but attended nearby Detroit Country Day School. Cranbrook regularly played DCDS in any number of league sports.
3) The little Jewish boy, Dirk “Derry” Kabcenell—that our Cranbrook Latin teacher hung out a second story window by his ankles—after a stint at PARC would go on to become Oracle employee #5. He retired after achieving the lofty position of VP & Chief Database Architect.
4) A hockey nerd one year my junior at Cranbrook would become CEO of Sun Microsystems. Scott McNealy is the guy’s name.
5) Being the younger brother of one of my former Cranbrook friends, John Fisher would also migrate to CA and, with a friend, become a principal in Draper Fisher Jurvetson, the extremely successful Silicon Valley VC firm.
6) Another Cranbrook hockey nut—this one three years my senior—William “Bing” Gordon would become a principal at Electronic Arts, later joining up with Kleiner Perkins Caufield & Byers, another major Sand Hill Road player.
7) Yet another Crannie would host “Crossfire” Sundays on TV. Later, Michael Kinsley was tapped by Bill Gates to become the editor of Slate. Now an official Redmondian (Redmondite?), there he would meet and marry Patty Stonesifer, one-time Windows chief who went on to run the Bill & Melinda Gates Foundation.
Oh, and how could I forget?
One day the latest Cranbrook “Traditions” alumni magazine shows up, featuring a full page devoted to a writeup on yet another Crannie grad made big-wig, Pandora’s Tim Westergren!
Oh, and before my time there, the father of ASCII, Bob Bemer, was also a Cranbrook graduate.
Jesus….
????
Makes me feel better about winning all those Metro Conf. tilts against Cranbrook (but not DCDS – they recruited better athletes!) back in my days at Mt. Clemens Lutheran North (when I was the first student to turn in a term paper on computer print out, in 1982, using a TRS-80 Model III & printer that couldn’t print below-the-line characters — i.e., raised j/g/y…). 😉
Oh God, yes.. those horrible dot matrix printers. I used WordStar on a TRS-80 Model II running CP/M to print my computing assignments on an IBM Selectric because it was much more elegant. This in a government computer centre where none of the rest of the staff understood or appreciated microcomputers.
Thanks for writing about the birth of microcomputers. There were lots of us hackers in the late ’60s and early ’70s who hungered for “our own computer” so we wouldn’t have to beg for a new compiler or share time with some “scientist” who couldn’t write his way out of a paper bag. Late-night computer centers with keypunch machines and a few dumb terminals were as close as we could get to the access to the power we craved. Some of us built Heathkit H-19 terminals and acquired acoustic modems before an individual could own them – yep, corporations could lease them but individuals couldn’t simply buy them. We memorized the specs of all the Motorola chips and jumped out of our skins when Zilog produced the Z-80 expressly for “computer hobbyists.” We built Heathkits, bought Commodores and Radio Shack TRS-80s with all kinds of strange attachments so we could own our own computers to do with as we pleased. My Heathkit H-89 had more compilers and interpreters than the research facility where I spent my daylight hours.
We were dumbstruck when IBM put the 8088 in the first “Personal Computer” instead of the 8086, but we bought them anyway and anticipated the 286 and 386 processors. We joined computer users’ groups like HUG, the Heathkit Users Group, and the IBM users’ group over at Caltech. We evaluated and reported back on every piece of software anyone sold for any of these, and we collaborated with Digital Research, IBM, and Microsoft about what an OS should have and how it should work, which was amazingly Unix-like since many of us had a PDP sitting around a lab somewhere at work. We argued endlessly about operating systems and user interfaces. We were disappointed that Windows didn’t even approach the functionality of DR’s GEM until Windows 3. We cried when IBM dumped OS/2. By the time ComputerLand was gone, Software Central absorbed, and Egghead closed its doors, we had stopped paying more for our computers than our home mortgages and started buying off the shelf in regular retail shops and chuckling at the thought of being able to buy software at the grocery store.
Some of us still have attics full of hardware that required hundreds of hours to construct, and others of us only mumble to non-appreciative museum visitors when we see one of our old machines on exhibit. We were the ones who got the bug – the very first computer virus – the need to own computer power. Now we just smile at the twenty-something kid who uses his ever-present iPhone to snap a shot (with time, date and location of course) of one of our relics. But we know that because of what we did in response to the computer bug, he has that wonderful piece of technology and human engineering along with the tablet in his backpack and an aging desktop at home he’s thinking about giving to charity – hey, maybe the museum could use a six year old iMac, the niece of the very short lived but oh-so-important Lisa.
Notice that Bob’s book is not very much a history of the industry or its trends or its hardware, but stories of the human personalities and quirks of its people. This is what impressed me most when I first read it years ago: Bill Gates was an actual human being, so was Steve Jobs, so were the others.
And they mostly had no real idea of the implications of what they were creating, mostly ad hoc and making up the rules–then breaking them–as they went.
No computer-history book before or since has matched the gossip, charm, and astonishment of Accidental Empires. Unfortunately so: it would be great if others had followed in Bob’s path.
But few others have had his peculiar combination of opportunities–the Cringely rumor column must have been a huge advantage–and very particular, novel insights.
Which is why I’m glad he’s heading a project to update and expand his original.
Oh yes, taking me back. Thank you! Programming my 6502 homebuilt that had 512 bytes of RAM, with the top 256 bytes mapped to the screen. Writing to byte 511 scrolled all of the screen’s 256 bytes, crashing my ‘invaders’ program.
Really, Bob, you rock!
It’s interesting that you touched on the ARM processor in your comments, Bob. I think any update of Accidental Empires must include the computing history of the UK, because that’s how the ARM processor was born. Acorn, one of the main UK computer companies at the time, decided that other microprocessors weren’t powerful enough for their next computer design, so they opted to design their own processor. They believed that a RISC instruction-set micro could offer sufficient power for a desktop computer, and at the time launched the Archimedes line of computers, which included a full 32-bit WIMP OS. This whole system was in many ways much more advanced than any Intel-based Windows system of the time.
But the power and force of Intel/Microsoft/Windows was too much, so they looked for other uses for the ARM processor, and embedded systems was where it found its niche. Even that nearly never happened. Yes, Apple used one in their Newton PDA, but it was Nokia, to whom they pitched the micro, that really changed things. Nokia, however, didn’t like the 32-bit instructions, which took up too much space. On the flight back from a meeting with Nokia the engineer came up with the idea of compressing 32-bit instructions down to 16, thereby reducing the memory footprint and data paths. They called it the Thumb instruction set, and ARM became the standard for a very high-performance, low-power CPU.
What I’d suggest is two things. Firstly, the history of Sinclair and Acorn. Sinclair was a cheap alternative to the rather expensive Acorns (a good warning about the future of Apple, perhaps?), and a fun way in would be to watch Micro Men:
http://uk.imdb.com/title/tt1459467/?ref_=sr_1
Secondly, just the ARM micro by itself. Remembering that it’s at the core of many tablets today, and many more mobile phones, smart or otherwise:
https://www.ot1.com/arm/armchap1.html
Too bad it isn’t available on Kindle….would love to read it tonight!
Sadly, I have not been able to read the book yet. But I was a faithful follower of the columns.
“Grew up” getting my first TRS-80 (please don’t call them Trash-80!) Model 1, a 9-pin dot-matrix printer, and dual cassette decks (imagine!) when married and in my 20s. Prior to that it was all punch cards and room-size mainframes.
I remember we were one of the first to get cable TV because the unshielded Model 1 would blow out channels 2-5, at least on antenna-driven TVs.
Joined, and eventually presided over, a TRS-80 users’ group, which had a friendly rivalry with the local Apple II club. We actually collaborated on several local computer “fairs”, like our own mini PC Expos.
We sponsored a bulletin board, remember those? It ran TBBS (The Bread Board System), originally on another Model 1, with three 720K dual-sided floppy drives for storage and a whopping 1200 bps Hayes linking it to Ma Bell. Our BBS had the distinction of being visited by Hillary Clinton herself. I chatted with her a good half hour. To prove it was really her, a week later a photo arrived in an envelope with a White House return address.
We grew over time, and when Tandy gave up on their proprietary line of machines, we also migrated to the IBM and clone worlds. Most of the gang were “hardware hackers” and occasional coders, teaching ourselves assembler, and that miserable substitute for it, C.
(Engineers love C, I.T. coders prefer the VB-Java path)
(Oh, please, let’s not devolve into that debate here, OK? This is about history.)
But if you never upgraded your computer by inserting 18-pin DIP memory chips or 287/387 math co-processors, added or lusted for an AST 6-pack, struggled with extended versus expanded memory and the original screwy Intel-inflicted 640K memory “barrier”, well, you missed out on a heckuva ride.
May our far-pointers rest in peace. 😉
Ah, DIPs.. I remember tripping over a little cardboard box a few years ago.. inside was a foil-covered foam block with eight 4k x 1-bit chips. On the outside was a green customs declaration for US$129!!!
I shudder to think how much my current stable would be worth at those prices… Billions!
Wait, Pammy was real? I retain a clear memory of one of the last columns you wrote for Infoworld where you described Pammy as “the woman of my dreams.” I took that to mean she was an invented character.
PBS/American Experience broadcast a nice “prequel” to this last night on the early days of “Silicon Valley”, concentrating on Fairchild and ending with Intel.
I have fond memories of the good ol’ days. My first computer was a Sinclair ZX80; my next was a Commodore 64 that I used through university as a terminal (much better than waiting for a teletype to free up in the computer room) and, when I got a printer, as a word processor. Great times.
But as Bob alluded to in the comments, now is just as exciting a time. I am thinking that my next computer will be a Raspberry Pi, and the adventure continues….
Yah (or yeah for spelling yuks), I miss those days too. One of my first home-built systems was S-100 based (remember those?) with two full-size hard drives which held 5 MB each! That was more than the CDC 1700 system I was learning on at the time, into which we also entered code (machine code) through the front panel with its lights and switches. Same thing later on a Nova 1200 at Solar Turbines Int. I chose the Z80 over the 8080 or 8002 or Rockwell’s 6502 because it was supposed to be the latest and most powerful CPU around. I loved it, and here I am now on something no one could imagine at the time. And now so many are saying the desktop is dead? No. And someone said all you see now is people walking around (in their own worlds) on mobile devices/social nets/games. This is the future? Sad. Talk to the real world, set down your playtime devices, get on with life around you, and if you’re serious, pick up the mouse/keyboard/display and do some serious programming.
Oh wow – I just sold the Processor Technology Sol-20 I built waaaay back in 1978. I was one of the first nerds in Toronto, Canada to own their own computer, but at least I fit in with all the other nerds at TRACE (Toronto Region of Amateur Computer Enthusiasts). It’s been interesting growing up from the ’50s onward, working with RTL then DTL then TTL and finally CMOS devices morphing into little black LSI devices, while all the time trying to keep up with the latest ‘toys’ coming out. Expensive too. My complete Sol-20 cost about $10,000 in 2012 money! I couldn’t believe the amount of data I could store on that 80K (formatted) NorthStar hard-sectored 5.25 floppy system, and that 5.25 floppy drive only cost $450 – what a bargain. But of course, that was short-lived. I got my very first Shugart 5MB hard drive – an unbelievable storage device back then. Ah, the good ol’ days, eh?
And to the gent near the beginning of this list of comments: remember getting to the point where not only could you hand-assemble µP mnemonics but THINK in them as well? After all, the 8080 instruction set wasn’t that difficult to memorize…….
Like other posters here, I too have the book and TV series on tape (on one of the many dead formats – Video8).
I really hope you get the story of ARM RISC into the new book, along with the many non-PC manufacturers that fell by the wayside during the ’80s and ’90s.
I was fortunate enough not to encounter the minis and mainframes. But I got my first Sinclair ZX81 that year. Amazing price, but too quirky to survive. Ditto the rest of Sinclair’s computer products. It’s also hard to fathom why a ZX80 will fetch up to $400 on eBay these days. Go figure.
If you do another TV series, wax your crack! (re: the hot tub scenes) LOL
Ooh!.. I almost forgot to mention the demise of Digital Equipment Corp and the post-Gatesian Microsoft, which has gone from perceived leader (in the public eye) to follower (chaser, methinks).
Some of the newer developments in Cloud facilities. Plenty of mileage in evaporation puns.
The Future: genetic computation.
Sure, the old days of TTL/SSI/MSI/VLSI/CMOS/FPGA, etc. And CRTs with a Z80 and an 8″ 1.43 KB floppy (sorry – Trash-80). I wouldn’t have chosen to work on that thing, and was surprised when the very prominent PI (principal investigator) and UC Regents board member I was working for at UCSD loaned me out to two very high-tech brain researchers at the Salk Institute and the VA next door to write TRS-80 programs for their research. And to amuse you some more, I built my own 16-bit registers at home out of mercury relays that interfaced to my system at the time via a parallel port. They were part of two standards at the time (RS-232 and the IEEE-488? parallel port), all built with TTL logic (registers/shifters/barrel shifters, buffers, clocks, …) that eventually came out as SSI/MSI chips known as UARTs. You should have heard those relays a-buzzing when I loaded all 1’s. That was about the same time I joined up to one of, if not the first, major networks (DARPA) via a reeeel fast modem of around a couple of Mb/s! Just read an article describing one of the first telephone providers using all relays to do the switching! I was also a little mad scientist, but that’s another story. Another day; let’s say thanks to all that and look forward to tomorrow.
Once more, does anybody remember when Bill Gates was just a Harvard dropout who bought an OS called QDOS (the Quick and Dirty Operating System) from Seattle Computer Products, remolded and reshaped it, and sold it to IBM, where it later became known as PC-DOS? Microsoft went on to license it to everyone else as MS-DOS. He never was a tech person; he was a used car salesman from Harvard. Now we are all at his feet (or someone else’s at MS) for future OSes, languages, and platforms. I remember one of my favorites from the early ’70s was CP/M, along with Concurrent CP/M and MP/M, and the best Pascal compiler, just up the coast, was Pascal MT+ from MT MicroSYSTEMS of Solana Beach, California!
I too loved CP/M. My first machine that had it was a Morrow MD-3. I learned so much using that computer.
You left out the part where QDOS was basically reverse-engineered from CP/M in the first place. =)
Which parts of the book do you intend to update?
Seems that the stuff going on in Europe doesn’t matter.
Apple went shopping for a power-efficient processor for the Newton PDA and couldn’t find one ‘in their own back yard’, so to speak. Everyone who owns a mobile phone or Android device can thank Apple for investing in an ailing RISC processor design here in the UK.
The good old days are back.
Raspberry Pi
Yay!
Back to RiscOS – My old RiscPC died a couple of years back.
Hi Bob, I think it is awesome you are writing this new book. I haven’t read your old book, however I have been ‘in touch’ with computers since 1980. The first computer they allowed me to touch was a PDP-11 in college with 16KB of memory allocated over 4 terminals. We wrote BASIC programs in 4KB and soon figured out (after ‘discovering’ the sys admin password) how to obtain privileged access. My buddy and I wrote an alternative BASIC interpreter which we appropriately called ‘logoff’. Of course our interpreter didn’t execute anything but fooled the user with weird responses. Things like ‘bring your money to the bank’ as response to the ‘SAVE’ command. Good times!
I sincerely hope you cover the impact DEC had with the PDP-11 and later the VAX-11. It shaped my career big time.
I still have, and it still works, my Intel 8008 system that came to life in Feb 1972.
If you want pictures, just let me know.
This plus examples of just about all of the µPs that followed until about ’85.
It was certainly a wild ride.
Is this a new era or a repeat of the golden era? Smartphones and tablets are part of a pattern. Like the PCs before them, they represent a simplification of computing and an increase in the power users can get out of them, just like personal computing through to the mid-noughties.
Back in the mid-90s each new processor or graphics card brought a discernible difference in performance, something that now only happens on smartphones and tablets.
Mark Twain is credited with saying that history doesn’t repeat itself, it rhymes. I guess that makes the mobile era a haiku tribute to the era described in your book. What is less clear is where it will all go; the only thing for sure is that it will be an interesting ride.
According to Wikipedia (http://en.wikipedia.org/wiki/Robert_X._Cringely)
“Evidently references to an earlier wife, Pammy, were a hoax”
One of the best books I have ever read on the history surrounding the shaping of Silicon Valley, not to mention the computer technology industry as a whole, an era that I am privileged to have witnessed and am still involved in.
Robert’s book is one you cannot put down and never forget. I am so pleased that this new updated version is out, as the first was amazing.
This book will never date and will always be part of my top bookshelf reading. Robert, you are a talented storyteller and writer, and I am selfishly pleased you were ‘guided’ away from becoming a billionaire!
They’re talking about your Accidental Empires movie on Techdirt this week:
https://www.techdirt.com/blog/innovation/articles/20130409/09212322633/documentary-history-apple-microsoft-show-it-was-all-about-copying-not-patents.shtml#c1494
Read this 10 years ago before the updated version was available. Going to read it again now, great story made more powerful by the fact that it’s true.
Lent this book to my mate and he never gave it back!!!! Wasn’t happy. We did spend a lot of time talking about it, though.
Classic, even if you’re not into computers.
Bob, I wished I could tell you how much your book changed my life. I still have the original copy. The pages are yellow, and the glue on the binding has crystallized, and when I open it, I have to be careful the pages don’t fall out.
I wrote to Bob sometime after I had read his book, and later purchased the DVD set. I had started my small computer company in 1978, grew it into a thriving business by 1985, and watched it slowly disintegrate beginning in 1987. From 1987 until today, I gradually converted my business into a cloud computing company. Always believing in Bob’s predictions, I never gave up. I saw Bob’s vision of the future. No one could have seen how far into the future it would be.
Today we are seeing Bob’s vision become reality. Bob’s book is the only book I know of that presents an accurate timeline of the unfolding of the personal computer industry, and as we watch its final days, I never once considered that his predictions would eventually be disproven. It’s probably the best “I told you so” in the history of literature.
Accidental Empires is easy to read. It’s a collection of hundreds of little stories that are organized in a way that makes perfect sense; even more sense when you realize it’s all true.
It’s a shame that great progress only comes from pain, exasperation, and frustration. Bob never gave up, and neither have I.
Thanks, Bob, for being an important part of my life for the past twenty-five years.