CHAPTER SIX
CHAIRMAN BILL LEADS THE
HAPPY WORKERS IN SONG
William H. Gates III stood in the checkout line at an all-night convenience store near his home in the Laurelhurst section of Seattle. It was about midnight, and he was holding a carton of butter pecan ice cream. The line inched forward, and eventually it was his turn to pay. He put some money on the counter, along with the ice cream, and then began to search his pockets.
“I’ve got a 50-cents-off coupon here somewhere,” he said, giving up on his pants pockets and moving up to search the pockets of his plaid shirt.
The clerk waited, the ice cream melted, the other customers, standing in line with their root beer Slurpies and six-packs of beer, fumed as Gates searched in vain for the coupon.
“Here,” said the next shopper in line, throwing down two quarters.
Gates took the money.
“Pay me back when you earn your first million,” the 7-11 philanthropist called as Gates and his ice cream faded into the night.
The shoppers just shook their heads. They all knew it was Bill Gates, who on that night in 1990 was approximately a three billion dollar man.
I figure there’s some real information in this story of Bill Gates and the ice cream. He took the money. What kind of person is this? What kind of person wouldn’t dig out his own 50 cents and pay for the ice cream? A person who didn’t have the money? Bill Gates has the money. A starving person? Bill Gates has never starved. Some paranoid schizophrenics would have taken the money (some wouldn’t, too), but I’ve heard no claims that Bill Gates is mentally ill. And a kid might take the money—some bright but poorly socialized kid under, say, the age of 9.
Bingo.
My mother lives in Bentonville, Arkansas, a little town in the northwest part of the state, hard near the four corners of Arkansas, Kansas, Missouri, and Oklahoma. Bentonville holds the headquarters of Wal-Mart stores and is the home of Sam Walton, who founded Wal-Mart. Why we care about this is because Sam Walton is maybe the only person in America who could just write a check and buy out Bill Gates and because my mother keeps running into Sam Walton in the bank.
Sam Walton will act as our control billionaire in this study.
Sam Walton started poor, running a Ben Franklin store in Newport, Arkansas, just after the war. He still drives a pickup truck today and has made his money selling one post hole digger, one fifty-pound bag of dog food, one cheap polyester shirt at a time, but the fact that he’s worth billions of dollars still gives him a lot in common with Bill Gates. Both are smart businessmen, both are highly competitive, both dominate their industries, both have been fairly careful with their money. But Sam Walton is old, and Bill Gates is young. Sam Walton has bone cancer and looks a little shorter on each visit to the bank, while Bill Gates is pouring money into biotechnology companies, looking for eternal youth. Sam Walton has promised his fortune to support education in Arkansas, and Bill Gates’s representatives tell fund raisers from Seattle charities that their boss is still, “too young to be a pillar of his community.”
They’re right. He is too young.
Our fifteen-minutes-of-fame culture makes us all too quickly pin labels of good or bad on public figures. Books like this one paint their major characters in black or white, and sometimes in red. It’s hard to make such generalizations, though, about Bill Gates, who is not really a bad person. In many ways he’s not a particularly good person either. What he is is a young person, and that was originally by coincidence, but now it’s by design. At 36, Gates has gone from being the youngest person to be a self-made billionaire to being the self-made billionaire who acts the youngest.
Spend a late afternoon sitting at any shopping mall. Better still, spend a day at a suburban high school. Watch the white kids and listen to what they say. It’s a shallow world they live in—one that’s dominated by school and popular culture and by yearning for the opposite sex. Saddam Hussein doesn’t matter unless his name is the answer to a question on next period’s social studies quiz. Music matters. Clothes matter, unless deliberately stating that they don’t matter is part of your particular style. Going to the prom matters. And zits—zits matter a lot.
Watch these kids and remember when we were that age and everything was so wonderful and horrible and hormones ruled our lives. It’s another culture they live in—another planet even—one that we help them to create. On the white kids’ planet, all that is supposed to matter is getting good grades, going to the prom, and getting into the right college. There are no taxes; there is no further responsibility. Steal a car, get caught, and your name doesn’t even make it into the newspaper, because you are a juvenile, a citizen of the white kids’ planet, where even grand theft auto is a two-dimensional act.
Pay attention now, because here comes the important part.
William H. Gates III, who is not a bad person, is two-dimensional too. Girls, cars, and intense competition in a technology business are his life. Buying shirts, taking regular showers, getting married and being a father, becoming a pillar of his community, and just plain making an effort to get along with other people if he doesn’t feel like it are not parts of his life. Those parts belong to someone else—to his adult alter ego. Those parts still belong to his father, William H. Gates II.
In the days before Microsoft, back when Gates was a nerdy Harvard freshman and devoting himself to playing high-stakes poker on a more-or-less full-time basis, his nickname was Trey—the gambler’s term for a three of any suit. Trey, as in William H. Gates the Trey. His very identity then, as now, was defined in terms of his father. And remember that a trey, while a low card, still beats a deuce.
Young Bill Gates is incredibly competitive because he has a terrific need to win. Give him an advantage, and he’ll take it. Allow him an advantage, and he’ll still take it. Lend him 50 cents and, well, you know …. Those who think he cheats to win are generally wrong. What’s right is that Gates doesn’t mind winning ungracefully. A win is still a win.
It’s clear that if Bill Gates thinks he can’t win, he won’t play. This was true at Harvard, where he considered a career in mathematics until it became clear that there were better undergraduate mathematicians in Cambridge than Bill Gates. And that was true at home in Seattle, where his father, a successful corporate attorney and local big shot, still sets the standard for parenthood, civic responsibility, and adulthood in general.
“There are aspects of his life he’s defaulting on, like being a father,” said the dad, lobbing a backhand in this battle of generations that will probably be played to the death.
So young Bill, opting out of the adulthood contest for now, has devoted his life to pressing his every advantage in a business where his father has no presence and no particular experience. That’s where the odds are on the son’s side and where he’s created a supportive environment with other people much like himself, an environment that allows him to play the stern daddy role and where he will never ever have to grow old.
Bill Gates’s first programming experience came in 1968 at Seattle’s posh Lakeside School when the Mothers’ Club bought the school access to a time-sharing system. That summer, 12-year-old Bill and his friend Paul Allen, who was two years older, made $4,200 writing an academic scheduling program for the school. An undocumented feature of the program made sure the two boys shared classes with the prettiest girls. Later computing adventures for the two included simulating the northwest power grid for the Bonneville Power Administration, which did not know at the time that it was dealing with teenagers, and developing a traffic logging system for the city of Bellevue, Washington.
“Mom, tell them how it worked before,” whined young Bill, seeking his mother’s support in front of prospective clients for Traf-O-Data after the program bombed during an early sales demonstration.
By his senior year in high school, Gates was employed full time as a programmer for TRW—the only time he has ever had a boss.
Here’s the snapshot view of Bill Gates’s private life. He lives in a big house in Laurelhurst, with an even bigger house under construction nearby. The most important woman in his life is his mother, Mary, a gregarious Junior League type who helps run her son’s life through yellow Post-it notes left throughout his home. Like a younger Hugh Hefner, or perhaps like an emperor of China trapped within the Forbidden City, Gates is not even held responsible for his own personal appearance. When Chairman Bill appears in public with unwashed hair and unkempt clothing, his keepers in Microsoft corporate PR know that they, not Bill, will soon be getting a complaining call from the ever-watchful Mary Gates.
The second most important woman in Bill Gates’s life is probably his housekeeper, with whom he communicates mainly through a personal graphical user interface—a large white board that sits in Gates’s bedroom. Through check boxes, fill in the blanks, and various icons, Bill can communicate his need for dinner at 8 or for a new pair of socks (brown), all without having to speak or be seen.
Gates comes from the clothes-are-not-important school of fashion, so all of his clothes are purchased by his mother or his housekeeper.
“He really should have his colors done,” one of the women of Microsoft said to me as we watched Chairman Bill make a presentation in his favorite tan suit and green tie.
Do us all a favor, Bill; ditch the tan suit.
The third most important woman in Bill Gates’s life is the designated girlfriend. She has a name and a face that changes regularly, because nobody can get too close to Bill, who simply will not marry as long as his parents live. No, he didn’t say that. I did.
Most of Gates’s energy is saved for the Boys’ Club—212 acres of forested office park in Redmond, Washington, where 10,000 workers wait to do his bidding. Everything there, too, is Bill-centric, there is little or no adult supervision, and the soft drinks are free.
**********
Bill Gates is the Henry Ford of the personal computer industry. He is the father, the grandfather, the uncle, and the godfather of the PC, present at the microcomputer’s birth and determined to be there at its end. Just ask him. Bill Gates is the only head honcho I have met in this business who isn’t angry, and that’s not because he’s any weirder than the others—each is weird in his own way—but because he is the only head honcho who is not in a hurry. The others are all trying like hell to get somewhere else before the market changes and their roofs fall in, while Gates is happy right where he is.
Gates and Ford are similar types. Technically gifted, self-centered, and eccentric, they were both slightly ahead of their times and took advantage of that fact. Ford was working on standardization, mass production, and interchangeable parts back when most car buyers were still wealthy enthusiasts, roads were unpaved, and automobiles were generally built by hand. Gates was vowing to put “a computer on every desk and in every home running Microsoft software” when there were fewer than a hundred microcomputers in the world. Each man consciously worked to create an industry out of something that sure looked like a hobby to everyone else.
A list of Ford’s competitors from 1908, when he began mass producing cars at the River Rouge plant, would hold very few names that are still in the car business today. Cadillac, Oldsmobile—that’s about it. Nearly every other Ford competitor from those days is gone and forgotten. The same can be said for a list of Microsoft competitors from 1975. None of those companies still exists.
Looking through the premier issue of my own rag, InfoWorld, I found nineteen advertisers in that 1979 edition, which was then known as the Intelligent Machines Journal. Of those nineteen advertisers, seventeen are no longer in business. Other than Microsoft, the only survivor is the MicroDoctor—one guy in Palo Alto who has been repairing computers in the same storefront on El Camino Real since 1978. Believe me, the MicroDoctor, who at this point describes his career as a preferable alternative to living under a bridge somewhere, has never appeared on anyone’s list of Microsoft competitors.
So why are Ford and Microsoft still around when their contemporaries are nearly all gone? Part of the answer has to do with the inevitably high failure rate of companies in new industries; hundreds of small automobile companies were born and died in the first twenty years of this century, and hundreds of small aircraft companies climbed and then power dived in the second twenty years. But an element not to be discounted in this industrial Darwinism is sheer determination. Both Gates and Ford were determined to be long-term factors in their industries. Their objective was to be around fifteen or fifty years later, still calling the shots and running the companies they had started. Most of their competitors just wanted to make money. Both Ford and Gates also worked hard to maintain total control over their operations, which meant waiting as long as possible before selling shares to the public. Ford Motor Co. didn’t go public until nearly a decade after Henry Ford’s death.
Talk to a hundred personal computer entrepreneurs, and ninety-nine of them won’t be able to predict what they will be doing for a living five years from now. This is not because they expect to fail in their current ventures but because they expect to get bored and move on. Nearly every high-tech enterprise is built on the idea of working like crazy for three to five years and then selling out for a vast amount of money. Nobody worries about how the pension plan stacks up because nobody expects to be around to collect a pension. Nobody loses sleep over whether their current business will be a factor in the market ten or twenty years from now—nobody, that is, except Bill Gates, who clearly intends to be as successful in the next century as he is in this one and without having to change jobs to do it.
At 19, Bill Gates saw his life’s work laid out before him. Bill, the self-proclaimed god of software, said in 1975 that there will be a Microsoft and that it will exist for all eternity, selling sorta okay software to the masses until the end of time. Actually, the sorta okay part came along later, and I am sure that Bill intended always for Microsoft’s products to be the best in their fields. But then Ford intended his cars to be best, but he settled, instead, for just making them the most popular. Gates, too, has had to make some compromises to meet his longevity goals for Microsoft.
Both Ford and Gates surrounded themselves with yes-men and -women, whose allegiance is to the leader rather than to the business. Bad idea. It reached the point at Ford where one suddenly out-of-favor executive learned that he was fired when he found his desk had been hacked to pieces with an ax. It’s not like that at Microsoft yet, but emotions do run high, and Chairman Bill is still young.
As Ford did, Gates typically refuses to listen to negative opinions and dismisses negative people from his mind. There is little room for constructive criticism. The need is so great at Microsoft for news to be good that warning signs are ignored and major problems are often overlooked until it is too late. Planning to enter the PC database market, for example, Microsoft spent millions on a project code-named Omega, which came within a few weeks of shipping in 1990, even though the product didn’t come close to doing what it was supposed to do.
The program manager for Omega, who was so intent on successfully bringing together his enormous project, reported only good news to his superiors when, in fact, there were serious problems with the software. It would have been like introducing a new car that didn’t have brakes or a reverse gear. Cruising toward a major marketplace embarrassment, Microsoft was saved only through the efforts of brave souls who presented Mike Maples, head of Microsoft’s applications division, with a list of promised Omega features that didn’t exist. Maples invited the program manager to demonstrate his product, then asked him to demonstrate each of the non-features. The Omega introduction was cancelled that afternoon.
From the beginning, Bill Gates knew that microcomputers would be big business and that it was his destiny to stand at the center of this growing industry. Software, much more than hardware, was the key to making microcomputers a success, and Gates knew it. He imagined that someday there would be millions of computers on desks and in homes, and he saw Microsoft playing the central role in making this future a reality. His goal for Microsoft in those days was a simple one: monopoly.
“We want to monopolize the software business,” Gates said time and again in the late 1970s. He tried to say it in the 1980s too, but by then Microsoft had public relations people and antitrust lawyers in place to tell their young leader that the M word was not on the approved corporate vocabulary list. But it’s what he meant. Bill Gates had supreme confidence that he knew better than anyone else how software ought to be developed and that his standards would become the de facto standards for the fledgling industry. He could imagine a world in which users would buy personal computers that used Microsoft operating systems, Microsoft languages, and Microsoft applications. In fact, it was difficult, even painful, for Gates to imagine a world organized any other way. He’s a very stubborn guy about such things, to the point of annoyance.
The only problem with this grand vision of future computing—with Bill Gates setting all the standards, making all the decisions, and monopolizing all the random-access memory in the world—was that one person alone couldn’t do it. He needed help. In the first few years at Microsoft, when the company had fewer than fifty employees and everyone took turns at the switchboard for fifteen minutes each day, Gates could impose his will by reading all the computer code written by the other programmers and making changes. In fact, he rewrote nearly everything, which bugged the hell out of programmers when they had done perfectly fine work only to have it be rewritten (and not necessarily improved) by peripatetic Bill Gates. As Microsoft grew, though, it became obvious that reading every line and rewriting every other wasn’t a feasible way to continue. Gates needed to find an instrument, a method of governing his creation.
Henry Ford had been able to rule his industrial empire through the instrument of the assembly line. The assembly-line worker was a machine that ate lunch and went home each night to sleep in a bed. On the assembly line, workers had no choice about what they did or how they did it; each acted as a mute extension of Ford’s will. No Model T would go out with four headlights instead of two, and none would be painted a color other than black because two headlights and black paint were what Mr. Ford specified for the cars coming off his assembly line. Bill Gates wanted an assembly line, too, but such a thing had never before been applied to the writing of software.
Writing software is just that—writing. And writing doesn’t work very well on an assembly line. Novels written by committee are usually not good novels, and computer programs written by large groups usually aren’t very good either. Gates wanted to create an enormous enterprise that would supply most of the world’s microcomputer software, but to do so he had to find a way to impose his vision, his standards, on what he expected would become thousands of programmers writing millions of lines of code—more than he could ever personally read.
Good programmers don’t usually make good business leaders. Programmers are typically introverted, have awkward social skills, and often aren’t very good about paying their own bills, much less fighting to close deals and get customers to pay up. This ability to be so good at one thing and so bad at another stems mainly, I think, from the fact that programming is an individual sport, where the best work is done, more often than not, just to prove that it can be done rather than to meet any corporate goal.
Each programmer wants to be the best in his crowd, even if that means wanting the others to be not quite so good. This trend, added to the hated burden of meetings and having to care about things like group objectives, morale, and organizational minutiae, can put those bosses who still think of themselves primarily as programmers at odds with the very employees on whom they rely for the overall success of the company. Bill Gates is this way, and his bitter rivalry with nearly every other sentient being on the planet could have been his undoing.
To realize his dream, Gates had to create a corporate structure at Microsoft that would allow him to be both industry titan and top programmer. He had to invent a system that would satisfy his own adolescent need to dominate and his adult need to inspire. How did he do it?
Mind control.
The instrument that allowed Microsoft to grow yet remain under the creative thumb of Bill Gates walked in the door one day in 1979. The instrument’s name was Charles Simonyi.
Unlike most American computer nerds, Charles Simonyi was raised in an intellectually supportive environment that encouraged both thinking and expression. The typical American nerd was a smart kid turned inward, concentrating on science and technology because it was more reliable than the world of adult reality. The nerds withdrew into their own society, which logically excluded their parents, except as chauffeurs and financiers. Bill Gates was the son of a big-shot Seattle lawyer who didn’t understand his kid. But Charles Simonyi grew up in Hungary during the 1950s, the son of an electrical engineering professor who saw problem solving as an integral part of growing up. And problem solving is what computer programming is all about.
In contrast to the parents of most American computer nerds, who usually had little to offer their too-smart sons and daughters, the elder Simonyi managed to play an important role in his son’s intellectual development, qualifying, I suppose, for the Ward Cleaver Award for Quantitative Fathering.
“My father’s rule was to imagine that you have the solution already,” Simonyi remembered. “It is a great way to solve problems. I’d ask him a question: How many horses does it take to do something? And he’d answer right away, ‘Five horses; can you tell me if I am right or wrong?’ By the time I’d figured out that it couldn’t be five, he’d say, ‘Well if it’s not five, then it must be X. Can you solve for that?’ And I could, because the problem was already laid out from the test of whether five horses was correct. Doing it backward removed the anxiety from the answer. The anxiety, of course, is the fear that the problem can’t be solved—at least not by me.”
With the help of his father, Simonyi became Hungary’s first teenage computer hacker. That’s hacker in the old sense of being a good programmer who has a positive emotional relationship with the machine he is programming. The new sense of hacker—the Time and Newsweek versions of hackers as technopunks and cyberbandits, tromping through computer systems wearing hobnail boots, leaving footprints, or worse, preying on the innocent data of others—those hackers aren’t real hackers at all, at least not to me. Go read another book for stories about those people.
Charles Simonyi was a hacker in the purest sense: he slept with his computer. Simonyi’s father helped him get a job as a night watchman when he was 16 years old, guarding the Russian-built Ural II computer at the university. The Ural II had 2,000 vacuum tubes, at least one of which would overheat and burn out each time the computer was turned on. This meant that the first hour of each day was spent finding that burned-out vacuum tube and replacing it. The best way to avoid vacuum tube failure was to leave the computer running all night, so young Simonyi offered to stay up with the computer, guarding and playing with it. Each night, the teenager was in total control of probably half the computing resources in the entire country.
Not that half the computer resources of Hungary were much in today’s terms. The Ural II had 4,000 bytes of memory and took eighty microseconds to add two numbers together. This performance and amount of memory was comparable to an early Apple II. Of course the Ural II was somewhat bigger than an Apple II, filling an entire room. And it had a very different user interface; rather than a video terminal or a stack of punch cards, it used an input device much like an old mechanical cash register. The zeroes and ones of binary machine language were punched on cash register-like mechanical buttons and then entered as a line of data by smashing the big ENTER key on the right side. Click-click-click-click-click-click-click-click—smash!
Months of smashing that ENTER key during long nights spent learning the innards of the Ural II with its hundreds of blinking lights started Simonyi toward a career in computing. By 1966, he had moved to Denmark and was working as a professional programmer on his first computer with transistors rather than vacuum tubes. The Danish system still had no operating system, though. By 1967, Simonyi was an undergraduate computer science student at the University of California, working on a Control Data supercomputer in Berkeley. Still not yet 20, Simonyi had lived and programmed his way through nearly the entire history of von Neumann-type computing, beginning in the time warp that was Hungary.
By the 1970s, Simonyi was the token skinny Hungarian at Xerox PARC, where his greatest achievement was Bravo, the what-you-see-is-what-you-get word processing software for the Alto workstation.
While PARC was the best place in the world to be doing computer science in those days, its elitism bothered Simonyi, who couldn’t seem to (or didn’t want to) shake his socialist upbringing. Remember that at PARC there were no junior researchers, because Bob Taylor didn’t believe in them. Everyone in Taylor’s lab had to be the best in his field so that the Computer Science Lab could continue to produce its miracles of technology while remaining within Taylor’s arbitrary limit of fifty staffers. Simonyi wanted larger staffs, including junior people, and he wanted to develop products that might reach market in the programmer’s lifetime.
PARC technology was amazing, but its lack of reality was equally amazing. For example, one 1978 project, code-named Adam, was a laser-scanned color copier using very advanced emitter-coupled logic semiconductor technology. The project was technically impossible at the time and is only just becoming possible today, more than twelve years later. Since Moore’s Law says that semiconductor density doubles every eighteen months, this means that Adam was undertaken approximately eight generations before it would have been technically viable, which is rather like proposing to invent the airplane in the late sixteenth century. With all the other computer knowledge that needed to be gathered and explored, why anyone would bother with a project like Adam completely escaped Charles Simonyi, who spent lots of time railing against PARC purism and a certain amount of time trying to circumvent it.
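To make the arithmetic explicit (this is my own back-of-the-envelope sketch, not anything from PARC’s records): twelve years of catching up, at one density doubling every eighteen months, works out to eight doublings, so the chips Adam needed were on the order of two to the eighth power, roughly 256 times, denser than anything 1978 could supply.

```python
# Back-of-the-envelope Moore's Law arithmetic for the Adam example (illustrative only).
years_early = 12          # Adam preceded feasibility by roughly twelve years
doubling_years = 1.5      # Moore's Law doubling period: eighteen months

generations = years_early / doubling_years   # density doublings the project was missing
density_gap = 2 ** generations               # how far ahead of its time Adam was

print(f"{generations:.0f} generations early, about {density_gap:.0f}x short on density")
# prints: 8 generations early, about 256x short on density
```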
This was the case with Bravo. The Alto computer, with its beautiful bit-mapped white-on-black screen, needed software, but there were no extra PARC brains to spare to write programs for it. Money wasn’t a problem, but manpower was; it was almost impossible to hire additional people at the Computer Science Laboratory because of the arduous hiring gauntlet and Taylor’s reluctance to manage extra heads. When heads were added, they were nearly always Ph.D.s, and the problem with Ph.D.s is that they are headstrong; they won’t do what you tell them to. At least they wouldn’t do what Charles Simonyi told them to do. Simonyi did not have a Ph.D.
Simonyi came up with a scam. He proposed a research project to study programmer productivity and how to increase it. In the course of the study, test subjects would be paid to write software under Simonyi’s supervision. The test subjects would be Stanford computer science students. The software they would write was Bravo, Simonyi’s proposed editor for the Alto. By calling them research subjects rather than programmers, he was able to bring some worker bees into PARC.
The Bravo experiment was a complete success, and the word processing program was one of the first examples of software that presented document images on-screen that were identical to the eventual printed output. Beyond Bravo, the scam even provided data for Simonyi’s own dissertation, plunking him right into the ranks of the PARC unmanageable. His 1976 paper was titled “Meta-Programming: A Software Production Method.”
Simonyi’s dissertation was an attempt to describe a more efficient method of organizing programmers to write software. Since software development will always expand to fill all available time (it does not matter how much time is allotted—software is never early), his paper dealt with how to get more work done in the limited time that is typically available. Looking back at his Bravo experience, Simonyi concluded that simply adding more programmers to the team was not the correct method for meeting a rapidly approaching deadline. Adding more programmers just increased the amount of communication overhead needed to keep the many programmers all working in the same direction. This additional overhead was nearly always enough to absorb any extra manpower, so adding more heads to a project just meant that more money was being spent to reach the same objective at the same time as would have the original, smaller, group. The trick to improving programming productivity was making better use of the programmers already in place rather than adding more programmers. Simonyi’s method of doing this was to create the position of metaprogrammer.
The metaprogrammer was the designer, decision maker, and communication controller in a software development group. As the metaprogrammer on Bravo, Simonyi mapped out the basic design for the editor, deciding what it would look like to the user and what would be the underlying code structure. But he did not write any actual computer code; Simonyi prepared a document that described Bravo in enough detail that his “research subjects” could write the code that brought each feature to life on-screen.
Once the overall program was designed, the metaprogrammer’s job switched to handling communication in the programming group and making decisions. The metaprogrammer was like a general contractor, coordinating all the subcontractor programmers, telling them what to do, evaluating their work in progress, and making any required decisions. Individual programmers were allowed to make no design decisions about the project. All they did was write the code as described by the metaprogrammer, who made all the decisions and made them just as fast as he could, because Simonyi calculated that it was more important for decisions to be made quickly in such a situation than that they be made well. As long as at least 85 percent of the metaprogrammer’s interim decisions were ultimately correct (a percentage Simonyi felt confident that he, at least, could reach more or less on the basis of instinct), there was more to be lost than gained by thoughtful deliberation.
The metaprogrammer also coordinated communication among the individual programmers. Like a telephone operator, the metaprogrammer was at the center of all interprogrammer communication. A programmer with a problem or a question would take it to the metaprogrammer, who could come up with an answer or transfer the question or problem to another programmer who the metaprogrammer felt might have the answer. The alternative was to allow free discussion of the problem, which might involve many programmers working in parallel on the problem, using up too much of the group’s time.
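The arithmetic behind those two claims is worth spelling out (my illustration, not anything from Simonyi’s dissertation): if every programmer can take a problem to every other programmer, the number of possible conversations grows with the square of the team size, while funneling every question through one metaprogrammer keeps it proportional to the team size.

```python
# Illustrative only: communication channels on a team where anyone may talk to
# anyone, versus a hub-and-spoke team where every question goes through a
# single metaprogrammer. The team sizes are arbitrary examples.

def mesh_channels(n: int) -> int:
    """Possible programmer-to-programmer channels when discussion is free."""
    return n * (n - 1) // 2

def hub_channels(n: int) -> int:
    """Channels when each programmer talks only to the metaprogrammer."""
    return n

for team in (5, 10, 20, 50):
    print(f"{team:3d} programmers: {mesh_channels(team):4d} mesh channels, "
          f"{hub_channels(team):2d} hub channels")
# At 50 programmers the free-for-all allows 1,225 channels; the hub needs just 50.
```

Every one of those hub channels terminates at a single person, of course, which hints at the bottleneck described later in this chapter.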
By centralizing design, decision making, and communication in a single metaprogrammer, Simonyi felt that software could be developed more efficiently and faster. The key to the plan’s success, of course, was finding a class of obedient programmers who would not contest the metaprogrammer’s decisions.
The irony in this metaprogrammer concept is that Simonyi, who bitched and moaned so much about the elitism of Xerox PARC, had, in his dissertation, built a vastly more rigid structure that replaced elitism with authoritarianism.
In the fluid structure of Taylor’s lab at PARC, only the elite could survive the demanding intellectual environment. In order to bring junior people into the development organization, Simonyi promoted an elite of one—the metaprogrammer. Both Taylor’s organization at CSL and Simonyi’s metaprogrammer system had hub and spoke structures, though at CSL, most decision making was distributed to the research groups themselves, which is what made it even possible for Simonyi to perpetrate the scam that produced Bravo. In Simonyi’s system, only the metaprogrammer had the power to decide.
Simonyi, the Hungarian, instinctively chose to emulate the planned economy of his native country in his idealized software development team. Metaprogramming was collective farming of software. But like collective farming, it didn’t work very well.
By 1979, the glamor of Xerox PARC had begun to fade for Simonyi. “For a long while I believed the value we created at PARC was so great, it was worth the losses,” he said. “But in fact, the ideas were good, but the work could be recreated. So PARC was not unique.
“They had no sense of business at all. I remember a PARC lunch when a director (this was after the oil shock) argued that oil has no price elasticity. I thought, ‘What am I doing working here with this Bozo?’ ”
Many of the more entrepreneurial PARC techno-gods had already left to start or join other ventures. One of the first to go was Bob Metcalfe, the Ethernet guy, who left to become a consultant and then started his own networking company to exploit the potential of Ethernet that he thought was being ignored by Xerox. Planning his own break for the outside world with its bigger bucks and intellectual homogeneity, Simonyi asked Metcalfe whom he should approach about a job in industry. Metcalfe produced a list of ten names, with Bill Gates at the top. Simonyi never got around to calling the other nine.
When Simonyi moved north from California to join Microsoft in 1979, he brought with him two treasures for Bill Gates. First was his experience in developing software applications. There are four types of software in the microcomputer business: operating systems like Gary Kildall’s CP/M, programming languages like Bill Gates’s BASIC, applications like VisiCalc, and utilities, which are little programs that add extra functions to the other categories. Gates knew a lot about languages, thought he knew a lot about operating systems, had no interest in utilities, but knew very little about applications and admitted it.
The success of VisiCalc, which was just taking off when Simonyi came to Microsoft, showed Gates that application software—spreadsheets, word processors, databases and such—was one of the categories he would have to dominate in order to achieve his lofty goals for Microsoft. And Simonyi, who was seven years older, maybe smarter, and coming straight from PARC—Valhalla itself—brought with him just the expertise that Gates would need to start an applications division at Microsoft. They quickly made a list of products to develop, including a spreadsheet, word processor, database, and a long-since-forgotten car navigation system.
The other treasure that Simonyi brought to Microsoft was his dissertation. Unlike PARC, Microsoft didn’t have any Ph.D.s before Simonyi signed on, so Gates did as much research on the Hungarian as he could, which included having a look at the thesis. Reading through the paper, Gates saw in Simonyi’s metaprogrammer just the instrument he needed to rule a vastly larger Microsoft with as much authority as he then ruled the company in 1979, when it had around fifty employees.
The term metaprogrammer was never used. Gates called it the “software factory,” but what he and Simonyi implemented at Microsoft was a hierarchy of metaprogrammers. Unlike Simonyi’s original vision, Gates’s implementation used several levels of metaprogrammers, which allowed a much larger organization.
Gates was the central metaprogrammer. He made the rules, set the tone, controlled the communications, and made all the technical decisions for the whole company. He surrounded himself with a group of technical leaders called architects. Simonyi was one of these super-nerds, each of whom was given overall responsibility for an area of software development. Each architect was, in turn, a metaprogrammer, surrounded by program managers, the next lower layer of nerd technical management. The programmers who wrote the actual computer code reported to the program managers, who were acting as metaprogrammers, too.
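A rough sense of why the extra levels mattered (my own span-of-control arithmetic; the span of seven is a made-up figure, not Microsoft’s): one metaprogrammer can directly oversee only a handful of people, but each nested layer multiplies the reach.

```python
# Hypothetical span-of-control arithmetic. The span of 7 is an assumption for
# illustration, not a figure from Microsoft.
span = 7  # direct reports per metaprogrammer (hypothetical)

for levels in (1, 2, 3):
    # total people in the layers below the top metaprogrammer
    total = sum(span ** i for i in range(1, levels + 1))
    print(f"{levels} level(s) below the top: about {total} people")
# 1 level:   7 people  (Simonyi's original, flat scheme)
# 2 levels:  56 people
# 3 levels: 399 people (roughly the architect / program manager / programmer stack)
```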
The beauty of the software factory, from Bill Gates’s perspective, was that every participant looked strictly toward the center, and at that center stood Chairman Bill—a man so determined to be unique in his own organization that Microsoft had more than 500 employees before hiring its second William.
The irony of all this diabolical plotting and planning is that it did not work. It was clear after less than three months that metaprogramming was a failure. Software development, like the writing of books, is an iterative process. You write a program or a part of a program, and it doesn’t work; you improve it, but it still doesn’t work very well; you improve it one more time (or twenty more times), and then maybe it ships to users. With all decisions being made at the top and all information supposedly flowing down from the metaprogrammer to the 22-year-old peon programmers, the reverse flow of information required to make the changes needed for each improved iteration wasn’t planned for. Either the software was never finished, or it was poorly optimized, as was the case with the Xerox Star, the only computer I know of that had its system software developed in this way. The Star was a dog.
The software factory broke down, and Microsoft quickly went back to writing code the same way everyone else did. But the structure of architects and program managers was left in place, with Bill Gates still more or less controlling it all from the center. And since a control structure was all that Chairman Bill had ever really wanted, he at least considered the software factory to be a success.
Through the architects and program managers, Gates was able to control the work of every programmer at Microsoft, but to do so reliably required cheap and obedient labor. Gates set a policy that consciously avoided hiring experienced programmers, specializing, instead, in recent computer science graduates.
Microsoft became a kind of cult. By hiring inexperienced workers and indoctrinating them into a religion that taught the concept that metaprogrammers were better than mere programmers and that Bill Gates, as the metametaprogrammer, was perfect, Microsoft created a system of hero worship that extended Gates’s will into every aspect of the lives of employees he had not even met. It worked for Kim Il Sung in North Korea, and it works in the suburbs east of Seattle too.
Most software companies hire the friends of current employees, but Microsoft hires kids right out of college and relocates them. The company’s appetite for new programming meat is nearly insatiable. One year Microsoft got in trouble with the government of India for hiring nearly every computer science graduate in the country and moving them all to Redmond.
So here are these thousands of neophyte programmers, away from home in their first working situation. All their friends are Microsoft programmers. Bill is a father/folk hero. All they talk about is what Bill said yesterday and what Bill did last week. And since they don’t have much to do except talk about Bill and work, there you find them at 2:00 a.m., writing code between hockey matches in the hallway.
Microsoft programmers work incredibly long hours, most of them unproductive. It’s like a Japanese company where overtime has a symbolic importance and workers stay late, puttering around the office doing little or nothing just because that’s what everyone else does. That’s what Chairman Bill does, or is supposed to do, because the troops rarely even see him. I probably see more of Bill Gates than entry-level programmers do.
At Microsoft it’s a “disadvantage” to be married or “have any other priority but work,” according to a middle manager who was unlucky enough to have her secretly taped words later played in court as evidence in a case claiming that Microsoft discriminates against married employees. She described Microsoft as a company where employees were expected to be single or live a “singles lifestyle,” and said the company wanted employees that “ate, breathed, slept, and drank Microsoft,” and felt it was “the best thing in the world.”
The real wonder in this particular episode is not that Microsoft discriminates against married employees, but that the manager involved was a woman. Women have had a hard time working up through the ranks. Only two women have ever made it to the vice-presidential level—Ida Cole and Jean Richardson. Both were hired away from Apple at a time when Microsoft was coming under federal scrutiny for possible sex discrimination. Richardson lasted a few months in Redmond, while Cole stayed until all her stock options vested, though she was eventually demoted from her job as vice-president.
Like any successful cult, Microsoft runs on sacrifice and penance and the idea that the deity is perfect and his priests are better than you. Each level, from Gates on down, screams at the next, goading and humiliating them. And while you can work any eighty hours per week that you want, dress any way that you like, you can’t talk back in a meeting when your boss says you are shit in front of all your co-workers. It just isn’t done. When Bill Gates says that he could do in a weekend what you’ve failed to do in a week or a month, he’s lying, but you don’t know better and just go back to try harder.
This all works to the advantage of Gates, who gets away with murder until the kids eventually realize that this is not the way the rest of the world works. But by then it is three or four years later, they’ve made their contributions to Microsoft, and are ready to be replaced by another group of kids straight out of school.
My secret suspicion is that Microsoft’s cult of personality hides a deep-down fear on Gates’s part that maybe he doesn’t really know it all. A few times I’ve seen him cornered by some techie who is not from Microsoft and not in awe, a techie who knows more about the subject at hand than Bill Gates ever will. I’ve seen a flash of fear in Gates’s eyes then. Even with you or me, topics can range beyond Bill’s grasp, and that’s when he uses his “I don’t know how technical you are” line. Sometimes this really means that he doesn’t want to talk over your head, but just as often it means that he’s the one who really doesn’t know what he’s talking about and is using this put-down as an excuse for changing the subject. To take this particularly degrading weapon out of his hands forever, I propose that should you ever talk with Bill Gates and hear him say, “I don’t know how technical you are,” reply by saying that you don’t know how technical he is. It will drive him nuts.
The software factory allowed Bill Gates to build and control an enormous software development organization that operates as an extension of himself. The system can produce lots of applications, programming languages, and operating systems on a regular basis and at relatively low cost, but there is a price for this success: the loss of genius. The software factory allows for only a single genius—Bill Gates. But since Bill Gates doesn’t actually write the code in Microsoft’s software, that means that few flashes of genius make their way into the products. They are derivative—successful, but derivative. Gates deals with this problem through a massive force of will, telling himself and the rest of the world that commercial success and technical merit are one in the same. They aren’t. He says that Microsoft, which is a superior marketing company, is also a technical innovator. It isn’t.
The people of Microsoft, too, choose to believe that their products are state of the art. Not to do so would be to dispute Chairman Bill, which just is not done. It’s easier to distort reality.
Charles Simonyi accepts Microsoft mediocrity as an inevitable price paid to create a large organization. “The risk of genius is that the products that result from genius often don’t have much to do with each other,” he explained. “We are trying to build core technologies that can be used in a lot of products. That is more valuable than genius.
“True geniuses are very valuable if they are motivated. That’s how you start a company—around a genius. At our stage of growth, it’s not that valuable. The ability to synthesize, organize, and get people to sign off on an idea or project is what we need now, and those are different skills.”
Simonyi started Microsoft’s applications group in 1979, and the first application was, of course, a spreadsheet. Other applications soon followed as Simonyi and Gates built the development organization they knew would be needed when microcomputing finally hit the big time, and Microsoft would take its position ahead of all its competitors. All they had to do was be ready and wait.
In the software business, as in most manufacturing industries, there are inventive organizations and maintenance organizations. Dan Bricklin, who invented VisiCalc, the first spreadsheet, ran an inventive organization. So did Gary Kildall, who developed CP/M, the first microcomputer operating system. Maintenance organizations are those, like Microsoft, that generally produce derivative products—the second spreadsheet or yet another version of an established programming language. BASIC was, after all, a language that had been placed in the public domain a decade before Bill Gates and Paul Allen decided to write their version for the Altair.
When Gates said, “I want to be the IBM of software,” he consciously wanted to be a monolith. But unconsciously he wanted to emulate IBM, which meant having a reactive strategy, multiple divisions, and poor internal communications.
As inventive organizations grow and mature, they often convert themselves into maintenance organizations, dedicated to doing revisions of formerly inventive products and boring as hell for the original programmers who were used to living on adrenalin rushes and junk food. This transition time, from inventive to maintenance, is a time of crisis for these companies and their founders.
Metaprogrammers, and especially nested hierarchies of metaprogrammers, won’t function in inventive organizations, where the troops are too irreverent and too smart to be controlled. But metaprogrammers work just fine at Microsoft, which has never been an inventive organization and so has never suffered the crisis that accompanies that fall from grace when the inventive nerds discover that it’s only a job.
“Watch the white kids and listen to what they say.”
Sounds racist to me!
I read it as more of a snark, as in, “hmph, all these white kids, like I was once, look at how little they care about things.”
“Gates deals with this problem through a massive force of will, telling himself and the rest of the world that commercial success and technical merit are one in the same.”
According to grammarist.com, this phrase should be “one and the same,” instead of “one in the same.”
http://grammarist.com/usage/one-in-the-same/
You and the grammarist may be technically right, but no one else agrees. Just listen to ‘the world’ say “it”
“One in the same” makes no sense, regardless of the grammarist. Perhaps I’d get your point if I understood the “it” comment.
Wondering where you want to go with the Bill Gates stuff now that he is out of the day-to-day picture at Microsoft. You could leave it as is, but it’s telling only half the story (or less). On the other hand, if you tell the rest of the story, ending with “and eventually Bill Gates got married and now he is the chief philanthropy officer…”, you lose all the edge.
I’m not sure the “edge” is worth keeping. It just sounds like the ramblings of someone jealous of Bill’s success. Yes, it’s interesting, well written, but a little out of date, and way more negative than needed. Kind of reminds me of Cicero’s orations against Catiline.
I’ve had the unique (I think) experience of working in the same company for the past 17 years, which affords me the opportunity to peruse the code I wrote way back when.
I’m always appalled at that young man’s [my] lack of wisdom, or even a whiff of skill. But then I realize “it works”, and resist the urge to refactor my own code.
We all view the world through our own filters – that sometimes says more about us than what we are observing. This passage is no less true for that. I say leave it alone, but provide a bit of mea culpa where it is warranted to bring readers up to date.
Gates’s work in philanthropy has been controversial. His emphasis on accountability is like a breath of fresh air, but the amount of control he demands and the financial investments of the endowment are worrying.
Also, he’s still Chairman of the Board of Microsoft. He just stated in public that the Windows Phone strategy was a mistake because it has not led to monopoly. It’s not outside the realm of possibility that he could storm in to save his company, like Michael Dell.
I think you’ve gone soft in your old age, Ronc. I grew up during the Microsoft Reign of Terror, and it’s important to preserve the sentiment of the time. Bill Gates was not nice, and he’s still not nice. He’s now working on very noble goals, but he’s not nice.
As Steve Jobs said in the PBS documentary version, we’re not jealous of Bill Gates’s success. He earned it, for the most part. We’re just upset with what he has done with it.
“Microsoft Reign of Terror”. My perspective was that of a customer starting in 1996, never a competitor. Microsoft gave us stuff for free with Windows and set badly needed de facto standards. All I had to do was buy a laptop and a printer, add dial-up service from Microsoft, and I was all set with the Win95 start button. I’m still using that printer with Windows 8, and the old laptop is a Win98 webserver.
Many years ago, I bought the Cosmos DVD set. By this time a certain amount of the information was out of date, much like this series. Carl Sagan (still alive) added an update to every episode, which brought the content up to date. Perhaps you should consider something similar. I bought this book when it first came out, but enjoy it now as much as I did then.
I believe that is what is wanted here. Cringely is asking his audience to help fill in more details of the times and help bring it up to date.
Didn’t Cringely say in the introduction to this series that he plans to do exactly that, and that he’s publishing the chapters here to stimulate discussion and contributions from readers who have extra information about the events in question? Or am I misremembering?
Bob still hasn’t told us whether this quote from the book “I’m writing this in 1991” should or should not be changed.
Bob is a writer who is quite capable of incorporating “I’m writing this in 1991” in the context of “I’m writing this in 2013”–what’s to change?
The confusing part has to do with the corrections we call to his attention. If he starts the book with that sentence then we have nothing to add. If he leaves it where it is, as it is, the question becomes what does “this” refer to: the current sentence, paragraph, chapter, or book.
Throughout the 90’s people moaned and complained about Microsoft and how every one of its products had a competitor that offered superior function and features.
But I remember the transition that Motorola made from an Apple environment (they supplied the PowerPC chips, so they used Apple products) to a Microsoft environment, and the increase in productivity, at least in the areas where my teams worked, was immediate and large.
Sure, Microsoft products may have been derivative and lacking a touch of genius, but they all worked, and they all worked together. Bill Gates surely succeeded in emulating Henry Ford in this respect, and Cringely hit the nail exactly on the head with this analogy.
For me it wasn’t so much the features in application space as it was their choice of “standards” – which, coupled with dissembling statements, led me to believe I could actually use their tools for general purpose development across platforms.
J++ was the straw that broke the camel’s back for me; Microsoft’s choice to depart from the Java standard VM implementation made what I learned useless for cross platform development. Write once – run anywhere. Right.
I never believed in Java’s Write Once Run Anywhere. It worked miserably on Macs, and it was a while before it was available on terms that Linux distributions were comfortable with. Then I learned to program in that inconsistent bondage and discipline language, and was further displeased with it.
Now you have US-CERT recommending that people uninstall the Java plugin, and Apple and Mozilla willy-nilly disabling it in their browsers. Other than Minecraft, I’m not aware of any popular consumer program that requires it. Android breaks the entire concept with their Apache-licensed partial clone of Java libraries with proprietary Android API and Dalvik virtual machine.
Java is still important for enterprises doing enterprisey things. It is tremendously important, but it’s not my area of passion.
Their slow adoption of W3C standards has made web-development a true nightmare.
J++ died pretty quickly by comparison.
The reason J++ was nonstandard was that Microsoft wasn’t allowed to write its own Java compiler, since Java belonged to Sun. That was not entirely Microsoft’s fault, but it is what led to them creating C#, which ultimately became a better version of Java. I’d say C# is an example of innovation, actually; it’s a well-designed language and has been kept up to date.
That’s not true. Sun allowed Microsoft, or anyone, to clone Java, but required the clone to pass a validation suite. The idea was to avoid incompatibilities between Java implementations. Remember the “write once, run anywhere” slogan?
Of course, Microsoft’s whole intention with J++ was to introduce incompatibilities: the idea was to pollute the Java ecosystem and either kill it or turn it into something controlled by MS. Sun sued, and won in court. That was why Microsoft dropped J++ and started work on C#.
Hah, and I’m sure that Motorola was the paragon of good management.
It’s far easier to be productive with Microsoft when the entire ecosystem is built around Microsoft. I’m typing this on a computer running Windows 7, not because I like Windows more than anything else, but because I can’t hope to get my display drivers to run correctly with another operating system. Likewise, I’m sure your active directory and proprietary development programs run more smoothly on Windows than Mac.
(Ryan, you haven’t said when you switched to Microsoft, nor what exactly your division did. These are crucial details. In my rather smaller organizations, Macs were much more productive during the 90’s.)
Truth is, some systems work better for some purposes than other systems. My biggest annoyance with Microsoft during the 90’s was how they avoided compatibility with anybody else. You buy a little of Microsoft, and they try to take over your life. So they try to make you use Microsoft for web development and Internet servers, purposes for which Microsoft products are poorly suited now, but for which they were disastrous during the 90’s. And, for example, Microsoft SourceSafe is still anything but safe.
I still keep my old G3 Mac around because I feel more productive using it than machines with an internet connection that keep interrupting me for software updates.
Oh! I just realized my own hypocrisy.
My small organizations were doing mainly desktop publishing and image editing. We had no use for user accounts and directory services (everybody else was just on the other side of the room), and Macs were just much better at color and at being less frustrating.
For example, before USB 2.0, it was much easier to share external drives between Macs with SCSI than between PCs with parallel ports and their buggy device drivers. And SCSI drives were bootable! So handy.
Yep, SCSI was more dependable. But a P.I.T.A. if you forgot to attach the terminating block to the last device. 😀
“As inventive organizations grow and mature, they often convert themselves into maintenance organizations, dedicated to doing revisions of formerly inventive products and boring as hell for the original programmers who were used to living on adrenalin rushes and junk food. This transition time, from inventive to maintenance, is a time of crisis for these companies and their founders.”
Maybe an epitaph for Apple in the coming years?
> Maybe an epitaph for Apple in the coming years?
If you had said that the year before the iPod came out, you’d have had plenty of evidence you were right and nobody would have had anything they could point to to prove you wrong. Likewise if you’d said it the year before the iPhone came out.
Apple is a rare example of a company that has managed to avoid that trap so far, but there’s no way to tell for sure whether that’s likely to change.
My first Bill Gates Review by Joel Spolsky
https://www.joelonsoftware.com/items/2006/06/16.html
It’s worth reading, since it took place during this period and gives an interesting perspective on the above.
That article is well-written and made me laugh – thank you!
Thanks for the link. I especially liked this part: “It was a good point. Bill Gates was amazingly technical. He understood Variants, and COM objects, and IDispatch and why Automation is different than vtables and why this might lead to dual interfaces. He worried about date functions. He didn’t meddle in software if he trusted the people who were working on it, but you couldn’t bullshit him for a minute because he was a programmer. A real, actual, programmer.”
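For anyone who never touched COM: Joel’s jargon boils down to two ways of calling the same object. Automation (IDispatch) looks a method up by name at run time, which is slow but friendly to scripting clients; a vtable call goes through a typed interface resolved at compile time, which is fast and strict; a “dual interface” exposed both on one object. COM itself is C/C++ territory, so what follows is only a rough analogy in Java, with every name invented for the example.

    // A rough analogy (not real COM) for late binding by name vs. early
    // binding through a vtable-style typed interface.
    import java.lang.reflect.Method;

    public class DualInterfaceAnalogy {
        // Early binding: callers compiled against this interface dispatch
        // through what is effectively a vtable slot, checked at compile time.
        public interface Almanac {
            int daysInMonth(int year, int month);
        }

        // In COM, a "dual interface" exposed both call styles on one object.
        public static class GregorianAlmanac implements Almanac {
            public int daysInMonth(int year, int month) {
                int[] days = {31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31};
                boolean leap = (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;
                return (month == 2 && leap) ? 29 : days[month - 1];
            }
        }

        public static void main(String[] args) throws Exception {
            GregorianAlmanac obj = new GregorianAlmanac();

            // vtable style: the compiler already knows the method and its types.
            Almanac typed = obj;
            System.out.println("early bound: " + typed.daysInMonth(2000, 2));

            // Automation/IDispatch style: look the method up by name at run
            // time, the way a VB or macro client would drive an object.
            Method m = obj.getClass().getMethod("daysInMonth", int.class, int.class);
            System.out.println("late bound:  " + m.invoke(obj, 2000, 2));
        }
    }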
Yes, Gates was technical, but he sure had his own ultimate assessment of what was “good” and “bad.” I think he and his gang called “good enough” a whole load of stuff that should never have been allowed out the door, and perhaps never been allowed to grow in the first place.
The book, “Show Stopper” by G.P. Zachary, is another interesting read about Microsoft culture, from the perspective of Windows NT and its backdrop of the late 1980s and early 1990s.
Interesting that Spolsky makes the same comment about Microsoft in 2005 as Bob will in the next chapter about IBM in 1990, something to the effect of “a culture of perpetual, permanent meetings”.
Bob,
I have enjoyed every chapter of Accidental Empires so far. But I wonder if there should be a change in the philosophy of this book. What is more important, making fun of the nerds that can’t get a date, or explaining how modern day robber barons can provide worthwhile benefits to Western civilization despite their many personal shortcomings?
Personally, I like both kinds of information — the humorous and the insightful.
some typos:
– “of Tayloi ‘s lab” should be “of Taylor’s lab”
– “because the troops rarely even see him I probably” should be “because the troops rarely ever(?) see him. I probably”
I don’t know why you credit Charles Simonyi with identifying the problems of communications within software development teams. Fred Brooks had discussed exactly this issue extensively years before, in his famous book “The Mythical Man Month.” Simonyi had to have read it, or at least absorbed its message, as it was a well-discussed part of the software culture of the time.
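Brooks’s argument reduces to simple arithmetic: with n people on a team there are n(n-1)/2 possible pairwise communication channels, so coordination overhead grows roughly with the square of head count. A few lines of Java make the point:

    // Brooks's arithmetic from The Mythical Man-Month: pairwise communication
    // channels grow as n(n-1)/2, roughly the square of team size.
    public class CommunicationPaths {
        static long channels(int people) {
            return (long) people * (people - 1) / 2;
        }

        public static void main(String[] args) {
            for (int n : new int[] {2, 5, 10, 50, 200}) {
                System.out.printf("%4d people -> %6d possible channels%n", n, channels(n));
            }
        }
    }

Two people share one channel; two hundred people share nearly twenty thousand, which is the arithmetic behind Brooks’s warning that adding people to a late project makes it later.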
It’s a pity no one bothered to read it at Microsoft around the time that Vista (Longhorn) was in development.
No idea what extra features, apart from a tarted-up UI, the 8.xx GB of files on my hard disk provided.
Well, I don’t know how technical you are… Just kidding. But I’ll forge ahead anyway.
Vista had a tumultuous development. Ultimately, Microsoft just had to quit and start over. That’s why it took so many long years after Windows XP until Vista was released.
The improvements were mostly in the basic structure of Windows. Several things that used to cause Windows XP to crash would only cause minor glitches in Windows Vista. In the user interface, the most welcome improvement was the searchable Start Menu. The major user interface improvements came with Windows 7.
Other things were more experimental and not always better, but I’ve gotten so used to them that I now resent the times I have to deal with Windows XP again.
The biggest irritation with Vista was the lengthy wait after logon for the system to become responsive. But it was a similar problem when XP came out in 2001. The hardware had to catch up.
So did Vista fix the volume control not showing up problem, or have they decided it’s been like that for so many versions that they’re just going to call it a tradition and leave it as is?
No. I’ve had that problem with Windows XP, so it’s not a Vista-specific problem. I have no idea what causes it.
Actually, I’ve had that problem with only a couple XP computers, and not with Vista and up nor 2000 and down. But my sample size is small and, again, I have no idea what causes it.
I’d expect that Windows 8 and the interface formerly known as Metro are supposed to make that problem go away for good.
Well, I don’t know how technical you are… Just kidding. I’ll continue anyway.
The main benefit of Vista is the large number of fixes in the structure of Windows. Bugs and events that would cause a crash in Windows XP merely cause a minor glitch in Windows Vista. And the security upgrades, most notably UAC, make Vista significantly more secure than XP.
Compared to Windows XP, 7, and 8, I thought Windows Vista had a pretty subdued UI. Except for that Aero transparency thing, but that can be turned off easily. XP was Fisher Price, 7 was getting rid of colors, and 8 has the interface formerly known as Metro. But the part that I really like is the searchable Start menu.
Ugh. This blog didn’t tell me that the previous version of the post had succeeded.
The new site seems prone to not always refreshing the page. If you say exactly the same thing again, it tells you when you try to resubmit it. But if you change anything it accepts it as a new post. I often save the longer posts in case I have to resubmit due to a server problem or a connection problem on my end. Yes, Metro has become Tifkam. 🙂
Bob,
This line seems awkward (in the passage about disputing chairman Bill):
“which just is not done”
* Thanks a lot for your interesting technical information, which I’ve been reading for years.
And also for the amusement in “Accidental Empires” and earlier posts.
I like written info; spoken American English is too difficult!!!
* Being (more or less) your age, I’m writing from Bilbao, Spain, in collapsing Europe…
This blog is having major problems with posting and with comment visibility. I suspect one problem is the caching headers; Firefox keeps giving me stale versions of the comments.
Same with IE10/Win8.
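I have no idea what this blog actually runs on (almost certainly not Java), but purely to illustrate the caching-header hypothesis above: the fix would be for the server to send headers that force browsers to revalidate comment pages instead of reusing a stale copy. A sketch of those headers, wrapped in a hypothetical servlet helper:

    // Illustration only: the response headers that tell a browser not to show
    // a stale cached copy of a page whose comments change often. The servlet
    // types are just a convenient stand-in for whatever the blog really uses.
    import javax.servlet.http.HttpServletResponse;

    public class FreshComments {
        static void markUncacheable(HttpServletResponse response) {
            // Ask browsers and proxies to revalidate before reusing a cached copy.
            response.setHeader("Cache-Control", "no-cache, must-revalidate, max-age=0");
            response.setHeader("Pragma", "no-cache");  // legacy HTTP/1.0 clients
            response.setDateHeader("Expires", 0);      // already expired
        }
    }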