Paul Otellini this week resigned his position as CEO of Intel, as I’m sure you’ve already heard or read. Analysts and pundits are weighing in on the matter, generally attributing Otellini’s failure to Intel’s late and flawed effort to gain traction in the mobile processor space. While I tend to agree with this assessment, it doesn’t go far enough to explain Otellini’s fall, which is not only his fault but also the fault of Intel’s board of directors. Yes, Otellini was forced out by the board, but the better action would have been for the board to have fired itself, too.
If there was a single event that triggered this end to Otellini’s tenure at Intel, I’m guessing it was Apple’s decision to abandon Intel chips for its desktop computers. There has been no such announcement, but Apple has sent signals to the market, and Apple doesn’t send signals for fun. The question isn’t if Apple will drop Intel but when, and given the way product design changes are made, the when is not this Christmas but next.
I’m sure that Intel just lost its second or third-largest customer, a company important not just for its size but for its position as a design leader in the desktop space. This alone would have doomed Otellini.
But here’s the thing to notice: Otellini resigned this week not just from his position as CEO but also from the Intel board. This is no retirement and it is more than just a firing, too — this is something ugly. Normally Otellini would have remained on the board for a year or two and that he isn’t suggests that his relationship with the board is totally poisoned.
Whose fault is that? It is the fault of both sides.
The simplistic view most people have of boards of directors is that the CEO runs the company and the board hires and fires the CEO, simple as that. If that were the heart of it, though, there would be no need for committees and sub-committees, and board meetings could be done on the phone with an up-or-down vote four times per year. Modern boards share power to some extent with the CEO, they help set company policies, and they are responsible for setting off alarms from time to time.
In the case of Intel, the alarms have been at best muted and it is pretty easy to argue that the board simply didn’t do its job any better than did Otellini.
Intel under Otellini has been a model of Bush era corporate responsibility which is to say manically cutting costs while doubling down on its desktop processor business and giving little to no thought to market shifts like the current one to mobile that could screw the whole business. Intel spent a decade fixated solely on fighting AMD, a battle it won a long time ago yet still keeps fighting. That was Otellini’s fault but it was also the Intel board’s fault.
The company was too busy fighting AMD to notice the rise of mobile. And while the pundits are correctly saying ARM-this and ARM-that in their analysis of the Intel mobile debacle, the source of the successor technology is less important than the fact that the two largest high-end mobile manufacturers of all — Apple and Samsung — are making their own processors. They will never be Intel customers again.
It’s like Mitt Romney talking about the 47 percent: no matter what Intel does — no matter what — the company will be a minority player at best in the mobile space. This explains, too, the $50 billion in market cap that has been effectively transferred from Intel to Qualcomm over the last five years.
Intel had a chance to buy Qualcomm twice, by the way. Discussions happened. But twice Intel walked away and one of those times the guy leading the departure was Paul Otellini. Intel could have owned Broadcom, too, with the same story, but they (board and all) were too busy being fat, dumb, and happy fixating on AMD.
Here’s what should happen at Intel now. Company vision has failed from top to bottom. It’s time for new leadership including a new board, because the existing board hasn’t shown itself competent to replace Otellini — who, by the way, should be replaced, since he’s as clueless as the board that’s firing him.
Intel needs new leadership and a bet-the-company move toward dominating some new technology. I’m not sure which one, but there are several from which to choose.
Given that the Intel board isn’t firing itself, I expect they’ll hire another bad executive to replace Otellini and Intel’s fall will continue. It’s still a rich and profitable company and can go a decade or more with a cargo cult corporate culture based on hope that desktops will return.
Apple could buy AMD and make its own chips, x86- or ARM-based.
Yes it would be easy for Apple to buy AMD, though with their tilt toward ARM the only reason I can imagine they’d do so is for ATi graphics technology. And remember AMD no longer owns its fab, having spun off that business, so Apple wouldn’t actually be MAKING its chips at all.
I also see an advantage with the CPU division. AMD’s new architecture (Bulldozer -> Piledriver -> Steamroller) has some serious advantages for programs that can take advantage of it (highly threaded and parallel work, like video encoding). They’re at Piledriver now, which is finally faster than the previous Phenom II architecture in Windows, and as fast as Intel in highly threaded apps (but their single-thread performance still ain’t there). Apple writes the OS, so it can take advantage of this to reduce its COGS, because AMD’s chips are less expensive. AMD is also way ahead of Intel in integrating a GPU and CPU, which I can see Apple using in its iMacs so you can also game on them in OpenGL.
I don’t know that Apple would buy them (though it could do so relatively cheaply), but I wouldn’t be at all surprised if it switched to AMD CPUs and APUs (it already offers AMD graphics cards, though the ones it offers are about three years behind what’s available in the PC world and far too expensive). Apple could then tailor the OS to the strengths of the AMD architecture (and the architecture and instruction set to the OS as well, if it purchased AMD or gave AMD enough money).
I agree, Apple doesn’t need to be a chip fab. But they do need AMD’s IP or a license from Intel to design x86 based CPUs. AMD’s Graphic chip know-how is also potentially invaluable to Apple in the near future.
Apple could really use a low-power dual-core x86 with on-chip mid-range graphics for its entire line. Intel HD 4000 doesn’t cut it on Retina displays, and Intel seems to have bugs in its HDMI implementation (see current Mac Mini owners’ forums). Apple doesn’t need to beat Intel’s top-end i7; it just needs to be good enough on the CPU side to hold its own, à la AMD. Given Apple’s experience in making a combined ARM/GPU package for the iPhone/iPad, I find it logical that it now attempt the same in the desktop/laptop x86-dominated space. Apple wants even thinner, sleeker designs that require lower thermal specs without sacrificing performance; not necessarily benchmark performance, but user experience: it must feel quick. When Apple controls the whole enchilada, it can include whatever functions in the hardware it wishes to implement in OSX and discard everything else.
In the future they could combine this hypothetical Apple x86 chip with an ARM based chip and run both at full power in desktop mode and rely on only the ARM in laptop/tablet mode sipping along on the battery. This would give a user the ability to run any given set of Apps in Apple’s store on the same system at the same time. In other words: a tablet/iMac hybrid. Put it on your desk and plug it into the wall and it’s a desktop. Grab and go and it’s an iPad running on batteries. Apple controls the OS so they can leap whatever technical hurdles are required to run the OS on two architectures and switch back and forth (allow iOS Apps to run on ARM natively and OSX Apps to run on x86 at the same time).
I don’t believe Apple currently has the expertise or the patents required to do this on its own. Contracting Intel to design this chip for Apple would be clunky; Intel doesn’t do things the Apple way and would want to sell the resulting chip to Apple’s competitors. A better solution is for Apple to buy AMD, reorganize the company, and have it do what Apple needs done, the Apple way, by the people who know how. It would cost Apple about $1.5 billion to buy AMD, which seems cheap for a company with many times that sitting around accruing interest.
I like the 2-chip idea for combining portability and desktop use. But it looks like manufacturers are more interested in selling us more devices than in creating one that does everything: smartphone, tablet, desktop, mp3 player, and “cloud services” to sync them together.
Apple does not make its own chips; they DESIGN their own chips. For the quantity and level of integration they need, only Samsung has the fabrication capacity at present. I am not sure whether the fab, as well as the designer, needs to have an ARM license, but I suspect so.
Thus Intel deciding to open the curtains and fab custom chips to customers’ designs wouldn’t change a thing for them.
AMD spun off its plants a few years back, so why in hell would Apple buy AMD? That would be the Wall Street favorite “rumor,” alias a paid plant by some weasel M&A analyst who wants a fat payout now, that Bigco should buy Brokeco and burn billions of dollars that could have been glued together to make a ladder to the moon.
Not AMD. I hope Apple buys Google. Before Samsung does.
Google market cap is much higher than Samsung’s (unless you count all of Samsung’s divisions together, like their shipbuilding, insurance, and chemical divisions). Apple would have serious anti-trust issues trying to acquire Android.
Uh, Google began through CIA front funding, you will not see Apple or Samsung buy it
How about giving us thumbs up or thumbs down on every post. I like this article but have nothing further to add. I would give it a thumbs up, if I could.
some posts need another digit 😉
thumbs up on that!
I completely believe that Apple would like to make its own desktop CPUs, and given its demonstrated competence in this area with the A(n) series of mobile processors, I’m sure it will be capable of it within the next few years. However, there are a number of problems with moving as soon as next year.
First is performance. ARM CPUs have unparalleled performance-per-watt but just don’t come close to the raw performance of an Intel desktop processor. There’s no way Apple can leap that performance gulf in just a year.
The second is focus. Apple is hitting bottlenecks in its internal organisational capacity and efficiency that will take time to solve. We saw this recently with the year-plus gap with no iMac, Mac Mini or Mac Pro update and then a sudden glut of product announcements. Their design and engineering teams are clearly struggling to keep up with the demands of their ever-expanding number of product lines. They’re expanding manpower, but Apple HQ isn’t like a Foxconn factory. You can’t just hire another 5,000 engineers and have them instantly start working at full speed on a new project, no matter how much money you have.
Of course their new mobile products get the highest priority, leading to erratic release schedules in their desktop business. There’s no way their relatively new processor design team will be allowed to be distracted from doubling down on iPhone and iPad processors by an unnecessary desktop processor project.
Finally, there’s fabrication capacity. Apple is desperately trying to move its ARM production away from Samsung and failing. Adding in the demands of desktop CPU manufacturing wouldn’t make that problem any easier, especially given that desktop-class processors are physically a lot bigger than mobile processors and so take up proportionately more fab capacity per unit.
All this aside, I don’t think the ARM architecture will be well suited to desktop use even within the next few years. Intel may well be blinkered and unable to compete in new and emerging market segments, but they certainly know exactly what they’re doing in the laptop/desktop and server space.
Give the iPad and the A(n) CPUs another 5 years of iteration and this whole issue will be moot. The tide is flowing the way Apple wants it to, and there’s just no need to expend energy pushing it along faster.
These are all great points that I disagree with to some extent. Rather than fight it out here, however, I think I’ll just do a separate column on it. Look for something in a day or two.
Excellent, I’m always up for learning something new.
Desktops are one thing, but…
I haven’t used an Apple desktop, well, ever. I didn’t like classic MacOS, but was a NeXT fan from way back (even learning Objective-C back when the NeXT OS was at version 0.8). When Apple acquired NeXT and announced MacOS X, I bought my first “new” Macintosh hardware — a Pismo.
I’ve been an Apple *laptop* user for ages. And while I understand the jump from PPC to Intel from a lot of perspectives, my own perspective is that it’s caused great harm to their laptop line.
When I put a second battery in my Pismo, with old hard drive tech and old wifi tech, it could still get eight to ten hours of run-time on battery, and it did not get hot. My TiBook ran warmer and less long, but was still within acceptable parameters.
My first x86 MacBook, however… I hadn’t internalized just how very much hotter it ran, how much more power it burned. I literally burned my leg, using my laptop normally. (Yes, the vast majority of my time, I used my laptop ON MY LAP.)
These days, I’m resigned to using one of those “laptop desk” things purely for heat mitigation. Do not like.
An ARM chip may not be the right thing for the highest of the high-end desktop systems. But Apple has had a mixed CPU architecture before. They’re still using the old Mach-O binary format, supporting fat binaries. In the later days of NeXT, we had quad-architecture fat binaries, with x86, m68k, Sun SPARC, and HP Snake all in one executable file.
Imagine MacOS getting fat binary support for ARM. Imagine an OS that would run on both x86 and ARM, like it used to run on both PowerPC and x86. Imagine the ARM chips only being used in laptop, set-top, and budget (e.g. “mini”) models.
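The fat binary format behind all this is simpler than it sounds: a small big-endian header listing one record per architecture slice, each pointing at where that slice’s code lives in the file. Here’s a minimal sketch in Python; the magic number and cputype constants match Apple’s `<mach-o/fat.h>`, but the offsets, sizes, and subtypes below are made-up placeholder values for illustration.

```python
import struct

FAT_MAGIC = 0xCAFEBABE          # big-endian magic number for Mach-O universal binaries
CPU_TYPE_X86 = 0x00000007       # CPU_TYPE_I386 in <mach/machine.h>
CPU_TYPE_ARM = 0x0000000C       # CPU_TYPE_ARM

def build_fat_header(arches):
    """Pack a fat_header followed by one fat_arch record per slice.

    arches: list of (cputype, cpusubtype, offset, size, align) tuples.
    All fields are big-endian 32-bit integers, per the Mach-O format.
    """
    blob = struct.pack(">II", FAT_MAGIC, len(arches))
    for cputype, cpusubtype, offset, size, align in arches:
        blob += struct.pack(">5I", cputype, cpusubtype, offset, size, align)
    return blob

def list_architectures(blob):
    """Parse the header back and return the cputype of each slice."""
    magic, nfat = struct.unpack_from(">II", blob, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat binary")
    return [struct.unpack_from(">5I", blob, 8 + 20 * i)[0] for i in range(nfat)]

# A two-slice (x86 + ARM) header, like the hypothetical universal
# binaries discussed above; offsets and sizes are placeholders.
hdr = build_fat_header([(CPU_TYPE_X86, 3, 4096, 1000, 12),
                        (CPU_TYPE_ARM, 9, 8192, 1200, 12)])
print(list_architectures(hdr))  # [7, 12]
```

The loader just scans these records for the best match against the machine it’s running on, which is exactly why adding an ARM slice to an existing x86 MacOS binary is an incremental change, not an architectural rewrite.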
Screw the desktop. I want a MacOS laptop built on ARM, that may not have the processing punch of the desktop, but that runs for *days* on a charge and never ever gets hot. I’d jump to that in a *heartbeat*.
Oh, and now imagine one more thing. You know how PowerPC MacOS had the “Classic” execution environment, letting it run pre-OS-X binaries?
Imagine the ARM version of MacOS having an environment that let you run iOS apps directly. Boy howdy, that wouldn’t suck, eh?
You want a Mac laptop with an ARM processor? Get the Samsung Chromebook. For $250 you get an ARM-based MacBook Air wannabe. I really like mine; I'm using it now.
Won’t happen. If the CPU doesn’t take significant power, then the screen will take all the battery life. Or have you noticed how the iPad 2 has an ARM processor but still gets only 15 hours of battery life while playing video?
https://www.anandtech.com/show/5789/the-ipad-24-review-32nm-a5-tested/1
I love that word “only” in conjunction with the figure 15 hours.
Go back 4-5 years and you’d have laughed at the idea of being able to watch a full length film before the battery died on something the size of an iPad 2.
The future is pretty neat. Can’t wait to see where we are in 5 years time.
Running a chip fabrication plant is like a giant race to see who can throw their cash away the fastest. Once you get it up, after a year or two of retailoring the equipment and trying to dial it in, you need lithography half the size to meet market demands for more oomph and less power, so you have to do it all over again.
Unless a fab is running pretty much full bore 24x7x365 with an order list half a year thick, you lose money. It’s like a tech stock on steroids; the plant is always a liability.
Apple would be stone-faced idiots to go there, and they gave up their final assembly plants back around the Color Classic because they figured this out. They don’t wade back into the Pit of Stupid once they’ve left it.
Simon,
you are simply wrong concerning the desktop. Look at the Samsung Chromebook (Cortex-A15): it is already at least equal to the latest Intel Atom offering, so the normal low-performance desktop segment (nettops, netbooks) is already lost to ARM.
Look at Microsoft Surface — Office works “good enough” on ARM, and for the rest, most people are finding that the reintroduction of full-screen monotasking and the absence of any need for permanent anti-virus scans reduces the need for an Intel CPU by a large margin. Otherwise, even Intel bets “on more cores,” which are easy to add for ARM.
Next year will come the MacBook Air category, which satisfies roughly 50% of all people except CPU power users — if you are happy with a 2010/2011 MacBook Air, it’s a done deal.
For power number crunching you could use an Nvidia Tesla or the cloud.
For the rest, fine: Intel won’t be going away, and you can keep your Win7 stuff for the next ten years or switch to Win9 when Microsoft has resolved its dual personality.
Intel x86 is destined to go down the Cray, DEC Alpha, Itanium path more than ever before.
I hope you’re joking about Microsoft Office on the WinRT platform. It’s way too slow. I’m no typing demon, but I could easily outpace the interface. Response was slow and the whole experience was unusable.
ARM is great for phones and tablets because they’re single window devices. There’s very little background processing going on. It’s going to take a few more generations for ARM to be able to give Intel a run for its money in the multi-window desktop arena.
However, ARM will become a desktop capable processor before Intel is able to make a true portable processor for phones and tablets.
Apple has outsourced all manufacturing…they are not about to jump back into that arena.
A while back, Cringely asked what is Apple really doing in Maiden NC, showing that it was too big for iCloud. Look at the aerial photos. Doesn’t it look more like a wafer fab than a server farm? Why the segregated shipping and receiving docks on either end, and why all the plumbing at the back?
I forgot about this article, but that was exactly the comment I had come up with but didn’t post.
Apple in recent years has been having big issues with leaks, etc., and most of it is due to having no control over a lot of the people making its products. American fabs manufacturing core IP tech such as processors would begin to give it back control over its secrets.
I bet it’s not far off to say that they still dominate the server market and will probably continue to for the next 20 years.
The answer is rather simple: Intel should manufacture Apple’s processors for them.
Apple have fallen out with Samsung, unsurprisingly, given the blatant copies of the iPad (and Google’s Android). Biting the hand that feeds them! Samsung have just responded by increasing processor prices by 20%.
Intel have seen their market drop, so they’ve got fantastic fabs but less demand. Given that Apple wants state of the art, and Intel have state-of-the-art fabs, it’s a perfect fit.
But at the same time, Intel really should be manufacturing their own ARM incorporated designs and letting the Atom die gracefully.
Can I get the CEO job?
Apple is shifting to a lot of other manufacturers and obviously isn’t planning on relying on Samsung anymore. Samsung know they will lose a lot of income, so it’s only good business to raise prices to prepare for the future.
Yes, but as Samsung loses business from Apple, their own marketshare has been increasing.
Yeah – because they have been buying those sales: https://www.asymco.com/2012/11/29/the-cost-of-selling-galaxies/
How sustainable is that? Any way you slice it, losing the volume of a company like Apple is going to hurt. I’ll wager a significant portion of Samsung’s profits on phones are massaged from production infrastructure funded by Apple parts orders. I guess we will see when Samsung and their other customers have to pick up the slack.
If you have ever run a business, you understand just how much of a balancing act financing and operations are. This has nothing to do with anyone being a fanboy; it’s just business common sense.
Should Intel decide to manufacture ARM processors for Apple, and to a lesser extent for MS, Intel would be competing with its own chips, thus changing to a much, much lower-profit model, perhaps as much as 80% lower profit than it gets from its x86 CPUs.
Moving to partly doing contract semiconductor manufacturing may be Intel’s only option. But Apple’s business is probably big enough to justify taking on this type of work (assuming Intel is competitively priced enough).
Better to have Apple’s ARM business than no business?
The death knell of a company sounds like this: “We can’t do that! That will eat into our own business!”
Blockbuster first missed DVD sales because they didn’t want to hurt their VHS rentals, and later they didn’t want to go into streaming video because it would hurt their DVD sales. Best Buy didn’t want to go into Internet sales because that would hurt their big box stores.
Apple, on the other hand, moved from iMacs to MacBooks to MacBook Airs to iPads knowing full well that each would eat into the others’ sales. Apple’s business is 80% iOS, and it’s the biggest company in the world.
If Intel had any brains, they would immediately license ARM and build A(x) chips for Apple, and maybe even use their expertise to help design them. Yes, it’s going to hurt their x86 sales, but otherwise someone else will build the ARM chips and then Intel won’t have x86 sales or ARM sales.
That’s a bit of brilliant insight there.
I’m reading “Why Nations Fail” – and one of the notions has to do with embracing “creative destruction” or environments that tolerate it. On the national scale, the state either has a monopoly or grants a monopoly and that monopoly does not allow for anything to eat into its margins or profits. The book’s highlights are in pointing out instances when a society moves from tolerance for creative destruction to intolerance (see Venice circa 1300).
On a broader scale, the emphasis is on institutional innovation, which should be seen from a viewpoint of change of structure. This is fundamentally the most interesting concept of all.
Here is where viewing Samsung becomes really interesting. Samsung was just one of several companies struggling to obtain a dominant market share in the cell phone business. They pretty much arrived at that point around 2000 through 2005. Then comes Apple’s iPhone and blots out the sun. Down goes BlackBerry, down goes Nokia, down goes Ericsson. On the ropes is Motorola. But Samsung, and to a lesser extent LG? Samsung was faced with the kind of change that Intel is faced with, that Microsoft is facing, that IBM was facing, and so many others. What was Samsung’s response?
Well, first, they got busy playing catch-up. Meanwhile, South Korea protected Samsung and LG by delaying the iPhone’s entry into the South Korean market. They got lucky in that Android came along, as they probably couldn’t have produced it themselves (I hazard to guess). Samsung, and to a lesser extent LG, pumped out tons of Android devices of every possible size for every possible market. They had to be pouring tons of resources into this effort, throwing anything and everything at the market in hopes that it stuck. This reminds me a bit of when Microsoft came out with Windows: it wasn’t perfect, but it preserved their position in the market enough to protect their franchise until other players dropped out (OS/2) and they got better traction (Windows 95, then XP).
This gets back to the larger theme of institutional innovation (the right kind) and the larger theme of the renaissance in East Asia (which is the biggest event of the last 40 years in our time). That renaissance is driven by institutional innovation/change in two areas: Public Policy formation (political science) and Corporate models.
In public policy, the East Asians, beginning with Japan (see MITI and the Japanese Miracle), have stumbled upon a new organizational arrangement for government: essentially a fourth branch, the neo-mandarinate (meritocratic) bureaucracy, that exercises smart pragmatism (not ideology) in pursuit of the public interest, including economic growth and industrial policy. With experience, that policy is getting smarter all the time, and no one practices it better now than the Koreans and the Taiwanese.
In corporate models, the Koreans, like the Japanese, have widespread lifetime employment (tenure). The devil is in the details, but that changes how a company is governed and how employees function inside it. For over 40 years now our business schools, leaders, and pundits have been telling us that the Japanese (and Koreans and Taiwanese) would sooner or later have to abandon widespread tenure and learn how to lay off employees like the rest of the world, and in some instances they have. Our CEOs don’t want you or anyone else to know it, but the truth is the East Asian model has functioned much better over time. Japan’s auto market has half as many consumers as ours, but more than twice as many producers. In the economic implosion of 2008–9, not one of Japan’s car companies declared bankruptcy, while two of our three did.
Samsung is not solely in pursuit of margins. They have to provide employment for their lifetime employees now and in the future. That means they are in pursuit of market share. Not just now, but 20 years from now. Funny as it may seem, but that’s exactly what the employees want too. Everybody is pulling on the same rope, in the same direction and has the same interest and risk.
More work needs to be done to understand how slight changes in institutional arrangements, let alone big ones, produce better institutions, and that study would have to include the formal and informal influence of Korea’s public-policy-making bureaucracies. Whatever that mix of arrangements is, LG and Samsung have survived a major convulsion in the cell phone market, caused first by BlackBerry and then by Apple, of the kind that normally destroys most companies, including those on the list above plus many Japanese companies such as Sony (the only Japanese companies making cell phones that come to mind are Sony Ericsson and Kyocera). In the past, Apple’s intrusion into the cell phone market would have destroyed the existing players’ positions, but Samsung and LG ultimately look like they will survive. Someday in the future Apple, with its limited offerings and market share in cell phones, might fall back to the position it was in back in 1998 once its innovative edge wears thin. Given that Apple, since the second coming of Jobs, has been a much better performer than most American companies (less short-term oriented, more big-picture oriented, and more tolerant of creative destruction), I’m not sure what this says for other giant American technology companies, let alone the specific case of Intel. Given the overall dynamism in the mobile platform markets, it seems strange that they haven’t found a way to have a bigger and stronger presence there.
As a side note, I think Motorola’s struggle in this market has been interesting and noteworthy, though they appear to be losing market share.
Thank you for the insightful post, and I wonder what Bob’s take on it would be.
Of course, Korea has put its own spin on the East Asian-style corporate model, in the form of Chaebols. These are typically portrayed by the press in a somewhat negative light (too powerful, prone to corruption, etc.), but to be honest I know too little about the Chaebol system to judge one way or the other.
Your overall point remains valid, though. The difference in corporate cultures between East Asia and the West has always fascinated me. I can break it down to this basic test: When your employer tries to get you to “cheer on the company” (through various means), do you respond genuinely, or does the exercise strike you as hollow and/or cheesy?
For me it has always been the latter, but with my exposure to certain Japanese companies, I can honestly say that I would rally behind the company if I worked under that type of system. The difference being that in such a system, the companies are also rallying behind their employees.
There was a Doctor Who episode where the Daleks and the Cybermen both showed up at the same time, both with plans to take over the Earth. But first, each needed to “exterminate”/“delete” the other. While they were busy getting it on, the Doctor eliminated them both.
Apple’s a relatively minor player on the desktop, and moving away from Intel to its own walled-in processor won’t be a popular move with high-end users now that the Apple Kool-Aid is weakening.
Apple is in danger of following through on what I hope are seen as big anti-competitive mistakes. They’ve certainly alienated many customers and are losing their mojo… not good for a fashion design company.
Sorry, how would switching to ARM processors be “anti-competitive”? How are ARM processors “walled in”? You do realize there are actually more of them being sold than Intel CPUs, right?
Was switching from Motorola/IBM to Intel anti-competitive as well, or was that pro-competitive? Kindly justify your thinking.
“Big anti-competitive mistakes” was referring to Apple’s actions in the patent arena, “going nuclear,” and more: actions that close down competition and hurt consumers. No, they’re not the only ones behaving badly, and in some cases it’s just good business sense and legal… taken to an extreme, underhanded and unfriendly.
ARM processors aren’t necessarily walled in… Apple products are, for better (easier to use) or worse (unexpandable, noncompetitive, etc.).
Expandability is a red herring, especially on the desktop, for all but a tiny number of users. Back in the ’90s I built my own PCs so that I could be “expandable”; maybe it made sense back then, but all I ever “expanded” was RAM, updated modem cards (53kbaud! woo-hoo!), and graphics cards. In the late ’90s I gave up building my own machines because it was more cost-effective to buy from a manufacturer and get a bunch of bundled SW; I still upgraded memory, but no longer needed to upgrade anything else. Finally, in early 2007, I switched to an Intel iMac; I can still upgrade memory (currently have 12 GB in my mid-2011 iMac), but there’s really nothing else I need or want to upgrade.
Sure, some people truly need to update their graphics every 6 months, but the vast majority of people don’t. Just about everything else can be done with Thunderbolt expansion (external). What exactly is “expandability” buying me, again?
At 13.6% of global PC shipments they’re the third biggest PC manufacturer. But they’re also single-handedly responsible for over one third of all PC manufacturers combined profits.
They’re a lot more than a minor player in the PC market, and their market and profit shares are still growing strongly.
The “just a fashion house” comment is just laughable though. Name some other PC manufacturers that not only develop their own desktop and mobile OSes but also develop new consumer market technologies like retina displays, firewire, thunderbolt, airplay and fusion drives, and even design their own CPUs?
> Name some other PC manufacturers that not only develop their own desktop and mobile OSes, but also develop new consumer market technologies like retina displays, firewire, thunderbolt, airplay and fusion drives and even designs their own CPUs?
Microsoft comes to mind. They obviously develop their own mobile and desktop OSes. The Xbox, Surface, and the (late) Kin are/were arguably new consumer technologies, and the Xbox has its own custom-designed 3-core PPC CPU.
I’m just grumpy about the tendency of tech companies to move toward closed products. I see Apple going down that road a bit further before the pullback (as happened with IBM and Microsoft).
I’m also disappointed in the way they’ve deceived their customers in the name of marketing. If you look into Apple developed technologies you’ll find that Apple did much less than they take credit for. Firewire and Thunderbolt are Intel developed technologies. Retina – another name for high-res. That’s why I oversimplified and called them a fashion design house. Maybe that’s why you said “consumer market technologies” as in “consumer markets for technologies”; see, your subconscious knows.
Considering HP, Microsoft, IBM, Nokia, Sony, Samsung, RIM, Mozilla, KDE, Dell, who hasn’t made a mobile OS?? Plenty of them also make a desktop OS, or like Apple build off an open source OS.
Firewire was developed by Apple.
USB and Thunderbolt are from Intel, but, of the PC manufacturers, only Apple has the courage to make them prominent.
Likewise, Retina screens have long been technically possible, and the panels are actually built by LG or Sharp or whoever, but only Apple has the OS development in-house to make it practical. And Apple had the courage to release the Retina screens, first.
Thanks for making my point.
Also, your companies:
HP: No mobile or desktop OS. License it from Microsoft, or buy it and in an Apotheker-lyptic fit, cancel it. (Sorry, trying to pun on “apoplectic fit.”)
Microsoft: Their innovations are strangled by their backwards compatibility. Short-term profitable, but long-term troubling. I’m thinking particularly of why they never had Retina screens before; their API was the worst.
IBM: Haven’t had a mobile OS in forever.
Nokia: No smart mobile OS of their own, but managed to steal Symbian. Horribly mismanaged, and now suffering the Elopocalypse.
Sony: Nobody likes their mobile OS. It’s insanely closed. Now they use Android.
Samsung: Okay, they have mobile OS efforts.
RIM: Okay, they have mobile OS efforts.
Mozilla: They don’t have anything out yet.
KDE: They’re not even a company, and they don’t have anything out, yet.
Dell: No mobile or desktop OS. License it from Microsoft.
I’m not sure what you were trying to say by bringing up these companies.
I was just pointing out to Simon that there were plenty of mobile OSes developed.
I guess I’m just going through the “the emperor has no clothes” phase with Apple. I ask myself:
If Apple was so Innovative then why are the patents they are using to fight Samsung so disappointing?
Now I’m thinking of shorting stock.
First, I agree with Simon Hibbs’ comment 100%. I just don’t see the switch to ARM being possible so soon — certainly not before Apple has a rock solid reliable replacement for Samsung’s CPU fab.
Switching to ARM for Macs would be a huge change in one other respect: killing Windows compatibility via Boot Camp/VMware/Parallels. I guess Apple feels it doesn’t need it anymore, but I think it has been a huge factor with switchers. “You can always run Windows on your Mac” is an excellent argument towards any anti Mac OS X FUD, and very reassuring to the continuing wave of switchers — even if very few take advantage of it.
It is also an interesting question what portion of Mac users are set up to run Windows: 5%? 10%? 20%? Does anyone know?
I agree, Apple is just making its intentions quietly known. It will take years for the software to be transitioned to ARM by the likes of Adobe etc. You don’t rebuild Rome overnight. Intel may not get another chance to stay relevant if it doesn’t put together a good strategy for the future CPU market.
Apple will have a runtime translator for probably 1 or 2 minor revisions, just like when they dumped PowerPC for Intel. After that, support will cease. My eMac is an orphan, and with one more chip change it will be an outcast as well.
Ted and Ian touch on some further issues that complicate an Apple transition to ARM on the desktop. Intel’s CPUs have a whole raft of features that just don’t exist in the ARM world: VT-x virtualisation that makes Windows and other VMs run so well, Hyper-Threading, and Turbo Boost. Then there’s Intel’s industry-leading production process, now down to 22nm with Ivy Bridge.
There’s also the issue of graphics acceleration. Currently Apple can use off-the-shelf graphics cards, but for ARM systems the GPU manufacturers would have to produce custom cards, which would be expensive. Also, while the ARM world is ramping up its embedded graphics capabilities, the GPU cores used in current ARM devices are a world away from the capabilities of the embedded graphics in AMD and recent Intel chips.
Let’s see here: we have massively parallel supercomputers made with Intel, AMD, and various other RISC chips… as well as… GPU graphics engines. Some with mixed processors, depending on what resolves to tight code best when compiled.
I suspect Apple engineering has not been ignorant of techniques to split load and optimize.
I agree with Simon; for the graphics/video processing, I have a hard time imagining an ARM working.
I’m curious about the windows compatibility issue as well. I’m in education, and the ability to have one hardware platform to satisfy both Mac and (most) Windows users has been a godsend. Not looking forward to going back to the Mac vs. PC battles. But I do wonder what percentage of Macs are running Windows, I fear we’re too small a niche for Apple to care about.
Often your writing is a little too bold. I feel like you are making big assumptions on wild speculation. Apple probably would like to leave Intel but it’s just not happening in the near future, certainly not by next Christmas.
I think the rumors about Mac OS and ARM are incorrect. Why bother with a processor switch when the Mac itself will be gone in a few years? What I think Apple is working on is getting Mac apps running in iOS. Which of course is on ARM. This won’t happen next Christmas. I’d say it’s two or three years out.
I doubt Apple’s shaking Intel’s tree because they’re abandoning x86.
I think they want to have Intel fab their Ax parts on 22, 14, and 10nm. Intel has a lead of 4 years in process node tech over Samsung et al.
Their iDevices would then gain a lot in performance per watt and battery life, allowing Apple to shape the performance envelope for each specific device.
New leadership and a bet-the-company move into an exciting new technology … like, perhaps, foil hard drives?
No, that’s an opportunity whose time may have already passed, but there are plenty of other markets to dominate and grow.
Maybe, just maybe foil hard drives could still find a niche in RDX backup cartridges. My bet would then be Freecom/Verbatim/Mitsubishi as product champion.
Just my .02€
Guess Texas Instruments might have an interested buyer for its OMAP division.
A few things to comment on. First, Bob, you are right on the money with this one. The timing may not work out, but the predictions will come true.
Intel’s board is made up of old rich guys. They will hire one of their own, who will not do anything radical. The business will make money for decades before anyone realizes how broken it really is.
Don’t forget, Intel makes lots of stuff, not just x86 processors. Their SSD group is in my neighborhood; there is money to be made there.
It is fascinating that in a post about Intel, most of the comments are about Apple. If you ask why Apple is winning, the answer is right there, they are inside your head man.
I have seen and worked with the new ARM parts. The capabilities are there, and they will get more cores and faster. Multi-core software is the real technical problem. As Simon Hibbs said, the tide is flowing in Apple’s direction. ARM is also working on 64-bit parts. Give them a couple of generations and they will have a solution that is true RISC and a lot less power hungry.
Also, the total cost is not just in the processor. An x86 needs 3 other big parts to work: north bridge, south bridge and PMIC. ARM has a lot of IP that makes them a two-chip solution: processor and PMIC. Apple can make more money with a lower cost BOM. That fits their CEO’s personality now.
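The BOM argument above reduces to simple arithmetic; here is a toy sketch (the part counts come from the comment, but every price is invented purely for illustration):

```python
# Hypothetical bill-of-materials comparison illustrating the commenter's
# point: a classic x86 platform needs CPU + north bridge + south bridge
# + PMIC, while an ARM design integrates most of that into the SoC,
# leaving SoC + PMIC. All dollar figures are made up.
x86_bom = {"cpu": 20.0, "north_bridge": 6.0, "south_bridge": 5.0, "pmic": 2.0}
arm_bom = {"soc": 15.0, "pmic": 2.0}

def bom_cost(bom):
    """Total component cost for one board."""
    return sum(bom.values())

savings = bom_cost(x86_bom) - bom_cost(arm_bom)
print(f"x86 BOM: ${bom_cost(x86_bom):.2f}, ARM BOM: ${bom_cost(arm_bom):.2f}, "
      f"savings per unit: ${savings:.2f}")
```

With these invented numbers the two-chip solution saves $16 per unit; at Apple volumes even a few real dollars per board compound quickly.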
Like he said!
Let’s not forget that Intel has spent enormous amounts of R&D on fab technology and parallelization of software. Clearly, the future is low power, multi-core CPUs that require much more specialized software to take advantage of them. Intel has 2/3rds of what they need. Unfortunately, to Bob’s point, Intel may not be willing to go all the way. It may not even matter since ARM is already multi-core and competing fabs seem to be good enough. That leaves software as the open question. That comes down to Apple and Google who together are the big software players in this context. Google has shown quite a bit of prowess when it comes to highly parallel software.
You obviously have not kept up to date on Intel x86 architecture. All platforms (server, mobile and desktop) are two chip solutions and upcoming devices will be system on a chip.
So are you saying this statement is wrong: “An x86 needs 3 other big parts to work, north bridge, south bridge and pmic.”?
Intel has treated the semiconductor business like a cash cow, not a business to innovate in, for a long time.
Had AMD not brought 64-bit to the market, your choice today would likely still be pentiums or itaniums. Dirk, Jerry and Hector’s prodding of the x86 business into the modern era is the only reason Intel moved into it.
For ARM, repeat the above paragraph above with suitable replacements for the terms “AMD”, “x86”, “64-bit”, “Dirk”, “Jerry” and “Hector”.
nice column bob, love the scoop. what are some of the other markets you think intel could come to dominate?
Interesting take, but you could do without the political shots/analogies. Just makes you look small from my perspective.
Good analysis of how success clouds your mind and poisons your soul.
Obviously Intel should have bought Qualcomm, but it would be a very difficult merger now that Qualcomm has a larger market cap. Ain’t gonna happen, but Intel could still get Broadcom, which would only help things out.
Desktop PC technology is stagnant, it has plateaued and Intel hasn’t figured this out. Intel is coasting and heading not to a cliff, but a series of mini-cliffs as usage on the desktop fades away, laptops go ARM and even eventually servers go ARM.
My prescription would be to bring in former Qualcomm chip division VP and Motorola CEO Sanjay Jha. Sanjay built up Qualcomm’s chip powerhouse and pulled up Motorola enough so that Google bought them (although they are now gutting them).
Sanjay cut through those six sigma processes at Motorola like a razor and Intel needs the same focus to execute on mobile.
As twin giants Apple and Samsung become increasingly vertical it is a big question how a company that requires a horizontal-based industry survives. Bob raises this question. My guess is Intel survives, but is at peak chip penetration, and the long decline has begun.
Motorola Semiconductor got carved up like a turkey, some to Freescale, some to MA/COM, some to Harris… when you say Motorola, you have to specify which part number.
For intel to get back into the ARMs race assumes that they are not excluded from doing so by T’s and C’s associated with the XScale sale to Marvell. Suppose that as well as knocking a huge hole in the hull of the good ship Intel, Otellini has also thrown away the blueprint for ever building a lifeboat.
I find it interesting that many comments touch on having ARM as a desktop processor (and laptop). I used to own one of the first ARM based computers, made by Acorn called the Archimedes.
A small British computer company, Acorn decided that the processors available to them at the time weren’t powerful enough. Rather than buying their processor from Intel or Motorola they decided to design their own, and so born was the ARM processor.
Built on top of this was their own operating system. Originally called Arthur, and quickly replaced by RISC OS, it was powerful. Fully WIMP, 32-bit OS, years before Windows was useable.
You might want to check Chris’s excellent website http://acorn.chriswhy.co.uk/Computers/A400.html#A440
Interesting article. Two points for your consideration:
1) Otellini is not a technologist. He’s a business guy. He’s an MBA who headed sales at Intel and served as COO before becoming CEO. He’s about the quarterly-driven bottom line and would not be able to expand Intel’s product line from desktop processors into the mobile processor market. In fairness to him, he does not even have the training or background to do so. To your point, passing on acquiring Qualcomm and Broadcom were two big misses by which Otellini could have bought Intel’s way into mobile processors, so there is that. Intel having Otellini as CEO was in many ways like Microsoft having Ballmer as CEO.
2) What makes you think Intel’s board isn’t getting the result it wants? Since when have Larry Page, Sergey Brin, Eric Schmidt (Google execs) and John Hennessy (RISC pioneer and MIPS processor developer) ever had the WIntel duopoly’s interests at heart? The Google guys would be perfectly fine with Intel’s decline, but Hennessy would probably be gleeful, bordering on giddy. Since x86 processors are CISC (Complex Instruction Set Computing) chips, the rise of the ARM (Advanced RISC Machine) architecture would be a victory for Hennessy’s RISC engineering philosophy over Intel’s CISC. I don’t know for sure that the Google guys and Hennessy are so Machiavellian as to purposefully encourage Intel’s slow decline, but Intel having these guys on the company’s board brings a whole new meaning to keeping your enemies closer…
Intel’s value to Apple was enabling Windows and Windows applications to run on Macs. Less than a decade ago, in 2005, Apple’s move to Intel CPUs created value by enabling business users and gamers to dual-boot with Windows and run all their Windows applications.
Intel is now in its perfect storm: not simply Apple abandoning Intel CPUs, but Microsoft actually making an ARM version of Windows, the so-called Windows RT ( https://www.pcworld.com/article/2013405/arm-microsoft-collaborating-on-64bit-windows-version.html ).
With Microsoft now making Windows for ARM, Intel is removed from the equation completely. You can still get Windows on a Mac, only now it’s without an Intel CPU. Since you can run Windows without Intel, Intel’s architecture brings no value to the bargaining table.
See Steve Jobs in 2005 announce Intel CPUs ( https://www.youtube.com/watch?v=ghdTqnYnFyg ). And see Steve in 2006 celebrating on stage with Paul Otellini ( https://www.youtube.com/watch?v=cp49Tmmtmf8 ).
Oh jeez, Intel knew fully well what it was doing when it *sold* its own ARM division. There was no real money for them to be made in ARM. They would just be another player. The money is in their own architecture: x86.
There is no magic in ARM. The reason these chips are not power-hungry is because they are slow. With every new iteration of ARM the processors get faster and more power hungry. Meanwhile, the shrinking x86 chips get less and less power hungry, while still beating the performance of ARM chips a few times over.
And don’t forget that Intel has the lead in manufacturing smaller transistors than the competition. This advantage can’t be beaten by Apple.
The first Intel chips that can compete with ARM are already here. For future chips, check out Anandtech overviews. You’ll see the Core processors will be able to run in tablets in about 2014. And they will run faster than ARM chips. And the world will yet again rely on Intel’s own architecture. So they will make all the money again, instead of being yet another ARM player.
Apple doesn’t care what processors they use, as long as they can have an edge over the competition. As long as they can work with Intel for that, like they did with the MacBook Air processor and the Thunderbolt connection, they will use Intel.
Intel didn’t miss the mobile revolution. They just calculated they could take over the competition a few years later by sticking with and improving on x86. Like the new out-of-order-processing Atom chips.
Just repeating everyone that Intel was stupid is far too easy.
I can’t believe I had to read so many comments until someone mentioned what’s on my mind: Intel is catching up with ARM in the mobile space.
AnandTech reviewed the FIRST phone using Intel’s FIRST real smartphone processor named Medfield. Guess what? It’s already competitive.
That combined with Intel’s leading fab tech and engineering talent means it’s too early to write off Intel.
https://www.anandtech.com/show/5770/lava-xolo-x900-review-the-first-intel-medfield-phone
Then you missed my point that Intel has no chance at all with either Samsung or Apple, which between them represent about half of the smart phone market. Competitive or not, that limits Intel’s potential market share.
If Intel does better at making competitive chips, I expect Apple and Samsung will switch to Intel.
If you grow your own tomatoes, and you are good at it, why would you buy someone else’s, even if they are a little better at it? You still have to pay more, and you lose control.
Thanks Glenn; Although Apple has been known to switch suppliers, I guess Bob’s point is that the available chip suppliers are also phone competitors, so Apple had better get their chip act together. Perhaps they haven’t done so yet because there is always the possibility of another Maps fiasco.
I spent two days with Intel in Beaverton last spring discussing exactly this and you are absolutely wrong: they completely missed the mobile revolution.
> You’ll see the Core processors will be able to run in tablets in about 2014.
BTW, here are plans for Atom 2014:
http://liliputing.com/2012/11/intel-atom-bay-trail-quad-core-chips-coming-in-2014.html
“Here are some of the biggest changes:
Intel is moving from 32nm to 22nm processors.
Bay Trail-T chips will have quad-core CPUs, but no hyperthreading (so 4 cores can handle 4 threads, while today’s dual core Clover Trail chips have 2 cores and support up to 4 threads).
Intel is dropping the PowerVR graphics found in today’s Atom chips and instead using Intel HD 4000 graphics, based on the same technology used in Ivy Bridge processors.
That brings support for DirectX11 (today’s chips top out at DirectX 9)
According to the leaked documents, devices with Bay Trail-T chips should offer better performance than tablets with Clover Trail processors while consuming less power. That means, all other things being equal, a tablet would get around 11 hours of battery life instead of 9.
Intel says the new chips also support better audio, support for display resolutions as high as 2560 x 1600, and more.
Make no mistake, Atom processors will continue to be low power chips aimed designed to offer acceptable performance and long battery life rather than bleeding-edge performance. If you want a super-speedy tablet in 2014, you’re probably going to want to look for a model with a Haswell or newer chip.”
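The battery-life claim in the quoted leak implies a specific power reduction, and it can be sanity-checked independently of the battery size (the 25 Wh capacity below is an arbitrary placeholder; the percentage doesn’t depend on it):

```python
# Sanity-check the quoted claim: same tablet battery, 9 hours of runtime
# with Clover Trail vs 11 hours with Bay Trail-T.
battery_wh = 25.0                       # hypothetical capacity, cancels out
clover_trail_w = battery_wh / 9.0       # implied average platform draw, watts
bay_trail_w = battery_wh / 11.0

reduction = 1.0 - bay_trail_w / clover_trail_w
print(f"Implied average power drop: {reduction:.0%}")
```

So “11 hours instead of 9” is another way of saying the whole platform draws roughly 18% less average power, all else equal.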
As an INTC AND AAPL investor:
1) To hell with mobile. There will always be a huge market for full-fledged CPUs, and Intel is the ONLY player left in that space.
2) I really DOUBT AAPL has the silicon to replace Intel this decade. I follow Apple CLOSELY– most of my IRA depends on AAPL– and I just don’t see another chip transition for their actual COMPUTERS in the near future. The tablets and phones– I appreciate the revenue they generate, but otherwise don’t give a “fig” about that segment.
For now, this is true.
I wonder about the history of other small market players growing bigger (The Innovator’s Dilemma comes to mind as a book that talks about this). Perhaps while ARM slowly grows as “the slower little brother” chip, Intel will figure it still has the server market wrapped up nicely. But there is the possibility that at some point, data centers are going to go for lower-power, and if ARM has something that’s “fast enough” and less power-hungry (whole motherboard power consumption, not just the CPU), it may get the nod. A few more chip generations past that, and Intel’s dominance there may be contested.
Just a thought; who knows?
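The data-center scenario sketched above is ultimately a performance-per-watt calculation; here is a minimal model of it (every throughput and wattage figure is invented for illustration, not measured):

```python
# Toy model of the "fast enough but lower power" data-center argument:
# if the workload scales across many nodes, the operator optimizes
# perf/watt for the whole board, not single-chip speed.
def perf_per_watt(throughput, board_watts):
    """Work done per joule, using whole-motherboard power draw."""
    return throughput / board_watts

xeon = perf_per_watt(throughput=100.0, board_watts=200.0)  # hypothetical
arm  = perf_per_watt(throughput=60.0,  board_watts=80.0)   # hypothetical

better = "arm" if arm > xeon else "xeon"
print(better, arm, xeon)
```

With these made-up numbers the ARM board wins on perf/watt even though each individual board is slower, which is exactly the crossover the comment is worried about.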
Ironically, Apple was once one of the three investment partners that started ARM Holdings. I believe they started selling their shares in 1999. I don’t believe Apple still owns a significant share of ARM Holdings any more.
Let’s see: while Intel was suing everyone to stop them from making chips like theirs, ARM was licensing its chips and IP to anyone and everyone. Intel has built a near monopoly on one platform. Anyone who knows even the basics of economics and product life cycles knows a single product platform won’t last forever.
Intel’s first clues the world was moving away from it began years ago. First they lost out in the cable and satellite converter market. They lost out in the embedded processor market. They lost out in the game system market. The amazing loss though was in the cell phone market. It started up slow with phones becoming gradually smarter and smarter. Intel lost out in that market too. Then came the true smart phones and Intel was totally unprepared. The evolutionary path was obvious and Intel completely missed it.
Another thing… all of the new operating systems are more processor independent. No one, not even Microsoft, is married to Intel anymore. This did not happen overnight either.
I feel sorry for Intel’s employees and stock holders. They have the most to lose from Intel’s years of mismanagement.
It won’t be long until some of Intel’s really nice fab facilities become available for hire or for sale.
The SMART move would be for Microsoft to get into the chip making business. They need to diversify their business. Having their own chip business could help them compete better in the tablet business.
Bob’s article and all of the posts from individuals about it seem to miss the point that Apple was a founding member of ARM Holdings in 1990. Intel, IBM, AMD, Samsung, etc. pay huge sums of money to ARM Holdings to be allowed to manufacture ARM CPUs. Apple is part owner of ARM Holdings. Isn’t this a critical point regarding everything written on this topic on this web site? Why would anyone write anything about Apple and ARM and not base it on that fact? Am I missing something?
I knew that, of course. My old friend Larry Tesler was for many years Apple’s representative on the ARM board. But your point isn’t nearly as important as you think it is. While I’m sure it is nice for Apple to make money from ARM that revenue pales next to what they make from their own products. If there was a better low-power platform they’d choose that one over ARM, but there isn’t right now and so you have it.
Intel can buy some time with its own version of big.LITTLE, namely Atom.Ivy Bridge.
Apple computers and ARM…
Let’s see what everyone thinks about this combination after their notebooks get 24 hours of non-stop run time.
There seems to be an assumption here that Apple will replace Intel in Mac OS with ARM, but it also has the very reasonable option of following Microsoft’s lead here and doing a Mac OS RT in parallel to their Intel range. They’d start with an ARM-based Macbook Air with better battery life. Maybe even with the possibility of switching back and forth between Mac OS and iOS as an option. They can then sit back and see where the market votes and wind back on Intel over time if that’s how the takeup proceeds.
Apple are not averse to building product that eats into their other sales, and an ARM-based Macbook Air may be late to the party (something they don’t normally do) but it sure would make that segment look interesting.
Man, this post and its comments are one of the most thought-provoking Cringelys ever.
In the US, there is an emphasis these days on simple solutions: “Down with big business!” “Down with big government!” “Down with big banks!” “Down with big education!”. These all have merit within their particular scopes, but they’re heavy hammers banging on a more subtle problem, I think.
There seems to be a general attitude in this country that says, “We’ll be back on top again, we just have to wait it out.” Unfortunately, no one has any real idea of what the ‘it’ is that we’re waiting out.
I don’t know either, but I suspect our problems are more systemic than anyone realizes, and that scares me. Negative feedback loops can be hellishly difficult to break out of.
And with everybody yelling and/or overly smug, would anybody listen to the real answers anyway?
Seems like I’m the only one here wondering how well the high performance computing customer will be served. With the mainstream going mobile and servers going ARM, I think that high-end desktops to run Solidworks for MEs and various simulation and synthesis tools for the EEs are going to get much more expensive.
Yes, I too have a smart phone and enjoy its convenience, but there is a limit to what Moore’s law can provide. When “battery life” becomes a concern you look for opportunities to lower power consumption by using fewer active transistors. A DSP can act like a codec, but at much lower efficiency; to watch a 1080i video, a dedicated codec uses 100x less power but is a special-purpose device. With enough transistors provided by Moore’s law, one can put in both codec and DSP but only power what you are using.
By extension of this argument, space for many peripherals will eventually be possible, including the north and south bridge functions one finds in Intel architectures. But how likely is that to happen? Market forces being what they are, if the peripheral isn’t for multimedia or games, it ain’t gonna happen.
Where am I going to get high performance double precision floating point operations on multi-billion data point analysis on my phone? When I arrive at work, where will I be running Solidworks on a thin-net virtual machine on an ARM based server? Will high performance be a thing of the past?
At the foundry sweet spot, cutting-edge design rules have about a 3-to-4-year life span (the time interval where a majority of production is at a given design rule). It won’t be long before a majority of the return on investment at the foundry for CPU fabs comes from mobile, and nothing in bleeding-edge CPU design will be non-mobile.
I suspect that the high performance PC that used to be $10K five years ago, and came down to $2K recently, is likely to go back up to $50K or more.
So much for progress
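The codec-versus-DSP point above is just an energy budget; here is a rough sketch (only the 100x ratio comes from the comment, and the 1.0 W software-decode figure is a made-up placeholder):

```python
# Energy to play a 90-minute 1080i video: general-purpose software decode
# versus a fixed-function hardware codec drawing 100x less power.
software_decode_w = 1.0                        # hypothetical CPU/DSP draw
hardware_codec_w = software_decode_w / 100.0   # per the comment's 100x claim

movie_hours = 1.5
software_wh = software_decode_w * movie_hours  # watt-hours burned in software
hardware_wh = hardware_codec_w * movie_hours   # watt-hours with dedicated silicon
print(f"software: {software_wh} Wh, fixed-function: {hardware_wh} Wh")
```

That two-orders-of-magnitude gap is why mobile SoCs spend transistors on dark, special-purpose blocks that are only powered when used.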
I feel your pain; my job relies on those high-end Intel chips (Xeons optimized for FP performance) running large scientific applications, many home-grown. It’s been a big deal over the last five years when the cost of those high-end scientific workstations (running 64-bit WinXP, alas, instead of MacOS X) dropped below $5k, so that they can be expensed instead of requiring capital dollars; I’m finally at the point where most of my work is no longer compute-bound but instead constrained by how long it takes me to think about the results. But our needs are being eclipsed by the mobile space, and you may be right that we’ve just about seen the last of really cheap, fast floating point performance.
This post is interesting in that it reads as just another example from Clayton Christensen’s The Innovator’s Dilemma. The book explains how difficult it is for many companies to adapt to changing markets and why they are so slow to adapt to those changes. Often enough – these companies seal their fate by two things – (1) focusing on profit and (2) allowing innovative technologies to gain traction in often low-margin characterized business units.
Now that Apple has decided to make their own chips for their hardware, what does it mean for the other vendors? Will Apple start creeping up the supply chain to where they develop the whole product in total? In an effort to maintain the top performance they require of every single component? I suppose you could argue that they have already done so – even getting into the glass (e.g. Apple’s Gorilla Glass case).
How does Intel, or any similar supplier, keep up with the innovation it needs to – in order to survive?
I like Apple! Why? Because they give other businesses the means to make better products and some of these businesses failed to take advantage.
Motorola’s 68000 CPUs ended up in Macs or printers and eventually died for lack of market penetration or development or resentment.
Apple went to IBM and, with Motorola’s input, made PowerPC CPUs which ended up in Macs or printers and eventually died for lack of market penetration or development or resentment.
In the meantime it went to England to expand the RISC architecture with Acorn Computers to make Newton CPUs. If Apple had kept their 45% share OR bought ARM, Google and most other battery computer phones would be stuffed.
Apple had to mark time with poor quality specs for their CPU in Macbooks about 3 years ago. I thought it was history repeating with Intel giving Apple crap.
And Apple has still not learned. Your take on Apple’s data farm being too big some time back got me thinking.
It bought two RISC design and GPU companies. Why? To design their own CPUs. But with Samsung and Taiwan not bowing to Apple’s demands, and maybe theft, Apple should build their own CPU chip factory in their unused data farm space.
II
Steve Jobs and his Apple believe in Darwinian Evolution — change to survive or die!!
America in general and most American Companies in particular don’t, they believe in Heaven, immutable laws and divine right to stay fixed in thought, deed, USA Measurements and business plan. (And deception — another evolutionary adaptation)
When the two beliefs meet divine belief wins in the short term then the perfect storm, perfect battle, perfect length or perfect drought kill the illusion of divine right. ( But deception keeps the weak adaptively strong.)
Intel’s battle happened three or so years ago when Apple kept superseded CPUs in their laptops and not the crap Intel offered as better. And the Mac Pro is only marginally better because Intel has put the brakes on fast development of top-end CPUs, because it got info from Apple on its future, unbeknownst to ignorant Apple (the deceptive insight).
Apple has a difficult learning curve with CPU providers — they all fail to perform to Apple standards. And it still does not know the power of its thoughts or intellectual property!
Apple’s alliance with Acorn Computers was productive, far-sighted, and non-Steve-inspired. The reason is it went to the core of what a CPU was. I do not know if the sale of ARM was compelled by the dire straits of Apple in the late ’90s or Jobs’s dislike of a non-Jobs product.
Apple gave as I believe the holy grail of CPU design – the LightPeak technology – to Intel and Intel returned to Apple a business plan that would not see its incorporation into computers for at least ten years. (And time to patent the idea.)
I’m sure Apple’s business plans are a lot shorter.
I sent to Steve Jobs an email that said he was a dickhead fool, and ignorant of the power of LightPeak technology at the time.
So Apple now has no option but to design, prototype and MANUFACTURE its products or at least the heart of them.
Otherwise it will be cuckolded by its friends as it has been in the past!!
It wasn’t an alliance between Apple and Acorn. Acorn was sold to Olivetti, but Olivetti didn’t know what to do with the ARM design. So ARM was split off from the sale as a separate company. Acorn = Olivetti.
Apple did the deal with ARM.
I was first exposed to Robert Cringely when he was writing about PCs in the very early days – probably late 70s. Always found him insightful, interesting and fun. His PBS work “Triumph of the Nerds: …” is of course fabulous. So, his take on Intel really got my attention – political references ignored. I had assumed that Intel was a Carrier – like the USS GHW Bush (CVN-77) – that was just beginning to “turn into the wind” and launch its full attack. Slow – either in decision or in assembling the chips and ecosystem – Intel is now about to launch all of its fury. Am I wrong? Too little too late? Why?
Top reasons not to count Intel out in mobiles.
1. They must succeed. All oars are now in the water.
2. “On the beach” IA mobiles have just arrived. The timing of future iterations has been accelerated. Besides tablets with Android, the iPad is about to face an Intel hive of stinging competitors. To Foundry or not to Foundry can be postponed at least a year. Content-dominant “fondleslabs” may not be the end-all.
3. The chips with the best features, performance and cost will always win.
4. Intel will continue to drive Moore’s Law beyond transistor shrinks – e.g., stacking, III-V materials & other. If anyone doubts Intel’s commitment to R&D, suggest checking out their Ronler Acres campus with $3B upgrade for completion in late 2013 and, although not yet announced, a recently surmised doubling of this facility by 2015 (from contractor discussions reported in the Oregonian).
5. Intel has a vast depth of resources, staying power and a unique ability to be a platform industry orchestrator. The Xeon Phi is a great example at the maxi-end.
Many may recall AMD reporting their Q2 business results in July 2007. AMD’s results were terrific with serious share gains over Intel, and an AMD executive declared that Intel’s chips were “pathetic.” So, while it may be au courant to pound on the 900 Pound Gorilla, her formidability should not be misjudged.
So – I say we love you, but not so fast Robert. And, remember what Yogi said about it being “over.” Always with great respect, Erin.
The thing is, what the ARM ecosystem is to Intel is entirely different from what AMD is to Intel. Unlike AMD, ARM doesn’t fight battles where Intel is strong (high-performance x86 chips). In the mobile space, ARM has already grown well past the critical point where entering with x86 puts Intel at a serious disadvantage, leaving aside whether Intel even has chips with comparable performance and battery life to do it.
Had AMD not brought 64-bit to the market, your choice today would likely still be Pentiums or Itaniums. That is the reason Intel moved into the modern 64-bit era.
Intel has treated the semiconductor business like a cash cow instead of a business
[…] days after I wrote a column saying Apple will dump Intel and make Macintosh computers with its own ARM-based processors, along comes a Wall Street analyst […]
Whenever there is an ARM vs. Intel debate, the performance/watt/battery-life metric always pops up, which IMO is a red herring. These are the real reasons why Intel will lose:
1. Intel NEEDS to sell high-margin chips such as their upcoming Haswell; their entire R&D budget is wholly dependent on this model, which means their offerings will be EXPENSIVE. In contrast, ARM players thrive on razor-thin chip margins, which isn’t crippling for them because most are not in the business of selling chips but of selling entire end-consumer products.
2. None of the current ARM players will want to get locked into an architecture that is wholly controlled and manufactured by a single company, and a historically monopolistic one, no less. Sticking with ARM provides flexibility: for example, Samsung can source SoCs from itself in-house, or from Qualcomm or Nvidia, and whichever it chooses it still remains in the ARM ecosystem.
There’s nothing special about Intel x86 anymore. It’s also baffling why so many cling to the arguments about backwards compatibility. After all, nobody runs 16-bit software on real hardware anymore (virtual machines now provide that option).
Modern 64-bit operating systems don’t support the 16-bit subsystem anyhow, so why does the x86 design still carry that baggage?
These comments are 6 months late, but here you go:
Intel should buy Imagination Technologies. They make the GPU for Apple, so Intel could get one back at Apple by yanking that. Imagination Tech now owns MIPS, which arguably has a better power-performance ratio than ARM. Intel could then continue to milk high-margin x86 and target MIPS at mobile and low-power servers. With MIPS they could offer a real contrast to ARM by offering a consistent boot environment to vendors, as x86 does, so they would only need to generate one MIPS Linux image to support the architecture, unlike the sad state of affairs that exists in the ARM SoC world.
Intel is friendlier to open source than Imagination, so that is a win for all of us. Intel would benefit from the graphics technology.
It will never happen though, because MIPS was not invented there.