A rumor surfaced yesterday in Japan that Apple would by the end of the year introduce a radical new kind of Macintosh computer. That was it — new Mac, radical — yet dozens of sites ran with this non-information simply because Apple is a hot company and, who knows, it might be correct. In that same spirit, then, here’s my guess about what might be correct: I think Apple’s Macintosh Pro line of computers is dead.
Mac Pros are Apple’s big-box PCs. They haven’t been refreshed since last summer and new models were expected this month alongside the new Minis, but for some reason the new Mac Pros failed to appear. Apple said nothing because, well, Apple never says anything, instead relying on dopes like me to write non-stories like this one. But while the Mac pundits are generally still waiting for new Mac Pros to appear, I don’t think they are coming at all; I think they will be replaced by a whole new approach to high-performance computing from Apple. Maybe this is what the Japanese writers are picking up on.
Mac Pros are Apple’s most powerful computers, though the new i7 Mac Mini servers get pretty darned close, and that’s part of my point in making this prediction. Apple likes a simple product line, and eliminating the Mac Pros, just as Apple last year dropped its Xserve line, would certainly simplify things.
Dropping the Xserve, an Apple move that was wildly unpopular in some IT quarters, was, I suspect, some sort of Steve-and-Larry-bargaining-with-the-Devil deal in which Apple steered even more clearly away from the enterprise in exchange for who knows what from Oracle/Sun. But I think dropping the Mac Pro, if it indeed happens, is a very different move intended to simplify the computer line while boosting the display line.
Mac Pros are dinosaurs in many respects. That big, beautiful aluminum case with its clever air ducting is eight years old and enormous compared to most PCs. It exists primarily to allow users to pack their Macs with extra drives and third-party graphics cards for high-end gaming. But Apple is changing its whole approach to storage, presumably moving as much of it as possible to that big North Carolina data center. Apple hates foreign cards (or indeed cards at all) installed in its machines. And then there are those new 10 gigabit-per-second dual-channel Thunderbolt ports; where do they come in?
I expect Apple to move to a modular architecture where the building blocks for high-performance computers are generally Mac Minis. Start with a new Mini or with a Thunderbolt iMac and expand both storage and processing by adding a stack of up to five more Thunderbolt-connected Minis. A maxed-out system would have six i7 processors with 24 cores, 24 gigabytes of DDR RAM (expandable to 96 GB!) and at least six terabytes of storage. And at $6,000, it would be half the price of an equivalently tricked-out Mac Pro.
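For what it’s worth, the arithmetic on that maxed-out stack checks out. Here’s a quick back-of-the-envelope sketch in Python; the per-Mini figures (quad-core i7, 4 GB of stock RAM expandable to 16 GB, 1 TB of disk, roughly $1,000 each) are my assumptions about a 2011 server-class Mini, not published Apple specs.

```python
# Back-of-the-envelope totals for the hypothetical six-Mini stack.
# Per-unit figures are assumptions (2011 server-class Mini): quad-core
# i7, 4 GB stock RAM (16 GB max), 1 TB storage, ~$1,000 each.
MINIS = 6

print(f"cores:   {MINIS * 4}")                             # 24
print(f"RAM:     {MINIS * 4} GB (up to {MINIS * 16} GB)")  # 24 GB (96 GB)
print(f"storage: {MINIS * 1} TB")                          # 6 TB
print(f"price:   ${MINIS * 1000}")                         # $6000
```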
Yeah, but what about the Graphics Processing Units (GPUs)? What real gamer wants to be limited to the somewhat lame integrated Intel graphics found in the Mac Mini line? That’s where the displays come in.
Apple’s Cinema Displays, while still lovely, have fallen way down the price-performance curve. They are too darned expensive for what you get. But Apple can hardly be a PC company without displays. They need to either (shudder) start to compete on price or more likely find us a new flavor of Kool-Aid, which I think we’ll see in upcoming Apple Thunderbolt displays.
There are only two Light Peak displays on the market right now. I use the term Light Peak, which is what Thunderbolt is called in the non-Apple world, because while one display comes from Apple, the other is from Sony and uses a different connector. I think that Sony display gives us a hint of Apple’s plan, because the Sony screen features an integrated GPU. The new Apple Thunderbolt Display may include a GPU, too, but nobody seems to know.
There are good reasons to put the GPU in the display. All those zillions of calculations, after all, are being performed specifically to drive the display. And putting the GPU inside the screen allows the highest possible bandwidth connection between video memory and display pixels. In some ways putting the GPU in the screen may actually make the screen cheaper to build at such a high performance level. Whether that is true or not, I am sure it is what Apple will tell us.
When Apple announces a 27-inch or 30-inch Retina Display, you can bet it will have an integrated GPU.
POW! Apple will be back in the business of selling $3000 displays and Hollywood, New York, and San Francisco will be back in the business of buying them. Mac Minis will become the Boeing 737 of performance computers. And Apple can at that point probably drive enough connections on its own to create a vibrant market for third-party Thunderbolt accessories.
Or I’m wrong.
Maybe the Japanese will know.
You may be right and I may be crazy, but I see Apple shedding the core “fan-boys” among its professional graphics and entertainment customers.
Look at the hatchet job they did on the Final Cut “Zero” upgrade… The video folks are starting to exit stage right into Adobe’s open arms.
Apple is the number one consumer computing company in the world. They don’t need a niche market to keep them floating anymore.
They may go modular as you predict, but I suspect they really don’t care about a big PC anymore or the users that actually need that big PC.
Gamers? I think the bigger audience for the Pro line is the video content creators. They will be the ones hit hardest by such a change to the products because it requires replacement of all current displays. The Final Cut X debacle does not help either. My guess is that video production companies will move to Avid or Premiere Pro on PC hardware and disown the new Mac gear.
That’s exactly what I did – Premiere Pro does everything I need – no more FCP for me. It also takes too damn long, and is too difficult, getting Macs worked on when things go kaput.
Exactly what I thought. But Robert’s followup with Retina makes sense. Everyone is talking about, *expecting*, Retina displays on the iPad 3, but I’m in the camp of believe-it-when-I-see-it. No way a performance GPU could exist to drive an iPad-sized Retina and not be three inches thick. Unless it’s for fonts and vectors only, and just scales dirty bitmaps, I don’t see it happening. There just isn’t enough memory or processing power; throw in native 3D at that resolution and just forget it. Impossible!
For the desktop, which is even larger, Robert’s hint at GPU-on-the-screen hardware appears to be the only way. Maybe switching to Adobe is something to delay, as their antiquated code base is hardly even aware of such hardware, much less of today’s GPUs. I suspect Retina, or ultra-high film-like resolution, is going to be an Apple-only game, not unlike the iPad being the de facto tablet. I doubt that technology will be coming from anyone else (and certainly not HP). But I still won’t believe it till I see it.
The world has changed and Apple now has all the pieces to get rid of big computers.
1. Intel’s i7 and up are more powerful than the previous workstation-class processors and run relatively much cooler, reducing the need for massive cooling systems, while their low power requirements shrink the power supply and its cooling requirements.
2. Thunderbolt is exactly what Steve Jobs has been looking for since the beginning of the Mac: an external serial connection to a (compact) computer with all of the power of an internal bus, eliminating the need to support the size and power requirements of add-on cards in the system. Current implementations of Thunderbolt (in everything but the MacBook Air) are capable of two bi-directional 10 Gbit/s streams, and the protocol is a serialization of the PCIe bus. So once Intel revs up the interface chips (and gets the price down), this has the power to replace every high-speed connection you need, including the internal PCIe bus. And by pushing the cards out to the peripherals there is no need for the cost, heat, and size of a power supply in the computer big enough to power “any” possible card.
3. The big question is how fast Thunderbolt peripherals come to market, and that seems mostly to depend on Intel’s delivery of the chipsets and their pricing, along with the development of all the drivers necessary to support a wide range of peripheral “cards”. This should be relatively simple if the Thunderbolt chips truly serialize PCIe and as such only require minor tweaking of existing PCIe drivers.
Imagine a Mac MiniPro – it could be everything the Cube wanted to be: no need for optical drives, slots for lots of SSDs, and all in a shiny new form factor slightly larger than the Mini. Apple Mac Mini Clubman, anyone?
And the graphics pros are already switching to iMacs, as the performance is already there for most applications. The Mini “Clubman” would satisfy all but the most demanding applications – certainly Apple could make one more powerful than any existing Mac Pro once Intel’s higher-end low-power parts arrive on their development schedule.
When Intel makes a low-power quad-core processor with hooks for multiprocessor use, you could easily have a 16-core machine in a box not much bigger than a Mac Mini! Or 32 cores and SSD drives in something much smaller than the original Cube.
PS: I still have two Cubes, and last time I checked they were still working.
Interesting idea.
Of course, there could be another route too –
What if they were turning the iPhone, iPad, and iPod Touch into the standard computer – backed up through a wireless connection to the Mac Minis (clustered together if you wanted extreme performance) and connected to a high-end display with a built-in GPU – for that matter, the same wireless connection could talk to the display.
Now that would be something – something that would rival the Motorola Atrix.
Of course, I don’t expect it’ll be wireless to start. They’ll probably put a dock in there that the iPhone/iPad/iPod will plug into (similar to what Motorola did with HDMI+USB for the Atrix), which will ramp up everything else and tie it all together.
Now think about what this means for the computing industry as a whole.
Great take on what I’m usually last to know about… 😉
Having been a Mac user for the last 10 years (a youngster, I know!) I’ve grown to appreciate what I think Apple are “at”.
Yes, I can see the Mac Pro is a dinosaur, doomed to extinction. When I was more, er, “fluid” than I am now, I bought a new 2011 21.5″ iMac. This must be the pinnacle of Mac computing. Don’t tell me about the 27″; at that size I’d need four-foot arms to keep my eyes from going goo-goo as I type! As a desktop computer with plenty of power and the ability, using third-party software, to virtualise Windows in all its forms, it was hard to beat as a domestic computer.
With the introduction of more powerful Mac Minis and a possible clever shifting of GPU over to the LCD panel, Apple have chosen the different way (as usual) to the rest. And significantly it’s an intelligent move (if it happens).
I’d never thought of this – at all. So either Bob is doing his magic as usual or I’ve just missed the point somewhere (very, very likely!)
My present PC is a Mac Mini which I’ve just upgraded to 8GB to “future-proof” it. OK, for games it’s a bit lacking, but for everything else it beats any PC hands-down.
I’ve always coveted those beautiful Apple Cinema displays but thought the stupendously high price a pointless goal I’d never reach. Might as well buy an iMac after all, eh?
The Mac Pro is beloved of sound and movie professionals (as I personally know) but I don’t think it is all about pure power. I actually think it is about connectivity.
Once the new standards become common-place we’ll be able to see more clearly.
Out of interest I recently built a new PC with USB3 ports. There’s nothing much available out there to the general buying public which uses this new and faster standard, and only time will tell if the investment is worthwhile at all.
History has run a full circle. There are 20-year-old examples of external graphics accelerators, e.g. Evans & Sutherland’s Freedom, or Kubota.
From what I can remember, however, the cable link was a bottleneck.
Heck, I had an external video display for my PowerBook 170 that connected through the *SCSI* port. It worked fine.
Nobody hacks up abusively clever hardware like that anymore.
You’re wrong on some of the technical details on this one. Yes, putting the GPU in the display allows the most bandwidth between the GPU and the display… but that isn’t where it is needed. Video cards have no problem pumping the pixels out to the display… that’s small potatoes compared to all the work they do between every single instance of the display refreshing.
Thunderbolt has a bandwidth of 10Gbps (little “b” – bits). Main memory for DDR3-1333 peaks at 10,666MBps (big “B” – bytes), or 85.3 Gbps.
Loading textures, models, and physics data from main memory across SHARED Thunderbolt links would be a serious constraint, as well as being much less reliable.
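To put rough numbers on that, using the figures from this comment:

```python
# One Thunderbolt channel versus peak DDR3-1333 main-memory bandwidth.
thunderbolt_gbit = 10.0                 # Gbit/s per channel
ddr3_1333_gbit = 10_666 * 8 / 1000      # 10,666 MB/s -> ~85.3 Gbit/s

print(f"DDR3-1333 peak: {ddr3_1333_gbit:.1f} Gbit/s")
print(f"main memory is ~{ddr3_1333_gbit / thunderbolt_gbit:.0f}x faster "
      "than one Thunderbolt channel, before any sharing of the link")
```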
Yes and no — with all your problems sorted by Apple’s so-called TV: a TV with a Mac Mini on board.
So Robert X’s idea of joining Mac Minis works, but with a big screen and a size-specific GPU.
The beauty of an Apple TV with a Mac Mini inside is that you don’t even need a Mac Mini — just an iPod nano in the TV, running iOS with a specific GPU and memory.
How cheap is that!
$100 extra and a premium of $500 more cash for that pool!
“Loading textures models and physics data from main memory across SHARED Thunderbolt links would be a serious constraint, as well as being much less reliable.”
Put enough RAM on the display and you can offload and page in the textures with just control channels. 10Gbps ought to be plenty for the physics.
I can’t see why the developer has to know where the textures are, from the OpenGL perspective. Pre-loading them would be a good optimization but not strictly necessary with a shim that would handle it for you. For low-performance apps no optimization would be needed. For high-performance apps, people are willing to do small optimizations.
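A minimal sketch of what such a shim might look like, assuming a display-side RAM pool reached over a simple control channel — every name here (`link`, `bind`, `upload`) is hypothetical, not any real OpenGL or driver API:

```python
from collections import OrderedDict

class DisplaySideTextureCache:
    """Keeps texture bitmaps resident in RAM on the display side; after
    a one-time upload, binding a texture costs only a tiny control
    message over the Thunderbolt link instead of a bulk transfer."""

    def __init__(self, link, capacity_bytes):
        self.link = link                  # hypothetical control channel
        self.capacity = capacity_bytes
        self.resident = OrderedDict()     # texture_id -> size, LRU order

    def bind(self, texture_id, bitmap):
        if texture_id in self.resident:   # hit: control message only
            self.resident.move_to_end(texture_id)
            self.link.send(("bind", texture_id))
            return
        # Evict least-recently-used textures until the new one fits.
        while self.resident and \
                sum(self.resident.values()) + len(bitmap) > self.capacity:
            evicted, _ = self.resident.popitem(last=False)
            self.link.send(("evict", evicted))
        self.link.send(("upload", texture_id, bitmap))  # one-time bulk copy
        self.resident[texture_id] = len(bitmap)
        self.link.send(("bind", texture_id))
```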
Bobert, you have no clue on this one.
Gamers don’t use Mac Pro machines. Serious gamers (ridiculous, I know..) use Windows, period.
Creative pros (videographers, photographers, programmers, etc.) and scientists use Mac Pro machines.
Very, very few people with a Mac Pro ever install a 3rd party card. Any graphics card they install is likely to have been sold by Apple.
GPUs are not only used for rendering screen images. Apple has pushed more than any other company to use the GPU for computation. Apple’s Aperture pro photography app was revolutionary for offloading processing to the GPU — not for rendering the image on screen but for actually generating JPEGs from raw camera data.
Sadly, though, it is true that Apple cares little about creative pros and scientists anymore. All Apple cares about are consumers, because that’s where the vast majority of Apple’s revenue now comes from.
I hope the Mac Pro lives on.
Gamers also use Xbox 360s, etc., connected to large-screen TVs.
TVs that also contain a lot of processing power.
I am a gamer and I use Mac Pro. Can’t wait for the next update – if it comes. If it doesn’t, I will have to live with an iMac. That sucks.
At the grand opening of the Lincoln Park Apple Store last October, I asked where the Mac Pros were, and the staff member and I could not find one on display.
Hmmm… makes sense from the Apple point of view.
No more beautiful Apple boxes plugged into a generic black plastic screen. Proprietary data, and no HDMI or DVI ports on the back of the box, so you are buying their display, at a nice margin.
Don’t worry about the bandwidth. At these margins they can include an extra Mac Mini in the screen. Actually why not just do that? Like the HP machines that are all in ones, just a big screen and keyboard/mouse in the box. For the performance user, add a stack of mini boxes.
The key is the upgrade path. Their strategy is to upgrade by replacement. They are very smart to do this now. If Moore’s law slows down (someday it will, so Steve J. is hedging his bets), or performance is good enough, consumers won’t want to upgrade every 24 months. If you can upgrade your computer in pieces with a new SSD, graphics card, or RAM, why ever replace it?
Do it to get the new style and cool rather than performance and specifications, that is why.
Look at the M$ adverts for “Upgrade my PC” – it is killing them, and not just by not selling new product. Don’t forget the cost of supporting 3- or 4-year-old software.
Buy a box. You want more or better, buy another, different box. Brilliant!
As a design professional, I can’t say I’ll miss the Mac Pro if it goes. I’m happily committed to the Macintosh platform, but I haven’t seen the value proposition in Apple’s towers for years.
An iMac gives me all the power I need, with far less intrusive bulk. As an independent designer I’ve been happy to save the extra bucks by going with the “consumer” Mac, and in the past year or so I’ve gotten the impression that even large studios and design departments are going the same way.
I’ll just be happy if Apple responds to this by dropping the tower, rather than crippling the iMac and Mac Mini in some way that would drive professionals back to buying the more-expensive “professional” model. 🙂
I suppose there are figures in the Apple accounts which really confirm the Mac Pro is not selling in bulk any more to pros, but that the iMac is. Apple have not ignored the fact in any way; with the latest iMacs using the i5, I guess it won’t be very long before they tame the temperatures and go to the i7.
No reason not to engineer-up to quad and beyond.
Wrong. GPUs are increasingly used for more than driving pretty pixels. They are being used for general-purpose computing in areas where they can excel, like video encoding. Apple bet big on this by helping develop OpenCL (although the new Intel-graphics Macs are a step backwards in this area). You want these GPGPUs close to main RAM and the main CPU.
The Mac Pros are almost exclusively bought by video, photography, and science labs, and after additional drives and maxed-out RAM are installed these machines are not replaced for an average of 4-5 years. A lot of photographers don’t mess with the OS or upgrade camera or printer software unless a new machine dictates so. Apple knows this and sticks to a cycle of processor advances that make the new Pros really a new level of computing. Of course if your Mac Pro is from 2006 you might be getting antsy, but I don’t believe that as long as Mac OS X and Apple’s pro apps are being tended to there will be any end to the Pro systems. As Steve Jobs mentioned, iPads are the cars and iMacs are the trucks, and if pushed he might admit that Mac Pros are the mega-trucks. Need to move a mountain, anyone?
Embedding a GPU inside an LED Cinema Display? Interesting idea, but have you ever touched the rear panel of one after it’s been powered on for 15 minutes? Yikes. I don’t know how you’d keep the panel, backlight *and* GPU cool.
Somewhat unrelated: some of us just want machines we can service. That’s why I’d buy a Mac Pro. CPUs, internal SATA drives, RAM, cards, power supply… all relatively easy if not cheap. Right now, the MacBook Pros are way easier to service than any recent iMac. What is wrong with this picture?
>What is wrong with this picture?
Well, to start with, it allows customers to put them on six-year replacement cycles.
But it also perpetuates the mindset that these are “computers” in an old-school sense of the word instead of commodity computing devices in the newer sense of the word. How often do you service an ipod or iphone? Never: you extract your data (which in their model should be online these days) and get a newer one.
[…] Macs” and “radical” changes, theorizes on Apple killing the Mac Pro and instead doing this:I expect Apple to move to a modular architecture where the building blocks for high performance […]
I’ve been arguing the ‘stackable’ Mac Mini model… a Grand Central, higher-speed network (now Thunderbolt… I was arguing FireWire 800 at the time) where you have a Time Capsule with a compute server, a media (iTMS) server, an ‘app server’ (in the software sense), a personal storage server, and, if you need it, a cable for your monitor (or why not just screen share with an iMac, Mac laptop, or iOS device… heck, even your ATV2).
Makes sense to put ‘a’ GPU into the monitor. You could decouple processing and have multiple GPUs each doing the memory linking, physics, and the rendering in different modules (ala Grand Central).
The beauty of this beast is that NC data center. Why not a great big caching file server with NC on the back end, synced with your current set of files and all your Time Machine backups? $25 a year per TB. You buy an app, install it, and it’s installed on all the systems that are part of that ‘system’.
But back to the stack. So you start with an ‘AppleServer Base’ (a Time Capsule), and as you need more power (the apps in your office all being distributable via Grand Central), you add to the stack with Mac Minis. If you hook up via Thunderbolt, your iMac’s driving the screen, but the ‘stack’ is the system. MacBook Airs connect wirelessly using the same technology. Content from the iTMS is cached locally (you rent a movie… you start watching on your ATV, you move to the bedroom and your iPad, and it picks up where you left off).
I think it’s definitely the move Apple has to make, to make computing ubiquitous for the consumer. And at $200-$500 a chunk, those are the consumer price points Apple loves.
“It exists primarily to allow users to pack their Macs with extra drives and third-party graphics cards for high-end gaming.”
Yep, all those “pro” high-end gamers …
Ah, beaten to the punch.
But you know, a lot of those gamers like to do heavy video and graphics editing in their free time, as a hobby, at their desks, between nine and five…
The problem with the Pro machines is that they are too damn expensive. They were a lot cheaper when the first plastic G3 towers came out followed by the Grey G4 towers. I just cleared a few out of my old office. I still liked their ease of access and upgradability.
Since the Pros went to their aluminum enclosures they’ve become a lot more expensive.
People used to be clamouring for a cheaper Pro box that could be upgraded. The fabled X-Mac.
I have a Power Mac G5 circa 2003.
I’d love to replace it with an X-Mac!
I paid $2300 for my Quadra 650 w/12 MB RAM, back in ’92 (’93?). And with Quadra 8500’s taking the top end then, pricing is comparable.
The way I read the reporting of this rumor (from a not-well-respected source) was that Apple would introduce “an entirely new product line,” not that an existing line would be replaced. I took this to mean something more like “an Apple branded television with an integrated AppleTV” or something completely different.
As spiffy as Thunderbolt is, it’s a bottleneck. Even Photoshop is using the GPU for some functions when it’s available, so putting a bottleneck between the CPU/system memory and the GPU (now, to a limited degree a GPGPU) isn’t a good idea. If GPGPUs get used for parallel crunching of wads of data, then that data could be offloaded to the remote card and crunched (maybe video encoding would be an example?), but for realtime 3D graphics, as someone already pointed out, the volume of texture and other data (like physics) makes you want the GPU on the main bus, not out on a choked off tangent.
I think Bob is addressing the lingering doubts as to whether Apple is going to remain serious about pro users. But I don’t think this rumor, or Bob’s speculation about a stack of gizmos with a GPU inside a display is likely.
Nonetheless, Bob, keep throwing out these crazy ideas. They’re spot on often enough to make the exercise very useful.
Robert X (what does the X mean?): Light Peak was the name of the prototype displayed by Intel — it used PHOTONS to convey the message.
The Thunderbolt used in Macs now is a downgrade of the initial concept and uses ELECTRONS to send the message.
Therein is Apple’s problem in all its gory detail.
Apple gave Intel an idea (using PCI Express protocols over optical cables) AND Intel sold it back to Apple as a wire connection.
Apple even invented and patented a combined power/Light Peak connector. This connector would reduce the total number of connectors for PCs to three —
1 Power/LightPeak
2 SD card slot — data storage/input
3 audio in/out
Apple even patented an external junction box that would convert ALL the usual connectors (USB, FireWire, etc.) to Light Peak. To keep the pundits happy!
Apple’s vision was stymied by INTEL and its DOWNGRADING of Light Peak to the electron-based Thunderbolt.
I commented at the time that this downgrading was Intel trying to keep the Light Peak technology for itself as a VERY FAST CPU-to-CPU interconnect.
AND said that Apple gave away the greatest invention for computers ever!
Super-super computers with multi-core CPUs talking to 100s of similar CPUs at 3 THz — 1,000 times faster than anything now, AND projected to be 100,000 times faster.
Every climate modeler, every nuclear bomb designer, every university would want one of these Apple computers at $5 million — cheap by Cray standards.
NEVER SELL OR GIVE AWAY YOUR IDEAS APPLE!
MacRumors.com paraphrased the start of Q&A during Apple’s July 19 Q3 conference call thusly:
“Q: 12% downtick in revenue guidance is more than usual. Why?”
“A: Let me start with the units. For education buying season, September is weighted toward higher education, and we expect increases there. We also expect increases in iPhone, etc. There is also a future product transition that we are not going to talk about today. Those factors are already in our guidance. Confident in our pipeline. Tim Cook talking about some cannibalization of Mac by iPad. But also cannibalizing Windows. Very happy with 14% growth in Mac.”
http://tinyurl.com/3h9t5m4
I wonder if the Japanese non-information rumor isn’t just a dolled-up version of the “future product transition” mentioned during the conference call.
One product transition that’s an open secret is the next iPhone. But note how the train of thought in the answer jumps from “increases in iPhone” to “also a future product transition.”
Meanwhile, everything in the answer seems to project unit sales increasing, which doesn’t answer the questioner’s query that asked why revenue guidance has decreased. (Specifically revenue, not margin, although both speak to pricing.)
Although Apple might very well be putting a fork in the Mac Pro as we speak, given its narrow audience, it couldn’t account for such a drop in revenue guidance, could it? What else could it be?
Maybe the non-information rumor is that something else. Or by “product transition,” could they really mean some sort of “service” transition, something iCloud/iTunes related? Or does the revenue drop simply reflect the demise of their pricey shrink-wrap packaged software, though why wouldn’t Apple just say so?
“Although Apple might very well be putting a fork in the Mac Pro as we speak, given its narrow audience, it couldn’t account for such a drop in revenue guidance, could it? What else could it be?”
Phasing out the Mac. It’s been a matter of ‘when’, not ‘if’, since 2001 (iPod release). Maybe they’ll announce it on the 10-year anniversary.
They’ll have something that can be used at a desk, but Apple is getting 30% of everything and full control.
The part about this that makes sense is a smaller Mac Pro enclosure with the idea being to move storage out of the Mac Pro box. But the rest makes no sense. 10 Gbps is wicked fast for storage, acceptable to connect a display, but completely inadequate for CPU-to-CPU or CPU-to-GPU communication.
Also, the reason that the Mac Pro hasn’t been refreshed yet is that Intel hasn’t released the needed high-end version of Sandy Bridge yet. That happens later this year, at which time we can expect to see a new Mac Pro (perhaps in a redesigned, smaller box).
I wonder if the Japanese rumor just refers to Air-ized MacBook Pros….
At this moment in time, Mac Pros are still used a *lot* in professional audio. This is a market segment where it is fairly common to use expansion cards for DSP power, as well as using the expansion bays for the tons of storage needed both for managing projects and for HUGE sample libraries.
If the Mac Pro goes to the trash heap, there are going to be a lot of upset people in my industry. Apple will lose some serious credibility among audio pros.
It is happening now.
“Apple’s Cinema Displays, while still lovely, have fallen way down the price-performance curve. They are too darned expensive for what you get.”
You realize, of course, that Dell’s equivalent 27″ 2560 x 1440 LED IPS Ultrasharp professional display is $1,099.00? And that’s WITHOUT the new Thunderbolt “docking station” with FW, USB, EN, etc..
But if you want a $350 27″ 1920×1080 non-IPS screen… hey, feel free. There’s lots of junk out there…
You may be on to something; the inclusion of Xgrid in 10.4 seemed odd for a consumer-focused company. But with the addition of Thunderbolt, a Mac grid may be good enough to deliver the data rates required for 2K and 4K editing and real-time playback without dropped frames.
As I see it, Apple seems to introduce technologies as features in a particular model of a product line. The high-end model might feature a new technology as part of a test market. They can test the feature, monitor its performance, and look for any unforeseen problems. If it performs as hoped, it may find its way into the next product line.
I have, I believe, the first dual-processor machine, a G4 500 DP; it included a DVD-RAM drive and ADC (Apple Display Connector). Being the first DP machine, if I’m not mistaken, it allowed them to work out some of the bugs in programming for multiple processors. The DVD-RAM drive and ADC were dropped after that.
They are very calculated about product releases and often foretell future products. Looking at a number of products in hindsight would show an evolution, I believe.
That said, this is my prediction for where things are headed.
Grand Central (testing distributed computing), combined with the MacBook Air and Apple TV (testing storage-free devices), would lead to a modular (stackable?) design with external storage (XSAN/Time Capsule?). Your iPads/iPhones would be able to interface with this local cluster (Remote Desktop), if only for display and not development. The next generation would offload that external storage, and possibly processing, to their NC data center.
The App Store is testing their software licensing and will ultimately lead to you buying CPU cycles (iCycles?) and being charged based on the CPU cycles you use and the apps you run.
They also have a habit of ‘encouraging’ users to adopt certain technologies by dropping technologies and software, thus coaxing (forcing?) users to adopt their latest “favorite child”. Sometimes this is good (e.g., dropping the floppy) and other times quite frustrating when you are tied to a particular technology. At some point it might become more trouble and/or more expensive to store your data locally.
Just trying to guess where all this is heading, like it or not.
What’s the latency of Light Peak? While the throughput is amazing, that doesn’t mean anything if your mouse-to-display latency is 20ms off.
It’s PCIe, basically, at 10Gbps. Latency isn’t a problem.
Dammit Steve!
Give me back my compact, modular Mac!
(what some call the fabled “vMac”)
To me, the IIci is still the perfect embodiment of a Mac desktop. Three expansion slots, accessible storage, powerful, compact, and easily disassembled without tools (well, it did require a Phillips screwdriver for the two screws if one wanted to completely take it apart; and one of those was to lock down the cover).
But in reality, I think the only truth behind this rumor is that the Pro will get a new, smaller case design.
You’re wrong about the displays- the bandwidth between the GPU and screen isn’t a big deal, but between the CPU and main memory and the GPU is the big deal. So putting the GPU on the other side of the external interface is a losing proposition. Better to have it with a fast connection to RAM and CPU and send the results out to the screen.
A Mac mini cluster might be neat, though.
I think this is all generally on track with the coming trends. Once you can hook two computers together with Thunderbolt, they are speaking PCI-Express to each other; they are essentially one computer by our past definition.
But I think what you will see is the user sitting at a MacBook or iMac, but the MacBook or iMac will seem to be 20 times faster than it should be, because they are accessing additional computing resources either over the network or from additional CPU’s or GPU’s that are attached to their MacBook or iMac via Thunderbolt.
But if you are going to plug a box into your computer to add CPU cores, I don’t think the Mac Mini has enough CPUs per box. You don’t want to use up all your Thunderbolt devices or have five power supplies and so on. There might be room for a Mac Pro that is a big brother to the Mac Mini: twice the physical size, but 4 or 8 times the CPU/GPU power. It could be sold both as a desktop system and as a supercharger for your MacBook for when you are at a desk.
> You’re wrong about the displays
No, he is right. It’s hard to get your mind around, though.
Right now, you take a GPU that is built on a PCI-Express card and plug it into the computer via PCI slot, and hook the display onto a DisplayPort connector on the side of the GPU card. That causes the 3 devices to be connected at very high speed via PCI-Express.
In a Thunderbolt setup, all the same parts are there, and they still talk PCI-Express to each other, and they even still use DisplayPort connectors, but they are set up in what appears to be the wrong way. The GPU card is built into a box with Thunderbolt ports. The computer and display have Thunderbolt ports. You connect one cable between computer and GPU, and another cable between GPU and display. Just as above, that causes all 3 devices to be connected at very high speed via PCI-Express.
So a GPU that is connected via Thunderbolt is no different than one that is connected via PCI slot. It can be in a separate box or it can be built into the display or it can be built into the computer. In all 3 places it is just talking PCI-Express.
Except, you know, that Thunderbolt doesn’t have the throughput of a PCIe v3.0 x16 slot, let alone two or three of them for multi-GPU setups: 20 Gbit/s vs 128 Gbit/s for max theoretical throughput. Even PCIe v1.0, with its 32 Gbit/s for 16 lanes, beats it. Even if it goes to 100 Gbit/s in 10 years it still won’t match current PCIe v3.0 throughput on an x16 slot.
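Those numbers hold up. A quick check, using per-lane signaling rates and encoding overheads:

```python
# Theoretical data throughput of an x16 slot per PCIe generation,
# versus a dual-channel Thunderbolt port.
lane_gbit = {"PCIe 1.0": 2.5 * 8 / 10,      # 8b/10b encoding -> 2.0
             "PCIe 2.0": 5.0 * 8 / 10,      # -> 4.0
             "PCIe 3.0": 8.0 * 128 / 130}   # 128b/130b -> ~7.9

for gen, g in lane_gbit.items():
    print(f"{gen} x16: {g * 16:6.1f} Gbit/s")
print(f"Thunderbolt: {2 * 10.0:6.1f} Gbit/s")   # two 10 Gbit/s channels
# 32.0, 64.0, ~126.0 Gbit/s -- every generation's x16 slot beats 20 Gbit/s
```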
Just curious — whatever happened to the Display PostScript idea that Steve did at NeXT? Seems like offloading graphics processing into the display can work, if you can raise the level of abstraction to keep the bandwidth demands between the GPU/display and CPU to a minimum…
Display Postscript seemed to be a good compromise at the time … but is sooo late 80s. 🙂
OS X uses Display PDF – a successor to Display PostScript. PDF didn’t carry license fees, and CPUs were fast enough to handle the compression.
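The appeal of that approach is easy to see in rough numbers. Here’s an illustrative sketch (the command set and its encoding are invented for the example) comparing a frame described as drawing commands against the same frame shipped as raw pixels:

```python
import json

# A window's worth of UI as high-level drawing commands...
commands = [("fill_rect", 0, 0, 2560, 1440, "#ffffff"),
            ("draw_text", 100, 120, "Helvetica", 13, "Hello, Thunderbolt"),
            ("stroke_path", [(0, 200), (2560, 200)], "#cccccc")]
command_bytes = len(json.dumps(commands).encode())

# ...versus the same frame as raw 24-bit pixels.
pixel_bytes = 2560 * 1440 * 3

print(f"commands: {command_bytes} bytes, pixels: {pixel_bytes:,} bytes")
print(f"roughly {pixel_bytes // command_bytes:,}x less data on the wire")
```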
Product transition: hmmm… Maybe this is an entirely different device: one that fuses iOS with OS X, which incorporates a touch or gesture responsive display. Several patents seem to revolve around this idea. Nothing to do with performance, but definitely in the category of “product transition.”
I think using gamers as the “high end” users of Mac Pros isn’t very accurate. High-end users of Mac Pros would be more graphic design and particularly pro audio. I’m sorry, you aren’t going to run Pro Tools over the net, recording live audio to some cloud drive. You need fast local hardware.
Apple has slowly been abandoning content creation in favor of content consumption, because that’s where the money is. The iMac (or Mac Mini) doesn’t have “high-end” graphics for the applications recently returned/brought to the Mac, especially products like Smoke or AutoCAD. The graphics chips in those computers don’t have what it takes to compete with a Mac Pro — let alone a cost-equivalent PC — when running applications like Maya, LightWave, or Cinema 4D (to say nothing of After Effects), especially as those applications lean more on the GPU for lower-cost, higher-performance processing.
What is truly sad (to me) about this is that Apple used to be about creation (and they pretend they still are, with their creation tools for the masses) but quality content creation isn’t mass produced. My home video edited with iMovie will never stack up to “Avatar”. High end production will always require quality tools, and if Apple kills the Mac Pro there is a big gap in the toolset for high-end content creation from Apple, which will only be filled by other hardware, software and operating systems – not from Apple
Apple is a very different company from the one we “faithful” kept alive back in the 90s… and the fact is they don’t need us anymore. We should be happy about this. Isn’t it what we always wanted?
I don’t know if Apple is abandoning the Pro market but it seems to me two things are pretty clear.
They see themselves as a consumer products ecosystem and the old rules of the computer industry don’t apply.
They are very intent on closing off access to the inner workings. They don’t want an environment where you can go in and tinker with terminal commands, modify plug-ins or add third party cards. They want something that “just works” and you never need to open the hood… just like the modern automobile. If the service light comes on go the nearest Apple store. Some people really like that idea, others don’t.
I’m not passing judgement on which model is better, just saying it’s pretty obvious to me which way Apple is going.
We weren’t keeping it alive. It was going to die.
It only survived because they introduced the iMac and started to thrive with the introduction of the iPod.
This distributed hardware architecture you describe gives me some understanding about why Linus Torvalds said that virtualization was evil. How would you virtualize such a distributed hardware system? Well you wouldn’t – or couldn’t easily. The idea of a virtualized system is to put multiple operating systems on a conventional platform – it limits the kind of operating system you can virtualize. So if you’re limited to a virtualizable operating system you probably can’t virtualize such a distributed system. Perhaps the distributed system could act as a host.
At the risk of showing my ignorance, I must say I don’t understand your point. I do think I understand Linus’ statement that virtualization is bad. This goes back to the early days of Java, where its poor performance compared to ActiveX was evident: too many processing cycles wasted on code translation. But building a computer with faster serial connections and fewer parallel ones just seems like a toss-up.
GPU in the display is nutty. The 27″ Apple Thunderbolt Display has 2560 x 1440 = 3,686,400 pixels x 24 bits/pixel = 88,473,600 bits x 60 Hz = 5,308,416,000 bits/sec, or under 1 GB/sec — which a 10 Gbps Thunderbolt link can carry, but which is a small fraction of what a GPU needs. The business end of the GPU is limited by memory bandwidth, on the order of 100 GB/sec — orders of magnitude higher.
I don't know what Apple will do about the Pro. I have a 2003 dual 2GHz PowerPC G5 that is still useful but needs to be replaced. It lasted longer than any computer ever should. I bet I will be happy with whatever fills the Pro niche.
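For reference, here is that refresh arithmetic worked out (same figures as the comment above):

```python
pixels = 2560 * 1440                 # 3,686,400 pixels on the 27" panel
bits_per_sec = pixels * 24 * 60      # 24-bit color refreshed at 60 Hz

print(f"{bits_per_sec / 1e9:.2f} Gbit/s")    # ~5.31 Gbit/s
print(f"{bits_per_sec / 8 / 1e9:.2f} GB/s")  # ~0.66 GB/s, under 1 GB/s
```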
Is the Mac Pro dead?
Given today’s announcement one has to wonder if Steve is far behind.
It would be a great loss not only for Apple but for the rest of us in the tech world.
Whatever happens, we won’t have Steve to kick around anymore.
So long Steve and thanks for all the iThings.
Apple is either the #1 or #2 example of why “skunk works” operations should be promoted more; they compete with Lockheed in this regard.
Best of luck to Steve Jobs. The only reason he’d be unable to perform is that he’s only got maybe the next iPhone product introduction left in him.
Freedom to configure Apple products in your life any way you want, to work and play from anywhere with your data – I believe that’s the overall plan at Apple. If embedding a GPU inside a Cinema Display, paired with just an iPhone or iPad, is good enough as a home computer, that’s what will happen. Bandwidth and speed permitting, processing in the cloud will take over much of the need for local super processing power.
Given the news today, it seems unlikely the Mac Pro will stop immediately. More likely a new or dramatically changed line of products will debut later this year, so the excitement and jump in revenue will help carry the company over a transition period, and then the Pro line will be replaced by something new next year. Seems like a sensible product strategy.
Taking advantage of the turmoil at other tech companies, this move will solidify Tim Cook’s position at Apple and establish long-term stability. There is never going to be a good time for Jobs to leave Apple, so today is as good as any. Brilliant.
Now Jobs can kick back and enjoy more afternoon tea with Schmidt.
Without Steve many things at Apple might go “dead”.
Together with the whole thing that appears to be developing within this area, your points of view are actually very exciting. Even so, I appologize, but I can not subscribe to your entire strategy, all be it stimulating none the less. It would seem to me that your comments are generally not entirely validated and in fact you are generally yourself not fully convinced of your argument. In any event I did enjoy examining it.
Boilerplate spam?
Not sure what some of you are talking about, but I would be devastated not being able to upgrade my Mac Pro every 4 years or so.
I am a photographer and work routinely with lots of 60MP files. With some layers these quickly end up over 2GB each. On occasion I need to process several hundred of these. My 2009 Mac Pro takes several hours for this (so I can eat).
The notion that an iMac could replace my RAID 5 and SSD is possible but not likely or all that handy. However, my main gripe would be the display. The iMac displays are totally useless for professional use. Most professionals (who are serious about screens) use either Eizos or the NEC Reference screens.
It is amazing that the only people I hear talking about an iMac (or the Mini) replacing the Mac Pro are people who are not using Mac Pros for professional purposes or out of a professional need.
Do you really believe I would use a Mac Pro if I could use an iMac or Mac Mini?? Get real! The Mac Pro is very useful for some of us.
You know what Steve’s retirement means? The Mac Pro will live on. Tim Cook would not kill the Mac Pro — too bold a move so soon after Jobs steps down.
Aren’t the basic chipsets that would likely be destined for a Mac Pro-type application still some months away from Intel? Even with a modular computer, the absence of chips might put some people off buying. Maybe there’s a new Mac Pro in similar packaging coming when the chips are available. Or maybe a new package. Or maybe it will be dropped altogether. Who outside of Cupertino even knows?
Aside from that, the idea of using Xgrid technology, still built into Lion I believe, would allow smaller computers to be used in arrays much like drives are used now in RAID systems. Perhaps that way a MacBook, iMac, or even an iPad could be the control front end for the overall system, with the individual pieces connected through Thunderbolt. With that approach the entire ecosystem could be scalable to a degree. In a sense, it could work a bit like those SETI analysis programs that use processing cycles distributed on computers all over, except here it could be a higher-speed process with much more local distribution. Sounds like something from a Batman movie…
It would be nice if those chained Mac Minis supported multiple logins too. That would be impressive. A small five-Mini “editing lab” cluster that could support multiple simultaneous logins for DV editing would probably be a niche idea, but it would be “building block” style computing. Buy one Mini; not enough power, buy a second to nearly double performance.
Interesting view. I think Apple is one of those companies that has a solid lineup at the top and can adapt to the changing times much better than other tech companies.
The big shift is not in hardware or the shape of the aluminum box used to house it. No, the ground is shaking for high-end software makers that survive by selling “content creation” products to professionals at prices ranging from $1K to $30K. These are the real dinosaurs, and their world will be turned upside down as their customers transition away from fixed “workstations”. It’s no accident that the highest allowable price in the App Store is $999.
Forget iCloud. This looks to me like Apple truly intends to demote/decimate the “PC” by fully embracing virtualization through Thunderbold. The average consumer buys a new TV with a Thunderbold connection and video chip on board. Could be embedded iOS or XoIP? Need some computational horsepower or local content? Keep stacking Mac Minis. Could this explain the enormous facility Apple is building?
Thunderbolt.
You, sir – replying to the original article, I haven’t read the comments – are someone who obviously does NOT work in a creative field – either TV/Film, or ESPECIALLY music.
“…exists primarily to allow users to pack their Macs with extra drives and third-party graphics cards for high-end gaming…”
I am an audio engineer and producer – professionally, working almost (but not completely) in digital audio, and this statement just struck me as SO WRONG that I wanted to do something I never do – leave a comment on a blog.
You see, there’s this company out there, let’s call them “Avidigi”, who dominates the digital audio market at the top end. They have this funny tendency of making all their pro-quality interfaces based upon PCI-or-related standards. There is a reason for this – not only was it [until recently] the only way to get high enough bandwidth for true pro audio (meaning 32+ channels at 24/192, or god help us DSD), it is a mature, stable technology that has been well explored. Thunderbolt is actually the first external interface standard that could even come close to what PCI can do – and it is too new, too unexplored, to be usable in mission-critical applications (such as the primary recording environment in a pro studio).
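For scale, here is the raw channel arithmetic behind that bandwidth claim (a rough sketch: it counts only uncompressed PCM sample data, not the latency, determinism, or DSP traffic that also push pros toward PCI):

```python
# Raw PCM bandwidth for multichannel audio at 24-bit/192 kHz.
def audio_mbit(channels, bit_depth=24, sample_rate_hz=192_000):
    return channels * bit_depth * sample_rate_hz / 1e6

for ch in (32, 64, 128):
    print(f"{ch:3d} channels: {audio_mbit(ch):6.1f} Mbit/s")
# 32 ch ~147 Mbit/s, 128 ch ~590 Mbit/s -- well within Thunderbolt's
# 10 Gbit/s, which is why maturity and deterministic latency, not raw
# bandwidth, are the real hurdles this comment describes.
```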
Macs dominate – well, not so much dominate as monopolize – the recording world (both in audio and video – note I’m not even getting into AMC’s hardware, as I really don’t know much about it), and Apple knows this. Even with their trend at moving towards a more “prosumer” level of technology (see FCP-X), they can’t just abandon support for the most used hardware/software platform in [my] business – they have too much invested in that sector to do so. If they dropped all PCI-supporting Macs, they would piss off way too many of their corporate partners – notably companies like Avid, Apogee, UAD, Focusrite, and many others who have partnership deals with them.
Note that this is only one issue – you also managed to imply that the new “server-grade” Mac Minis are in some way comparable to a Mac Pro in performance – and this is true enough for simple operations like being a file or web server. You seem to be under a (common, to those who don’t work professionally in audio or video) misconception that games are the only reason to have a very powerful computer. But there are other NON-GAME processor-intensive tasks out there which make having a powerful workstation necessary – notably so-called native audio processing and offline video editing. For instance, that i7 Mac Mini you imply to be “powerful enough” could run less than 30% of the number of audio plugins on a complicated mix that even a lower-end Mac Pro could. I’m leaving DAE/TDM out of this – in my experience, it is a dying format, going the way of old Mix systems, just more slowly – since that complicates things further; suffice it to say, all your arguments as to why the Mac Pro is dead are based upon fallacies in your fundamental understanding of WHO THE MAC PRO IS FOR.
I’m still waiting for my Thunderbolt-equipped Mac Pro to come out, to FINALLY update my audio workstation rig – I need PCIe on there, so that my PT-HD cards will work, so that my Apogee Symphony cards will work, and so that, if I chose to use the UAD-2 plug-ins, I could. I’d also like to have enough processor performance to run all my ridiculous convolution-based plugins without resorting to freezing or bouncing tracks…
“You seem to be under a (common, to those who don’t work professionally in audio or video) misconception that games are the only reason to have a very powerful computer”
Well, not the only reason, obviously, but by far the most important one. The sort of stuff you talk about (pro-level audio and video editing, pro-level 3D modelling, etcetera) are tiny niches compared to gaming.
That said, not even gaming is a reason to keep big fat desktop boxes for the future, since gaming is rapidly moving to other platforms. Finally, those gamers that do stick with a desktop computer for their hobby are not going to buy a mac anyway, so this whole argument is moot.
Regardless, the point is that Apple don’t need to keep specialized groups of fanboys happy any more. The niche markets of “creative” professionals (read graphic designers) kept Apple afloat when everything else they did sucked. Now everything else they do is great, and they don’t need “creative” life support.
Maybe there will be another round or two of Mac Pro, but I think Bob is correct in the assumption that the traditional PC design is not in Steve Jobs’s vision of the future. Come to think of it, it isn’t in anyone else’s either. It’s not a question of IF desktop computers of the traditional “PC” design will become extinct, it’s just a matter of WHEN. It follows, then, that the Mac Pro is low on the priority list.
“It’s not a question of IF desktop computers of the traditional “PC” design will become extinct, it’s just a matter of WHEN.”
Very true across the board for Macs, Windows, Linux, Chrome OS, etc. The big box PC is a dinosaur and is rapidly disappearing for most consumers- frankly an iPad has all the computing power that 75% of computer users need, a laptop has all the computing power 90% of computer users need, and only about 10% need more. Some solution will remain available for them, of course- those 10% (hardcore gamers, scientific users, creative industry users) are vocal about advocating for their needs and while all are niche markets there’s enough of them that someone will cater to their needs.
We’ve been down the “you can’t be obsoleting my expansion architecture” road before. It was called NuBus.
The transition will likely incorporate PCI/PCIe expansion chassis with Thunderbolt I/O. It’s such an obvious concept that I guessed there would be a thread about the topic or a vaporware product and a quick search reveals that to be the case.
http://forums.macrumors.com/showthread.php?t=1203179
By the way, I have a Radius VideoVision/Telecast card in pristine condition if anyone is interested…
[…] what if the replacement is a new, modular Mac system. A stackable system, using the Mac Mini form factor, and sewn together with Thunderbolt — or […]
Would it be possible that Apple could use A5, A6 etc. chips in multiples to run a new Mac? If you had many of these chips running in parallel it would seem to be a very fast machine.
It would also take Intel out of the picture unless they fabbed the chip.
Apple designed the processor for the iPhone and the iPad; why not a next-gen Mac?
Surely the main reason there has been no new Mac Pro yet is that Intel’s new dual-socket capable Sandy Bridge Xeon chips are not available yet, and aren’t due till Q4 2011? http://en.wikipedia.org/wiki/Sandy_Bridge#Server_processors
To date the Mac Pro has always used workstation-class Xeon chips, which is the only way to build 8 or 12 core systems.
I work in a group that does various forms of numeric modeling in hydraulics. The reality is that the people who do this sort of work, or any sort of number crunching, are a vanishingly small demographic. What drives high performance computing on PC’s is gaming. So when we go looking for power, we start by looking at what gamers use, and go from there.
Serious gamers will move to wherever sufficient numeric power is, and we will follow. You may view it as ironic that gaming is determining the path being followed by people doing number crunching. I am just thankful there is a large demographic that moves the hardware forward.
You got it right (which many in the “serious” computing camp don’t). Gaming has been driving PC hardware for a long while.
It’s coming to an end, though, since gamers have been moving (back) to consoles for a good while. So the PC is on the way to becoming a dinosaur in gaming as well. There are several reasons for this; one is that the most “number-crunching” bang for the buck one can get right now is a PS3 (since Sony sell them at a loss). More importantly, there are social factors that drive the abandonment of the PC as a gaming platform, and of course the game companies prefer hardware that they can control.
Neither of which spells a bright (or any) future for the Mac Pro.
[…] a classic tower? Sebastian Anthony, a friend and writer at ExtremeTech, goes further: he backs Robert Cringely’s idea and proposes a modular Mac, something moldable to our […]
I’ve been saying for years… that scalable/networked computing was the way forward (well, since Atari was poised to dominate the computing world with the Atari Transputer Workstation in 1989). Want more power? Just keep adding CPUs! We do it with After Effects and 3D render farms, so why not for all CPU/GPU tasks?
I’m all for this idea. However, I sincerely hope that Apple releases a kind of rack-mountable Mac Pro “control hub” that allows connection to far higher-bandwidth peripherals than Thunderbolt, e.g. 16-lane 128Gbit PCIe 3.0 GPU arrays, HD and 4K video processing cards, high-performance RAID arrays, etc. — that is, until Thunderbolt/Light Peak is widely available at 100Gbit.
Couple that with a set of Mac Minis (or, better still, cheaper cut-down MacCPU blocks that click together like Lego), all connected via one or preferably more 10Gbit Thunderbolt channels for scalable processing.
Bob – it’s worth noting that Sony don’t actually have the GPU in the monitor but rather in an external case [the Sony Vaio Z Power Media Dock] that can connect to 3 external monitors. This is a great idea as long as we can add multiple GPUs for SLI gaming or GPU processing of data!
I’m not sure where the idea came from, but I personally hate the idea of having GPUs in the monitors – the ability pros have to hide the extra heat and noise away in racks would be negated…
…and I’m sure the accountants would also make sure they shipped as sealed units, with no easy way of upgrading the processing power without changing the whole display each time you want more GPU power!
The display idea ignores the increasing use of general-purpose graphics card programming.
“When Apple announces a 27-inch or 30-inch Retina Display, you can bet it will have an integrated GPU.”
I’ll bet against it: the GPU needs to be close to the CPU and memory – with a faster interconnect than Thunderbolt or anything similar can offer. Consider that the current interconnect speed of a decent GPU is 16GB/s in full duplex – more than 10 times that of current Thunderbolt. Yes, Thunderbolt will advance (if it doesn’t die – I kinda hope it does) – but so will GPUs, probably at a faster pace.
That said, I think you’re right on target with the impending demise of the Mac Pro.