On Tuesday, Amazon Web Services quietly released a pair of new instances on its EC2 cloud computing service. Not just new instances, but a whole new type of instance aimed at 2D and 3D graphical computing. For the first time in a generally available AWS instance, developers and users will have access to virtual machines with GPUs. It’s like putting a PC in the cloud. More properly, it is like putting your PC in the cloud. I think this has great disruptive potential. And that means we’ll see similar services coming soon from other cloud providers.
Autodesk must think it has potential, too, because it will be offering several applications on the new platform, though notably not AutoCAD, at least not at first.
Starting at $0.65 per hour you’ll be able to run DirectX, OpenGL, CUDA, and OpenCL applications and services in the Amazon cloud. The base g2.2xlarge instance comes with 15 gigs of memory, 60 gigs of local storage, an Intel Sandy Bridge processor running at 2.6 GHz (26 compute units), and a single NVIDIA Kepler GK104 graphics card with 1,536 CUDA cores.
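For readers who want to kick the tires, here’s a minimal sketch of launching one of these instances with the boto3 Python library. The AMI ID and key pair name are placeholders, not real values; substitute your own region and credentials.

```python
# Minimal sketch: launch a g2.2xlarge GPU instance with boto3.
# The AMI ID and key name below are placeholders, not real values.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",     # placeholder: a GPU-ready AMI with NVIDIA drivers
    InstanceType="g2.2xlarge",  # 15 GB RAM, 60 GB local storage, one Kepler GK104
    KeyName="my-keypair",       # placeholder key pair for SSH access
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print("Launched", instance_id, "- billing runs at $0.65 per on-demand hour")
```

Remember to terminate the instance when you’re done; the meter runs whether the GPU is busy or idle.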
Sounds powerful, so I asked Max Levchin whether this is important and he very diplomatically said, “GPUs have long ceased to be used only (or even primarily) for graphical processing. Because of their massively parallel architecture (you can generally do basic processing on two, or two million, pixels perfectly concurrently) they are an amazing tool in solving very CPU-intensive problems. General Purpose GPUs have been gaining real traction in the last few years in all kinds of applications.”
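To make Levchin’s point concrete, here’s a toy Python sketch using NumPy. The brightness adjustment below touches roughly two million pixels in a single data-parallel operation with no dependency between pixels; that same independence is what lets a GPU’s thousands of cores chew through the work concurrently. (NumPy here runs on the CPU, so treat it as an illustration of the pattern, not of GPU code.)

```python
# Toy illustration of per-pixel data parallelism.
import numpy as np

# A fake 1920x1080 image: about 2 million pixels, 3 color channels.
image = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

# Brighten every pixel at once: no pixel depends on any other pixel,
# which is exactly the property that makes GPUs so effective.
brightened = np.clip(image.astype(np.int16) + 40, 0, 255).astype(np.uint8)

print(image.shape, "-> brightened in one vectorized, embarrassingly parallel step")
```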
It’s pretty obvious to me that this graphical cloud will be useful for things like encoding and transcoding video and audio, especially for those who don’t need to do it very often, but the bigger question is whether it’s enough to kill most desktop computing. I suspect it is, though maybe not in exactly the way people expect.
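As a hedged example of the transcoding case: a rented instance could do nothing more exotic than shell out to ffmpeg. The file names below are placeholders, and the sketch assumes ffmpeg is installed on the instance.

```python
# Sketch: transcode a video on a rented cloud instance by calling ffmpeg.
# Assumes ffmpeg is installed; file names are placeholders.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "raw_upload.mov",   # placeholder input uploaded to the instance
        "-c:v", "libx264",        # software H.264 encode (a GPU encoder could be swapped in)
        "-crf", "23",             # constant-quality setting
        "output.mp4",             # placeholder output
    ],
    check=True,
)
```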
Lots of experts over the years have told me there will always be desktop computing, but when you explore further it becomes clear they mean there will always be desktop computing, but not necessarily desktop computers.
Desktop computing means a large display, keyboard and mouse. It used to also mean local processing and storage because networks were so slow, but that part is changing right here and now.
Most of the time I need no more processing power than is already in my mobile phone. In the office it would be nice to have that bigger display and keyboard, but as I have written before, these are becoming wireless peripherals that will shortly be activated whenever I am nearby. The industry will love having a whole new class of products to sell us.
What has been keeping desktops viable is the lack of those wireless peripherals and the problem that for some activities a mobile phone processor just isn’t powerful enough. Enter Amazon’s new graphical cloud. Now your mobile, whether through the handset or through those wireless peripherals I’m predicting, will be able to access as much processing power as you are willing to pay for. And because it’s a shared resource that cost should be pretty low.
Why keep a desktop for 10 hours per year of Photoshop picture editing? Better to pay $1 per hour or so for workstation performance with the latest version of the software. If that sounds like a kick in the head to Adobe or Autodesk, it’s also likely to broaden their markets by attracting new users.
A recurring argument for local computing is that cloud-based processing doesn’t work well on airplanes and that people just like to keep their own stuff. Well, airplanes are getting more and more networked, and how much time do you spend in flight anyway? Use that time for drinking, instead. And yes, many people like to keep their own data, but more and more the eventual channel for sharing and deployment is online. Whether it’s by e-mail, Facebook or YouTube, our products are more and more destined for the cloud anyway.
This is all just a guess on my part, of course, but I think the elements are coming together fast to make your present desktop computer your last one.
sorry for being off topic, but do you have any recent insights on how Fukushima is “progressing” us towards some sort of Nuclear Radiation Apocalypse? Thanks!
maybe it should be: “one of your last ones”… the rate of adoption for disruptive new technologies always seems to be slower in Reality than in the rosy dreams of their proponents! and I think early adopters represent only a pretty small subset of all users.
New technologies almost always take longer to arrive than initially expected, and almost always have a far larger impact than predicted. E.g., tablets.
I went from 19″ dual displays, a WACOM tablet, and a full size ergonomic keyboard one month, to a Chromebook the next. I almost never fire up that big computer anymore. All I am missing is video editing. Signs are good that I will not be waiting much longer for that.
Next step, warehouses full of exotic printers and toner dust in major cities, serviced by bike messengers? How much does one pay at Kinko’s to rent a computer by the hour again?
I am still bothered by the fact that the cloud gives governments and other hackers one stop shopping for trolling through my data.
News flash … They don’t need the cloud for that.
How can anyone troll through your data if it’s not in the “cloud” (which means accessible over the internet)? They would have to be given physical access to your data, a much more involved process for them.
Ronc is absolutely correct. If you don’t want someone to look through your files, don’t make them accessible to the internet. And the cloud, by definition, is accessible by the internet.
The problem with this kind of thinking is that the processors are normally not the expensive part of a desktop PC. Yes, high end gaming graphics cards are pricey, but 27″ 4K monitors are more so. The less-than-powerful mobile processor is kept that way because of the desire to keep power consumption low and cases thin. If there were a way to slot your cell phone into a higher-draw PSU and cooling system it could run much faster. Of course, the cost would be so high (unless it were standardized, which clearly isn’t going to happen with today’s market), you might as well just build dedicated hardware.
Cloud storage suffers from the same problem. Dropbox charges $100/yr for 100GB. Amazon has a 3TB NAS for $155, one time fee. Yes, you’ll need to figure out how to forward ports on your router, but it’s exponentially cheaper.
We’re in a strange time in the computing world. The amount of computing power on your desktop is unbelievable, yet the focus of the industry is on devices that are intentionally dumbed down, pushing features like thinness and battery life over traditional benchmarks and other comparisons. I keep moving between a MacBook Pro and a tablet for sit-down mobile computing, and I just can’t get past not having multiple windows open on a large screen. Tablets are nice for looking up actors on IMDb while watching TV (they will be your universal remote control in 3 years) and other quick functions, but for any sort of deep work, forget it.
“The license is the product”, as that old article out of Berkeley Law School was titled. “Why rent when you can buy?” turned on its head is “Why sell when you can let?” And that is the holy grail of business today. Wall Street taught Silly Valley well what it learned from the New York City window-washer: exercise the right to get in your way and hassle you until you shake loose some coin.
A wireless keyboard, an optional wireless mouse and a tablet together have all the effective components of an X terminal, and then some. It’s just a matter of firmware, subject to the rules above (and to the backpressure of inquisitive hackers).
Personally, I would love for the consumer electronics industry to take the PogoPlug approach to opting the product out of the experience, but I’m not holding my breath. The user experience is not meant to inure to the ultimate benefit of the user.
Sorry, I don’t get it. I think your comment more or less makes my point. Yes, mobile processors are hobbled, but even hobbled they are more than good enough for most productivity applications that haven’t significantly changed in a decade. There’s no need for CPU chillers when we’re talking about word processing, spreadsheets, and even presentation software. And for the folks who really do CPU-intensive tasks, here’s their next computer in the cloud! They’ll always have the latest one with plenty of power and a fresh (minutes old) software installation and the price will go down every year. What’s wrong with that? And your assertion that 100 gigs on Dropbox is “orders of magnitude” more expensive than a $155 drive, I don’t get that either. An order of magnitude is 10X, “orders of magnitude” is more than 10X. That $155 drive will be useful for, what, three years? That’s $52 per year but make it $55 if power is included. Yes, $165 is less than $300 but not “orders of magnitude” less and Dropbox pricing is always going down.
That $155 NAS is a 3 TB (terabyte) drive. Dropbox rents 100 GB (gigabyte) of space/yr. Last I heard terabytes are an order of magnitude greater than gigabytes. If you’re worried about drive failure, buy 2 and mirror, and replace one each year, just to make sure (although I have mirrored drives that have been in use for years without problems, and a backup). You’re still up on storage space, but it’s not quite as easy as cutting a check to Dropbox every year.
And you’re right, I didn’t make my point well enough. The idea that one needs to rent “10 hours of Photoshop” a year because their desktop isn’t up to the challenge of rendering photos isn’t going to hold up. The processor is cheap. It will continue to get cheaper over time. The same commodity hardware that Amazon has in their data center is also on your desktop. That processor that gets stressed for 10 hours a year running Photoshop is also going to render web pages faster and compress/decompress video conferences the rest of the year. A year or so later it will also be in your pocket, pre-processing your vacation pictures before you push them up to Twitter. I’ve bought plenty of hardware over the years, and nearly every time I’ve paid much less for far more. Lately the hardware seems to have got out in front of the software… and software even seems to be regressing thanks to the mobile revolution.
There might be a market for batch processing, which is essentially what this is, but I can’t think of one that isn’t already being served. If you’re paying by the hour there’s no way an interactive session makes any sense economically, even at $0.07/hr. And from the looks of it, that’s just the price for now. The spot price looks like an auction or market bid, which is interesting but no way can you reliably predict your computing costs. If you go with a reserved instance you’re going to be paying a minimum annual upfront fee AND an hourly rate. And both choices look confusing enough to end up costing far more than you’d expect, much like trying to figure out an old long distance bill.
I don’t want all my data in the cloud, not that Google and the NSA don’t already know all about me…
However, excluding all my music and movies, my entire life’s work (business and personal) consumes less than a gigabyte…. Sad really, now that I think of it…
Tax paperwork scanned to PDF, work documents, household documents (also scanned to PDF), and email (worth saving). Pictures I value, scanned and native digital, are the single largest files in my collection. Music can be re-ripped, same with movies.
I think with secure encryption (https://www.indiegogo.com/projects/the-truecrypt-audit) I could store everything in free cloud storage…
But again, what’s secure?
1 TB = 1024 × 1 GB, and that’s roughly 10 × 10 × 10, so three orders of magnitude IMHO.
Bob, you forgot to factor in the storage space. It’s 100 GB of Dropbox vs. a 3 TB NAS. For 3 TB, you’d have to buy 30 Dropbox subscriptions at $100/year. Over three years it comes to $9,000 vs. $165. That’s almost two orders of magnitude.
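Spelled out, that arithmetic looks like this (a quick sketch using the figures quoted in this thread):

```python
# Quick check of the storage-cost comparison using figures from this thread.
dropbox_dollars_per_100gb_year = 100   # Dropbox: $100/year per 100 GB
nas_capacity_gb = 3000                 # 3 TB NAS
nas_one_time_cost = 155                # dollars, one-time purchase
years = 3

dropbox_cost = (nas_capacity_gb / 100) * dropbox_dollars_per_100gb_year * years
print("Dropbox, 3 TB for 3 years: $%d" % dropbox_cost)        # $9000
print("NAS, 3 TB for 3 years:     $%d" % nas_one_time_cost)   # $155
print("Ratio: %.0fx" % (dropbox_cost / nas_one_time_cost))    # ~58x, almost 2 orders of magnitude
```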
Bob’s (and the industry’s) point is that people want access to all their data whether on a desktop or on a portable device, so it is being saved in the cloud. Now as long as it’s there anyway, why not do the processing there as well? The limitation is the available bandwidth and the breakthrough is fast compression in the cloud to reduce the amount of data that must be transferred back and forth to work on, over limited bandwidth, while it’s in the cloud.
With the increase of smartphone and tablet use, desktops have certainly taken a hit – for those users who were, initially at least, not really “into” computers. That is something that is easy to understand.
What I find not-so-easy is to see why cloud computing can be in any shape or form better than desktop-based. For sure technology is being developed to speed up networking globally, and peripherals like monitors can still be local but it is likely there will always be data-jams. Fast quiet times and slow popular times. With geography being what it is, how can connection to any service be anything else?
My Mac or PC is within two feet of my fingers while I work and latency is minimal.
The other fly-in-the-ointment is still (and very current) security …
If you have problems reaching a web site it’s rarely a network problem and nearly always a server problem: too many people trying to access the data at once. That doesn’t happen when you are on your own PC, but then it also doesn’t happen when you are accessing your own instance or virtual machine in the cloud. Amazon and others only sell what they have to sell. The cloud doesn’t slow down. And security is likely to be better, not worse. Each time you fire up that virtual machine it gets a new OS install. Each time you shut it down the OS is purged. That’s not the case with your PC or Mac. You can’t download a virus or a trojan if what you are looking at is a TV show generated by the remote instance. Think of it as an opto-isolator.
My early computer use was with time-shared mainframes, and maybe that distorts my understanding of how these cloud instances will work, but won’t most users spend most of their time setting up problems, using very few cloud computer cycles, and then select the ‘calculate’ or ‘render’ button and really utilize the GPU for a few million computations to solve and present results? Surely Amazon will not have a physical GPU for each instance sold, and when the courier service hits ‘compute’ for their 600-stop traveling salesman route solution, there will be a queue for the next few instances that also want to compute. It seems like the optimum customer-to-hardware ratio must result in some amount of overload, just not enough to drive away big customers. Or is the hardware and operating power so inexpensive that there is no pressure to maximize utilization? (Or maybe they have a staff with only engineers, and not a single MBA? 😉)
When at my desk, I might as well have a desktop.
When I’m using my phone, the network is usually the bottleneck. More precisely, the data connection between my phone and the cell tower is pretty bad, *even* when a voice call is working.
I’m sure there are plenty of people for whom this new model will work, but I suspect the “last mile problem” will keep it from becoming dominant before conditions change again.
Maybe this will work for you and large companies that can afford the cost of the data connection, but for most small companies in the USA, the cost of a decent internet connection makes this very expensive – we’re in Baton Rouge and pay our ISP (Cox) about $250 for a 3 Mbps down / 300 kbps up service. I can’t justify paying upwards of $600 a month for a data connection fast enough to make this viable.
And what about the folks who don’t live in the big cities with fast internet connections like I have (sic) – my mother-in-law has a 56k modem connection and if it’s not raining I can get 22 kb/s through it … how’s your cloud going to work at that speed?
Of course, if it rains in Mississippi then she’s down to 14kb/s … and the phone company tells her that she’s too far out of town (about 2 miles) for it to be worth their while running ADSL out there.
The bottom line is that America is just plain lousy at making the investment in Infrastructure – sure the phone companies collect the fees for it but they divert it to generate income and don’t give a flying donut for the majority of their customers in rural America.
Data caps will kill Amazon’s usefulness and kill any cost savings.
Fortunately, Mr. Bezos happens to be among the elite cadre who buy ink by the barrel and can throw his customers’ weight around a bit more handily where it matters. Interesting, though, that WaPo isn’t barking about the impending death of net neutrality…
Why do you pay so much for Internet service? When I lived in Charleston my Comcast Business Class service cost $168 per month for 24 Mbps down and 5 Mbps up, and that was three years ago. You are being screwed. But beyond that, the bandwidth requirement for an H.264 1920-by-1080 data stream at 30 frames-per-second is 1.4 megabits-per-second. Now let’s think about that as a computer screen signal. Is your screen less than 1920-by-1080? Then you’ll need less bandwidth. Does your screen data change significantly less often than 30 times per second? Then you’ll need less bandwidth. That 1.4 megabit number is for TV, and what you do is less bandwidth-intensive than TV, so your current Internet connection should be fine. Still, Dude, you should dump that ISP.
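To show how those numbers scale, here’s a back-of-the-envelope sketch that takes the 1.4 Mbps figure for 1080p at 30 fps as a baseline and assumes bitrate scales roughly with pixel count and update rate. Real codecs don’t scale perfectly linearly, so treat this as illustration only.

```python
# Back-of-the-envelope screen-stream bandwidth, scaled from 1.4 Mbps
# for 1920x1080 at 30 fps. Rough approximation only.
BASE_MBPS = 1.4
BASE_PIXELS = 1920 * 1080
BASE_FPS = 30

def estimated_mbps(width, height, fps):
    return BASE_MBPS * (width * height) / BASE_PIXELS * (fps / BASE_FPS)

print("1366x768 laptop screen @ 30 fps: %.2f Mbps" % estimated_mbps(1366, 768, 30))
# A mostly static desktop whose contents change ~5 times a second needs far less:
print("1920x1080 desktop @ ~5 updates/s: %.2f Mbps" % estimated_mbps(1920, 1080, 5))
```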
Holey Kao! I pay C$50/month for 15Mb/1Mb and I live deep in the rainforest on a Pacific island!
nice island.
I pay $80/mth for 16 Mbps down and 768 kbps up (advertised speeds, not real life) and a 90 GB monthly quota (real-life quota though!) for 5 people. In any given 2-week period there will be an hour or so when even fetching a Google search result page will need a refresh or two to get all the content.
(My brain still boggles at the numbers. My first bought-with-my-own-money hard drive was 300 MB, a monster at 2″ thick, that kept me going for years, and now I chew that up in less than a day!?? How is that even possible?)
The great computer in the cloud sounds Real Nice for those in the right neighbourhood. I don’t and won’t begrudge them their cloud computing valhalla, but feel the pain for those on the outskirts who will want and/or need to keep their desktops for a while longer than one or two more generations, but have to pay increasingly more for them because they “just don’t make ’em any more!”.
I’m paying $70 for 3Mb down / 768Kb up….
Comcast should be in our neighborhood by “Early Next Year”, and then I can get 6/1 for the same price… for the first 6 months… then it’s very difficult to determine the price after that…
(Google, I have money I wish to give you. Please find a way to exchange this money for fiber.)
Bob,
I’ve got 10 Mbps (up and down) business service from Comcast and pay (gulp) $1000/month. This supports offices and a warehouse of about 75 users. This seems outlandish but is about the same as the quote I got from AT&T, and a way better deal than the T1 my company had before. Some of it probably has to do with being the first building in an ancient office park to bring fiber in. But still, what gives with the regional price disparity?
Also, by your math, that 10 Mbps would support 7 people streaming video at 1920×1080 constantly. That seems a little low (or do I need to go back to having my firewall block YouTube?)
“When I lived in Charleston my Comcast Business Class service cost…” What do you pay now?
MULTICS dream becoming reality?
I won’t be using anything “cloud” ever……it’s none of the NSA’s business. Air gap, plus encryption is the only way to keep these law-breaking bastards out of my business.
Not sure I fully understand that reasoning. The IRS is already fully into your business and its only purpose is to take away your money. At least the NSA is trying to protect your security at the expense of all taxpayers in general. The IRS is a far more personal threat than the NSA.
Cloud processing makes sense for intermittent, very intensive processing (de-scratching an old movie and unifying the colors from frame to frame).
Relying on the cloud as the working data store is surely very foolish, for reasons of guaranteed accessibility and security.
And please, encrypt your stuff at your end (user end) because that’s the only secure way. Incremental backups by the cloud provider’s software always means they have access to the raw copies of your stuff.
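A minimal sketch of what “encrypt at the user end” means in practice, using Python’s third-party cryptography package (the file names are placeholders; the essential point is that the key never leaves your machine):

```python
# Sketch: encrypt locally before anything is uploaded to a cloud provider.
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this key local; never upload it
cipher = Fernet(key)

with open("my_documents.tar", "rb") as f:        # placeholder file
    ciphertext = cipher.encrypt(f.read())

with open("my_documents.tar.enc", "wb") as f:    # only this goes to the cloud
    f.write(ciphertext)
```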
No Bob. Won’t happen. Try running Microsoft Remote Desktop across most cable systems. Performance is borderline rubbish and not close to running apps on local PCs.
Maybe this happens for casual users. But for business? PCs are way too cheap and bandwidth expensive for the amount needed. I’m also in no hurry to trust any cloud provider to keep my apps running and secure.
Thanks for this comment and here’s why you are wrong. RDP, Citrix and other remote desktop computing technologies are OLD. They encode the screen pixel-for-pixel using the CPU (no GPU) so your whole system is near maxed-out and slows down with most of that utilization just driving the screen. New technology uses H.264 HARDWARE encoding already built into the new CPUs to take over that load. The terminal processor runs faster, the screen paints better, and the bandwidth required (under a megabit for 1920-by-1080) is significantly less. Yes, you’ll never do it using the crap technologies you mention, but with new technologies you won’t even notice the difference compared to running locally.
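As a rough illustration of what’s being described here, this sketch captures a Linux desktop and pushes it through a GPU’s hardware H.264 encoder using ffmpeg. It assumes an NVIDIA GPU with NVENC support and an ffmpeg build that includes x11grab and h264_nvenc; the destination address is a placeholder.

```python
# Sketch: stream a 1920x1080 desktop through a hardware H.264 encoder.
# Assumes Linux, an NVENC-capable NVIDIA GPU, and ffmpeg built with
# x11grab and h264_nvenc. The UDP endpoint is a placeholder.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-f", "x11grab",              # capture the X11 display
        "-video_size", "1920x1080",
        "-framerate", "30",
        "-i", ":0.0",                 # display :0
        "-c:v", "h264_nvenc",         # H.264 encoded in GPU hardware, not on the CPU
        "-b:v", "1M",                 # roughly a megabit per second, per the figure above
        "-f", "mpegts",
        "udp://client.example:5000",  # placeholder destination
    ],
    check=True,
)
```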
H264 is fine for distribution. Come on, you’ve done video work. For a 30 second commercial there might be a terabyte or more of raw video. The actual editing might take a few hours and then there’s hours of transcoding to all the output formats. (There’s never just one.) There is some potential for speeding transcoding with more virtual GPU’s but the data transmission requirements will choke an OC12.
Bob just explained the transmission requirement for one user would be on the order of 1 mb/s. OC12 is about 600 mb/s. So I guess I need more info before I will understand your point as well as I understand Bob’s.
Cringely was saying that the equivalent of screen-captures will take 1 meg/second.
Esmith was saying that a 30-second commercial requires something like 1 terabyte of raw video input — which is ~30000 times as much. That may well be largely uncompressed, and it will take longer than 30 seconds to shoot in the first place, but … being able to capture it fast enough has often driven hard drive requirements. Being able to not only capture but pre-process and upload in real time is hard enough that (even if you have the bandwidth) it will already require hardware comparable to what you would need to just process locally.
That said, people taking that much footage to keep 30 seconds may not be the 10-hour-a-year casual users, so it isn’t a problem for the mass market. And at the top end, the need to split responsibilities (camera vs editor) and archive results is such that the overhead of that initial upload may already be there anyway.
The biggest reason NOT to use a Cloud is retaining ownership of your info, and potential ease of access by government agencies. If all the police need to do is send an administrative subpoena rather than go to court and get a search warrant, I think cloud computing will not be in my future.
I own my data and parking it in Amazon does not mean I surrender ownership, although there are court decisions saying I ‘no longer have an expectation of privacy’ when my info passes through Amazon, Google or whoever.
Interesting, but no thanks.
Computing has oscillated from client-server to peer-to-peer every few years. Back in the ’90s the former Network Computing Devices made a cheap/simple Xserver box that would have worked with this idea perfectly. It was killed by falling PC prices. Now the pendulum swings the other way, but will the NSA interfere with that trend? The big cost of a desktop these days is administration/licensing, so you end up dependent on the pricing whims of others.
We’ve heard this story and promise many times before.
At this point the average consumer is not only over-served by the capabilities of their current desktop computer, they’re close to being over-served by things like the A7 chip in the latest iPhones and iPads.
Amazon can’t make a market for this kind of remote computing power out of people who only want to use it “for 10 hours per year of Photoshop picture editing.”
The quality of image editing apps on the iPad has just about caught up to any non-professional’s needs already. The same could be said for iMovie and GarageBand for video and audio.
Your angle on this as something for use by average consumers, replacing their desktop PCs, is off. Better to have glossed over that and focused on use by professionals, which you mostly neglected to mention, aside from references to AutoDesk.
“It used to also mean local processing and storage because networks were so slow, but that part is changing right here and now.”
Bob, I think you are correct that the issue of network/bandwidth performance has been addressed sufficiently from a technological/performance standpoint, but to me the biggest bottleneck against the “death” of the local PC (with its self-contained keyboard/monitor/hard drive/apps) in favor of the “virtual desktop in the cloud” is going to be the COST of all this network bandwidth.
The ISP’s and cell phone carriers seem to all be in a mad dash to raise their prices higher and higher. It’s great to tout how your Samsung or iPhone can do anything a desktop can do, but how much per month does your typical user PAY for all this bandwidth? I know plenty of smartphone users shelling out $100 or more per month!
In fact, my plain ol’ slow-speed cable-modem-based ISP has raised its prices three times in the past year and is now attempting to gouge me for $48/month for just basic 3 MB down / 1 MB up bandwidth (a level of service that cost me $27/month when I originally signed up just a few years ago). Personally, I think the consumer is going to revolt against the ever-increasing ridiculous cost of broadband access, whether wired or wireless, and I see THAT as the biggest impediment to widespread adoption of cloud computing, not lack of available bandwidth or performance.
If you’re in industry, and your employer pays for your super-duper high-speed access, that’s one thing…but to your average Joe or Jose (private individual or small businessman), $1000-$1500/year just for internet bandwidth is a lot of moolah. I mean, a decent well-powered desktop computer can be had for about $300 and can give 1-3 years of adequate service…but that may only buy you 3-4 MONTHS of sufficient high-speed bandwidth.
In today’s economy, I think the general computing public is going to do all they can to avoid a high monthly bandwidth bill, and if that means doing as much as you can locally, then the PC that sits on your desk (or lap) and can be had for a few hundred bucks still has a lot of life left in it.
I agree bandwidth is expensive and going up. But your comment about “$1000-$1500/year just for internet bandwidth is a lot of moolah” got me thinking. Just about any business hiring someone to use or maintain that PC will incur much more expense than the internet cost. And what about when someone gets infected with “Cryptolocker” and has to pay $300 to get their data back? As much as I hate the cloud, its cost of service, cost of bandwidth, and security problems, I can understand a business decision to get rid of the more costly desktop machines if you include human maintenance work.
I work as a developer/architect for a large hospital system. I work from home 2-4 days a week. I get into the network via Citrix and then use Remote Desktop to get to my work PC. That PC sits on a gigabit Ethernet network. I have two monitors at home and am able to use both with RDP. I have an average FiOS network connection at home. For the work I do, I do not notice any lags with my mouse or keyboard.
My PC at work could easily be a virtual PC image rather than a desktop, they could rent out my cubicle, and I would be living in the world Bob described. Though I have felt better about a computer I can control (starting with PDP-8 long before any PCs), this new kind of paradigm works.
I think latency is a killer for a lot of real-time gaming use.
My keypress has to roundtrip to my AWS instance, then to the gaming server to process the effect of the keypress, then back to AWS to render the resulting next frame, which then gets encoded and sent to me.
I’d strongly suspect that level of latency will result in awkward, sloppy-feeling controls for first person shooters. The equivalent of developing a stutter when you can hear your own voice delayed by a fraction of a second.
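Adding up that roundtrip with some plausible but entirely made-up numbers shows why:

```python
# Illustrative only: rough, invented latency figures for the roundtrip
# described above (client -> AWS render instance -> game server -> back).
hops_ms = {
    "client to AWS instance":       30,
    "AWS instance to game server":  10,
    "game server processing":       15,
    "frame render + H.264 encode":  10,
    "AWS instance back to client":  30,
    "decode + display":             10,
}
print("Input-to-photon latency: ~%d ms" % sum(hops_ms.values()))  # ~105 ms vs ~20-40 ms locally
```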
Different people, and different games, have different levels of sensitivity to this, so it’s not a dealbreaker across the board. Infrequent players might not know any better, and casual or turn based games might not suffer noticeably.
The gaming servers could be co-located on AWS, which would help, but not sufficiently, I don’t think.
Also, video compression artifacts in my games would bug the hell outta me. Again, maybe casual users won’t know any better.
Part of me wants this to work so that a mobile phone with HDMI out and a bluetooth keyboard is the only ‘desktop computer’ I need; another part is very afraid of all my data living in (or even just visiting with) ‘the cloud’…
Will what you are proposing scale over millions/billions of users? I think Netflix had 40% of network traffic in the evening hours and now they want to start streaming 4K which has significantly higher bandwidth. The point is the demand for bandwidth is going up dramatically. Could the internet backbone handle the load if millions/billions of users were using such systems? Bob, maybe you should do a post on the capacity of the internet and where will we be in ten years versus projected demand.
The other thing to consider is points of failure. When you string your processing out to the cloud, you introduce more points of failure. The end user is sitting there and can’t access their stuff because something between them and/or the data center failed. The advantage of a local computer is that if the internet goes down, you can keep working locally. For many businesses, this is a big deal. For casual users, not so much.
4K video using H.265 compression uses the same or LESS bandwidth than 1080p using H.264.
In a year or two every cell phone and set top box will have hardware decoding for H.265 and the bandwidth requirements will not change from what they are today. When they launch their 4k services there will be so few people with the bandwidth or hardware to play it, that it will not be an issue before the set top boxes are released.
What you are saying about 4K is not true. Go watch a few episodes of “Home Theater Geeks” on the “twit.tv” network and the experts there give a true picture of what is happening in that industry in terms of 4K. Anyways, you are missing the point that if everybody puts all their computing in the cloud, bandwidth requirements will jump up dramatically and the infrastructure will need to be updated or rebuilt to handle that kind of load. In the 4K example, if everyone today was streaming 4K videos, what would it do to the internet backbone? ISPs now oversubscribe their bandwidth today assuming everyone is not going to use the network all at the same time and so we the end users don’t notice.
I’m sorry but perspectives like this are incredibly short sighted. We look at the cloud and think “risk free” because we see redundant hardware, safe environments and all of the other conveniences of the data center.
But these are only the most obvious differences in cloud technology, they are not the definitive ones.
The “magic” behind the cloud is largely software. High availability, SAN, virtualization, scalable hardware allocation…these are all software tricks that would work just as well on local hardware as they do in Amazon’s datacenter, which is why you see large companies moving out of public clouds and into “private clouds” or local instances of this same technology. I know this because I am a systems engineer working with companies like this, and even before the NSA stuff the bigger fish in the pond opted for local cloud where they could get it. By the time you tack on SLAs and all of the other legal guarantees business-critical operations need, cloud services start to look a lot less like Dropbox and Google Docs and a lot more like hosted solutions looked 10 or 20 years ago. Cost inefficient for most situations. EC2 looks great if you’re an individual who only needs limited time, storage and bandwidth from it, but local IT firms with their own data center space regularly beat the pants off of it on price point when you start to consider 24/7 operations and large amounts of resources.
The unspoken truth is that these technologies stay in the cloud (for now) not because they function most efficiently from the cloud but because they are more profitable for the companies involved to operate them from the cloud. Even with a small business customer, a $7000 server every 6 years is not nearly as profitable for the solutions provider as $400/month for a hosted equivalent over the same term. Additionally, it’s much harder to fire a company hosting you in the cloud because you can’t simply walk them out the front door and bring someone else in. Your critical business software is on their servers, and now VMs must be exchanged around a planned downtime and at no small cost (assuming you aren’t using SaaS, in which case a far worse scenario emerges….database migration). As another reader pointed out, even DropBox is silly cost-inefficient compared to local storage. You may laugh but local storage with cloud-like file systems and cloud-like NAT traversal is essentially the same thing as DropBox, except you own it. Check out the Lima project on KickStarter for an example of this.
We get all gee whiz about the cloud because it matches up with our mid-20th century visions of science fiction with massive centralization and automated conveniences everywhere. But eventually there will come innovators who care more about efficiency and advancing the state of the art more than they care about making money. These people are probably already working on ideas of this nature. These people will take these technologies out of the cloud and put them onto consumer-level hardware because it’s cheaper and it makes more sense. And this will be more disruptive than whatever Amazon thinks it can do with virtualized GPUs that likely have minimal software support from most applications that need this type of performance.
High availability, NAT traversal and ubiquitous data access, virtualization and intelligent backup….all of these technologies are absent from low cost local computing, and not because they are expensive to develop or require massive hardware infrastructure. They’re mostly UNIX technologies that are BSD licensed and waiting on the vine like low hanging fruit. That ever increasing bandwidth pipe you mention serves just as well to make the argument that you should host your own cloud rather than rent from someone else. Google and others might survive on ad revenue but the days of charging people “rent” for access to your datacenter at a rate that would pay for acquiring equivalent local hardware in a year or less are numbered. Consumers will leave once compelling local imitations arrive. Smaller, smarter, cheaper. Same old story.
Good luck telling a local small business to “host his own cloud” using UNIX and BSD. 🙂
The entire point of the lengthy and well-articulated comment Ronc imagines he just pithily dismissed is, in fact, exactly what Ronc said. Good luck telling a non-techie business owner to dream up a cloud solution = there is a business case for providing an easy-to-use cheap local cloud. Of course that means creating a nice out-of-the-box product and not asking Joe Plumber to become a UNIX geek. A few years ago the state of the art in tablet computing meant lugging around a clunky Thinkpad and wrestling with crappy handwriting recognition, etc. “Good luck” telling businesses to find a way to use those… and they mostly didn’t. Now everyone has an iPad. I love reading this site but that doesn’t mean Bob isn’t slacking on this one and unfortunately he has succumbed to the error of thinking what makes sense to him makes sense to the world. Amazon is desperately trying to find a way to make some cash off that huge network of fancy hardware and software that they used to ass rape local bookstores out of business, so they figure to rent out CPUs. Some dude logging into Photoshop on the same site where he just bought the kids an inflatable pool ain’t the same as a hospital sticking its whole network off site, having to put a call in to Sebastian in Hyderabad should the shit hit the fan.
It couldn’t have been all that clear if we can’t agree on its main point. “cloudskeptic” said “Consumers will leave [the cloud] once compelling local imitations arrive.” While that is certainly true, I’m skeptical about whether “compelling local imitations” will ever arrive. If they do, they will most likely require an IT person to manage them, making them too expensive for individuals and small businesses. But yes, if we are talking about big business, it’s true that they will compare the cost of managed cloud services with IT-person-managed local cloud applications.
Ronc: What do you consider a “compelling local solution”?
(1) A desktop (or phone) that you already have, or buy in a good month? For many truly small businesses, that works fine, and doesn’t add another fixed expense.
(2) Having a consultant add a server or two to the “computer closet”, where they hook up to the network? For not quite-so-small businesses, that isn’t much different from their current model. So what can Amazon provide that the local consultant they plan to hire anyhow can’t provide in a few extra pay-once-at-install-time hours? (“Financing”/”delayed costs” is an obvious answer, but there are limits to how much that is worth.)
It’s true that if they need a ” local consultant they plan to hire anyhow” then switching to the cloud may have no benefit. I suspect cloud services will appeal to businesses and individuals who are tired of paying professionals and wish to deal with all their business needs the same way they use debit and credit cards…essentially another cloud service
> … cloud services will appeal to businesses and individuals who are tired of paying
> professionals and wish to deal with all their business needs the same way they use
> debit and credit cards…essentially another cloud service
I’m not talking about the continuing maintenance fees; I’m talking about the initial setup. And these guys can be pretty cheap, as “consultants” go.
There are many businesses that do not set up their own hardware — they go to a local consultant (possibly the service section of a local computer store) and say “we have this many people doing these jobs; can you get us all computers and hook them up?” They trust the consultant about what sort of monitor and speakers to get, and how powerful a computer is “fast enough”. They trust the consultant on whether they’ll need a central server (or several) to act as a firewall/email server/file server. They trust the consultant on what to do about backups, which often turns out to be “nothing”. (A service that automatically backed things up to cloud storage would have a market. The continuing costs and security fears make that market smaller than it should be, but I think the real limiter is that these products aren’t yet packaged in a way that these consultants can easily resell them without much extra setup hassle.) Paying somebody even just to plug in the cords between components (keyboard/mouse/screen/…) is worthwhile, because doing it themselves for the first time seems slow and frustrating. They will be paying someone (possibly minimum wage, likely as part of the purchase price), even if cheaper is supposedly out there.
So the market is essentially people who *are* comfortable setting up their own personal computers (and relying on that for business purposes), *but* who do not already have a powerful enough computer (because sunk cost is free). Even for these people, using a cloud service means signing a whole bunch of agreements and setting up payment. For some, that administrative hassle is worth avoiding, even if someone else has already told them which preset-VM to use. If they have to specify which software (let alone configurations) should be on it, then the cloud isn’t any simpler than a local install; it would normally be more intimidating.
A larger market does exist for pre-packaged VMs. If the choice is “install this software” vs “sign up for an account at this website”, many people will choose the website. Amazon or mainframe2 should make it easier for other vendors to provide things like photoshop or autocad in a managed environment (avoiding a local install) even without a full for-the-web rewrite, but I don’t see the appeal of the raw VMs themselves to regular small businesses.
We seem to visit this scenario on a cycle. The trouble with ‘graphical cloud’, mainframe, or ‘thin client’ architecture is that personal electronics always remain one step ahead and well… personal.
The power of the GPU and CPU of today in the cloud sounds nice, but having it on my watch tomorrow will trump this. Remember when WebOS/WebApps was promoted as the preferred way to deliver apps on the iPhone? Native code was dead… but then came the App Store and what happened?
In fact, what will be on my desktop will most likely be even more powerful, responsive, and unequaled over a WAN. While Amazon can promise gigabit connections to bring 3D tomorrow, I’m seeing Thunderbolt rolled out beyond Apple’s product lines and into the mainstream PC (read: cheaper) devices today.
That 1148 x 800 3D will seem quite retro while we’re all editing 4k home movies locally. Personal computers aren’t just personal, they get cheaper too and that’s something they’ll have to account for.
In the end, economics always win(s). If Mainframe2 or Amazon can offer a fast and powerful solution at a nominal price (and Amazon is just the company to do this, with or without “special offers”), most people will want to save the perennial hassle and not-insignificant expense of maintaining/upgrading/replacing PC hardware and software. Maybe the bandwidth (speed) isn’t universally available or inexpensive everywhere yet, but that will not take long. At first, early adopters will sign up just for fun or to run expensive applications that are a hassle to buy and install and require high-end hardware, but once they realise they can save thousands every year the decision will be obvious. If you have a family and three or more PCs in the home, it becomes an easy choice. This applies to small and big businesses just as well.
As for security, despite all the bluster and politics, most people don’t care about privacy that much, as they freely broadcast their photos, opinions and resumes all over the Internet already. As long as consumers can be protected from viruses, RAT pervs, and cyber criminals, I suspect most people don’t care about Google subtly shaping search results or the NSA scanning their messages. Being able to trust a large provider to take care of spam and antivirus is almost reason enough to let the desktop PC CPU die an almost sudden death.
Bob nailed it.
Another excellent article Bob.
This one, and the one about Moore’s Law giving a 100x increase every 10 years . . . thought provoking 🙂
I will concede that it’s possible to use a cloud service to render 3D and perform many tasks our desktops currently perform. The only drawback I see is the compression on the video. While I think that H.264 is fine for moving images (TV, HDTV, etc.), for use as a display technology I think it will be lacking. My concern is pixel-perfect accuracy. We have that now with the old Remote Desktop-type systems, but with H.264 being a lossy codec I’m not so sure how it will work for editing photos on a professional level.
We’ve all seen Netflix get screwy when too much latency has entered into the stream. I don’t think this would fly on a UI.
“Why keep a desktop for 10 hours per year of Photoshop picture editing? Better to pay $1 per hour or so for workstation performance with the latest version of the software. ”
This is where your argument goes against itself. New versions always need to have pointless cosmetic changes so you know it’s “new and improved”. I don’t want the newest version of Photoshop so I can discover that Adobe has invented the next “Clippy”. I want the version of Paintshop Pro X2 that I vaguely remember how to use from the last time I used it 3 months ago.
As a software developer in Portland, OR, I’ve been scratching an itch, building my demos on another cloud provider. I recently spun up an instance and shifted my development there to see how it pans out. Frankly, it works better than imagined. It’s a clean, dedicated instance that can flex as my fickle needs dictate.
I agree w/ Bob’s desktop point, I mean my i7 is a fancy RDP client. At the time I thought I would manage local VMs and provision myself, but it’s too much damn hassle and the VMware upgrade treadmill is too steep. So why replace it w/ a more capable system? If anything, my next system will be much less. It may not even be a separate desktop; since I pushed the custom stack out to the cloud instance, my requirement isn’t hard-coded, it falls back to network access, display, and input.
There are definitely concerns introduced by leveraging a shared cloud resource: the security context changes (possibly offset by user encryption), availability (what happens when the provider of choice goes titsup), and vendor lock-in. Big shifts in hardware availability are a precursor to the software cycle and predicate new solutions there. In short, we live in interesting times.
We see this over and over again. I’m surprised Bob would fall for it. Every few years hardware vendors think up a way to sell more servers for shared services… the only thing that changes is the marketing buzz word they use. It will never beat local ownership on a cost basis. And now everyone is fully aware of dot gov snooping, so that is really a show stopper for serious businesses.
The example of “10 hours a year of photoshop” is ridiculous. If that’s how much you use the software, you’ll spend 100 hours learning it, to get that 10 hours of efficient use. Fail.
The example of Dropbox is silly too. The average terabyte hard drive is cheap, and most consumer internet routers have NAS ports and VPN interfaces. Easy and cheap to access your own data anywhere, totally under your own control.