The past couple weeks have been a huge adventure for my family and me as we ran from the Santa Rosa fires. We spent the first week on the Mendocino coast where there are no computer stores. You can get a computer fixed, but can’t buy a new one in Gualala, Mendocino, or Ft. Bragg. So I bought an ancient IBM ThinkPad from MacDaddy Computer Repair only to learn that it wouldn’t produce text I could read. Hence the delay in filing this column — the first of two on the current state of cloud computing. The second column will appear sometime before Thursday when I have scheduled cataract surgery on both eyes. Provided nothing goes horribly wrong I should be back driving and flying next week and will address at that time what my old friend Adam Dorrell calls “the Mineserver Jihad.”
Between technology waves there is always a tipping point. It’s not that moment when the new tech becomes dominant but the moment when that dominance becomes clearly inevitable. For cloud computing I think the tipping point arrived a month ago. That future is now.
This is a big deal. My count of technical waves in computing may not agree with yours but I see (1) batch computing giving way to (2) timesharing which gave way to (3) personal computers which gained (4) graphical user interfaces, then became (5) networked Internet computers and (6) mobile computers embodied in smartphones and tablets, and now we have (7) the cloud. This seventh generation of computing will, within 3-5 years, absorb the vast majority of the approximately $1 trillion we spend in the USA each year on IT.
If I am correct, your PC three years from now won’t be a PC at all but a PC-shaped chunk of cloud accessed through many types of devices. The desktop PC itself is almost dead except for gamers.
What happened a month ago to make this inevitable was that VMware announced a new product called VMware Workspace ONE App Express. App Express would appear to be a way to deliver Windows apps that run on a server under VMware’s Workspace ONE architecture, except I’m pretty sure it’s not that at all.
App Express is VMware in name only. The service, which runs now on Amazon’s cloud, shortly on Microsoft’s cloud and inevitably on Google’s cloud, actually comes from a company called Frame (formerly Mainframe2). It’s a cloud-native implementation that pays lip service to VMware in its branding but is otherwise all Frame. App Express is the product VMware would build if it weren’t, well, VMware and otherwise attached predominantly to on-premises server app delivery.
Why would VMware turn to a startup like Frame to truly cloudify its app delivery? I can think of three reasons: 1) VMware, for all its technological smarts, couldn’t build App Express on its own; 2) partnering with Frame is a quick, safe, and relatively cheap way to beat VMware competitors in offering a cloud-native solution; and 3) App Express will eventually destroy Citrix Systems, a direct VMware competitor that at this point probably wishes it had been the company to cut the deal with Frame.
It is very hard for computer and software companies to dominate multiple waves of computing. IBM did it in batch and timesharing but not since. Yes, they defined the PC business but never really made much profit from it. One characteristic of dominant companies is high profits.
Microsoft dominated character-based PCs and then graphical PCs using Windows, so they managed to bridge two waves, but they failed miserably in smartphones. Microsoft is, however, determined to be a winner as a cloud service provider and is spending tens of billions to make that happen.
VMware and Citrix have vied for dominance in the on-premises Virtual Desktop Infrastructure (VDI) market for delivering desktops to thin client computers, but neither has until now fully embraced the cloud. They use the word cloud a lot, but that doesn’t mean much. Legacy IT firms that are making a lot of money from old technology tend to embrace new trends first through marketing, primarily by renaming their old stuff to make it look like new stuff, which it often really isn’t.
It’s only when their very existence is threatened that legacy vendors get serious about future technologies, giving startups an edge in the short term. The legacy suppliers are typically lulled by the idea that they could, at any time, simply whip out the big checkbook and solve their problem by buying ahead of the new technology wave. That technique always seemed to work for Cisco Systems, but VMware and Citrix aren’t Cisco.
The advantage for now would seem to lie with VMware, though that’s no guarantee of eventual dominance, just a good head start. Frankly, I think they just got lucky.
Tipping points don’t inevitably come. Sometimes we think some new trend is going to define the future and it doesn’t. Artificial Intelligence (AI) was the Next Big Thing back in the 1980s but didn’t really deliver for another 30 years because we grossly underestimated the required computing power. Running Moore’s Law in reverse we can see that one dollar’s worth of computing today cost $1,048,576 back in 1987. How could they (not we — I was around back then and even had this gig, but I wasn’t that stupid and have the clippings to prove it) have been so far off in their estimates? What failed was that we defined AI back then as encapsulating knowledge we already had, while AI today mainly means generating whole new data-driven understandings of how the world really works.
We were right about the potential of the work but wrong about what it would be used for.
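If you want to check that arithmetic yourself, here is a minimal sketch in Python, assuming price-performance doubles roughly every 18 months, which is one common reading of Moore’s Law:

    # Minimal sketch: Moore's Law run in reverse.
    # Assumption: price-performance doubles every 18 months.
    YEARS = 2017 - 1987                              # 30 years
    MONTHS_PER_DOUBLING = 18

    doublings = (YEARS * 12) / MONTHS_PER_DOUBLING   # 20 doublings
    cost_in_1987 = 2 ** doublings                    # 1987 dollars per 2017 dollar

    print(f"{doublings:.0f} doublings -> ${cost_in_1987:,.0f} in 1987 per $1 of computing today")
    # prints: 20 doublings -> $1,048,576 in 1987 per $1 of computing today

Twenty doublings in 30 years is a factor of about a million, which is why 1980s AI never stood a chance.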
Cloud computing is here to stay. Security breaches are pushing customers of all types toward virtual desktop deployments. A titanic struggle between cloud service vendors is driving the service capability up by adding things like virtual Graphics Processing Units (GPUs) while simultaneously lowering prices. Amazon, Google and Microsoft will each pay whatever it takes to be cloud competitive. And for those companies, whatever it takes is a very big number.
Other beneficiaries of the cloud era will be startups like Frame that came along at just the right moment with the proper bit of tech to do what couldn’t be done before — replacing an engineering workstation with a Chromebook.
And the ultimate proof is VMware’s bold adoption of cloud-native code even though it was Not Invented Here. It could just as easily have been Citrix, but it wasn’t.
Someone has to be the winner.
I view “cloud computing” as a sort of corporate IQ test.
“Hey, let’s move all our business-critical information and app servers off to some third party, where we’ll have latency issues and utter dependence on both the third party and the phone company/ISP. What could possibly go wrong?”
Companies have been depending on IBM and Microsoft for decades now. Nothing is changing.
Yes, it has changed. Now the important stuff is no longer in house where you can fix it. You have to wait on someone else. Been there.
AND depending on Microsoft over the decades has not been all that wonderful, even with Windows. Here it is not so bad now, but after 21 years of suffering I finally gave up. Too bad I was so slow.
Oh, and the recent wifi KRACKup?
Perhaps you are too young to have suffered MS for decades?
So who provides your suffering now?
Not to mention: the cloud transforms capex (servers) into opex (payments to Amazon/Microsoft). I await eagerly the inevitable consequence circa middle of November: year end cash flow freeze, all running processes stopped.
Wait on someone else? If your AWS server goes down and you need it back up, you don’t wait on someone else, you start up a new one. Maybe you had it automatically fail over to the backup already, but in any case it’s not as though you’re helpless until someone responds to you. Which is not to say cloud infrastructure is right for everyone of course.
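For what it’s worth, here is a minimal sketch of that “start up a new one” step using boto3, the AWS SDK for Python; the AMI ID, region, and instance type are placeholders, not anything from a real deployment:

    # Minimal sketch: replace a failed EC2 instance yourself instead of waiting
    # on anyone's support queue. Assumes boto3 is installed and AWS credentials
    # are configured; the AMI ID, region, and instance type are placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # hypothetical image baked with your app
        InstanceType="t2.micro",
        MinCount=1,
        MaxCount=1,
    )
    print("Replacement instance:", response["Instances"][0]["InstanceId"])

In practice you would script this behind a health check or an auto scaling group, but the point stands: recovery is something you do, not something you wait for.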
So you’re saying that, back in the good old days, companies used to build their own T1 lines at home?
Companies are always dependent on their suppliers, whether it’s for telecom services, manufacturing equipment or parts. Adding a new service/supplier to your list of vendors does have risks, but they are unrelated to technology. They’re related to the ability of the supplier to perform, regardless of the type of service/equipment they are providing.
Good insight – and good luck next Thursday. My cataract operation was 18 months ago in the UK – but in June it went wrong and a film developed on the back of the lens – a 90-second YAG laser session fixed that last week – so all good again.
glad it’s better. my doc said there is always a chance that the back of the lens pocket will cloud up with all that clean bright hot light coming through the new lens, a secondary cataract. it’s a new illness, and the laser almost always cleans that up. there could be another recurrence, but that’s rare indeed, and cause for another zap.
Mine were done 10 years apart and the first one developed the film. Incidentally on the first one they gave me a LONG list of risks, each with a percent. The gray film was on it. The second time no list; just said see you Thursday 🙂
Incidentally, if after a while your vision degrades a bit, get it lased. The doc couldn’t believe that I was so stubborn (stupid?) as to put up with it until I nearly couldn’t see again (the other eye worked fine, so why get it fixed? a bit dumb maybe).
glad to have you back, good luck on Thursday. awshit rates on cataract surgery are very low.
IMPHO cloud computing is #1 with a coat of Turtle Wax. your access device begs the almighty Operator to run your task. the almighty Operator, since you passed Radius login, queues it up, and pushes the “printout” back. being this is interactive computing, repeat until SigOther or pet peeing on the documents you have to hand out in a meeting tomorrow makes you end the session.
less messy than boxes of cards, and you don’t feel like dying if you drop a box. otherwise, show me the difference. somebody else owns the facilities, information, and you’re paying them to come whine at the door.
Everything old is new again!
“To the cloud!”
This is the year of the Linux desktop.
The price of tulips is on the way up.
By tulips, you mean Bitcoin, right?
It’s true that dollars, bitcoin, tulips, and IBM stock, for that matter, all have in common the fact that they’re worth what other people think they’re worth at the time. In order for a commodity to keep its reputation with people, it must have some unique intrinsic value that’s unlikely to disappear. Tulips can be overproduced or replaced, IBM could be taken over by people who just want to milk its assets without worrying about its future, and more dollars can be generated by governments through inflation. Bitcoin is different in that it’s protected by unemotional, agenda-free math. In order for a government to spend bitcoin to “help” someone, it would have to tax it away from other people’s balances, making that government less re-electable. Of course the advantages of bitcoin can apply to other crypto-currencies as well, so until one becomes the default, an investment in crypto should be considered speculative.
“IMPHO cloud computing is #1 with a coat of Turtle Wax.”
.
I would say it’s actually #2. A long time ago, I was the system manager for a shop that did two things: developed software for sale and offered timesharing services. I made sure that the systems used by the folks who signed up for the latter kept running. What those folks did back then doesn’t seem so different from what the “cloud” users are doing now, other than, of course, speed and size of data.
I would have to agree that it’s number 2. As one who came into the world of computing (or as we called it back then: “Data Processing”) in the early 1980s with time-sharing (or, to use another out-of-date term, “time-slicing”), in which a single physical computing environment can serve many users as if they each had their own dedicated machine/device, I don’t see how cloud computing … which once again takes processing power away from the user’s physical location and puts it into some monolithic “other place” … is any different.
In fact, to get from the old mainframe time-sharing of the early 1980s (and prior) to today’s cloud computing: add a lot of power, a lot of processors, parallelization and virtualization (and whatever else I missed), and you’re back at the same place … dozens (or hundreds or thousands or whatever) of users each running their own sessions so that they appear to have their own dedicated computing environment that is serving only them … but in reality each user is simply being given cycles of processing power from one server (or cluster or whatever) that is also SHARING its cycles with all its other users.
Cycles = time, so in my opinion that’s time-sharing….so by that definition isn’t today’s cloud computing the old mainframe time-sharing?
And in my opinion yet ANOTHER interesting parallel between today’s cloud computing and the mainframe time-sharing of the old days is that we are once again moving AWAY from the concept of putting the computing power directly in the hands of the users, and instead into the hands of colossal monolithic / monopolistic giant corporations who are so powerful that they don’t answer to anybody, a la IBM, Burroughs, etc.
Remember how powerful and controlling IBM et al. were back in those days? And how you were a slave to them if you were an under-million-dollar customer? How is Amazon, Oracle, Google, etc. gonna be any different? What’s old is new and what’s new is old again!
And it was the slow timesharing, slow response, surly attitude of your provider that drove you to the Personal Computer. That and you could do in Lotus 1-2-3 in about 30 minutes, what it would take weeks to do on your time-shared mainframe, while paying your COBOL programmer lots of money to produce a sorted report you could make on your own in Lotus. And then the graphics came along and it was party-over for the COBOL guys. They became “back-end” data people.
Looking forward to hearing your state of the union address come next week. I think everyone from both sides is at this point.
Cloud computing requires speedy internet access and that’s still a long way off for large areas of the US and the world. Sure, I could have some security risks on the desktop but if someone gets into the Cloud data then they own everything.
Essentially, if you store anything in the Cloud, then the government has automatic access to it and the Cloud Provider is legally forbidden to tell you that the government is reading everything – most of the time this is a “risk” that we can live with but imagine the discussions today if the two major parties in the USA stored all their data in the Cloud … did the Russians access it, did the Democrats access the Republican election tactics, did someone hack the Democrat Cloud and pass the data to Wikileaks? The possibilities would be mind-boggling …
I wish you the best of luck with your operation!
Here, for what it’s worth, is my take on the cloud and why I won’t ever use it. 1. MegaCloud Inc offers its wonderful cloud service at rock-bottom prices, waits until it has millions of clients and then doubles, triples or more its prices. Pay up or lose all your stuff. 2. MegaCloud’s bean counters decide it isn’t as profitable as forecast and pull the plug without warning. 3. MegaCloud says your data are safe from hackers, your own and certain foreign governments and even from your competitors. But can you be really sure? 4. MegaCloud miscalculated its power and cooling needs and its systems keep failing. Inevitably, this happens just when you really need access to your stuff. 5. Workers down the street stick a shovel or a backhoe through the cable connecting you to the rest of the world. Inevitably, this happens just when you really need access to your stuff.
If the answer is to make a local backup of everything, then why bother with the cloud at all?
Oh, and the long-heralded death of the desktop doesn’t seem likely to happen soon. A client of mine, a large civil engineering company, has 35,000 (yes, thirty-five thousand) desktops. Its IT head told me that desktops are now so reliable that they changed from renewals every three years to renewals every five years. Pretty soon they’ll be in the market for replacements (and no way are they going to put their road, bridge, dam and railroad designs on any cloud).
There’s cutthroat competition between cloud providers, so no scope for them to raise prices, in fact prices have been falling consistently for over a decade. Any connected infrastructure can have outages, what if your company connection goes down? It happens, but cloud platforms have a pretty decent track record overall on this. It’s not some new thing nobody has ever done before.
I’m a bit skeptical about virtual desktops and virtual apps like this. I can see use cases for sure. There is some software I can’t afford a license to, but if it could pick up files from Dropbox and I could run it on demand I might just do so. I’m thinking about things like inDesign, QBase, even Word which I rarely use but do need now and then. Even owning a powerful PC, being able to access them on demand could be handy.
“no scope for them to raise prices”
Crashplan.
“no scope for them to raise prices”
Crashplan (cloud backup service) did exactly that this year when they doubled their personal consumer prices.
if MegaCloud gives a rip, they will be double-homed. if they give two rips, each of their centers will be fed from separate ducts from separate directions, from different sides of the building, and ideally from different providers out of different central offices. that engineering costs big money. but that’s how you protect against backhoe fade. as one international airline I am familiar with learned to their chagrin.
@Scott Schrader – But how many of MegaCloud’s customers will have more than one way to get the connectivity to reach MegaCloud’s access points. It doesn’t matter if the backhoe only takes out service to a small geographical area if that is your location.
probably depends on the pricing for spreading your goodies to multiple locations of MegaCloud. nothing is free. if CheapCo doesn’t, and the power goes out or the roof drain breaks or there is a birthday backhoe party atop their single fiber feed at the East Failure location, they are Stuck Outta Luck.
@Scott Schrader – You are still missing the point. Every cloud connection has two endpoints. One at the customer end and a different end point at the cloud provider. Any major cloud provider will probably have multiple physical locations which “should” each have different means of communications and power sources. You are correct that you might have to pay more to access cloud resources in multiple locations but that is an option in most cases.
I’m referring to the failure mode where the legendary backhoe (or drunk driver and utility pole) takes out the one communications path at the cloud customer end of the link. A cloud customer could have a backup generator but how many of them have an alternative way to communicate with the cloud provider? Many parts of the US only have one broadband supplier.
Important stuff first. Bob I hope your surgery goes well and you are back to seeing again.
On your cloud argument I think you sound like Larry Ellison who many years ago predicted the death of the PC in favor of the network computer. I think the argument is still bunk.
Reason one is latency. I have experimented with Amazon Workspaces and they can work OK but it really depends on the hops from your ISP to AWS. Many times it is just too slow to be productive when using something like Word.
Businesses still need to trust the Cloud (I do not) for holding their key data. I’m talking about lots of terabytes of transactional data, images, and the such. Then it must perform and not like many clunky SaaS implementations. For some reason we believe that Enterprise IT should run slower now.
Finally, hardware remains cheap and so does internal network bandwidth. I cannot get inexpensive 10GB connectivity to AWS, and many businesses remain single-location entities with a few remote workers.
I get that we all have iPhone apps using Cloud services – makes sense. Using DropBox for some home files also makes sense. But I will always want a rich application on my device for heavy work and the Mac/PC does that for me and Corporations.
So I’m happy to place a steak dinner bet on this prediction…
Again – best with your surgery.
Good luck in surgery Bob.
Have to assume you had everything backed up in the cloud and offsite being the professional you are.
Look forward to your next update on the Minecraft adventure, but suspect it went up in smoke.
Best,
Mark
Back in 1996, Larry Ellison with Oracle and Sun Microsystems and IBM and other partners implemented the idea of a “thin client” networked computer. It never achieved widespread adoption. It is an old idea that was ahead of its time, most likely because of the internet’s high latency and low speed back then. Here we are in 2017 and the cloud is the latest attempt at implementing the thin client. Nowadays the internet is fast enough for “real-time” video etc… and it will only get faster in the future. The thin client is here.
I suspect Bob’s view is due to his living in an area with one of the best ISPs in the US. Meanwhile, many of the rest of us are bystanders in the ongoing bandwidth wars, waiting for &%$NO CARRIER
business of the “let’s fly to Antigua for the quarterly senior management meeting” get the bandwidth, and they are the primary users of cloud computing. businesses of the “walk down to Billy’s Hardware and get another couple chairs, a client is coming” size are the ones getting hurt.
I hope your eye surgery goes well. My wife had the same recently and she now sees better than ever. Let’s get back to flying and driving soon, and to a new home. All the best.
I look forward to the second installment of Tipping Point. All has changed much too quickly. I asked my college class if they ever listened to AM radio. None knew what AM was and only a few knew what a “radio” was – something in the car. And these college “kids” were born in the late 90s. Where is it all going?
All of my customers are rushing to the cloud as quickly as they can. The problem is: How can you sell the chicken for less than it costs to feed it? Are they buying your business so when they have you they can raise the prices? Or is someone subsidizing it so they can have access to it?
So, what you’re really saying is that in the very near future, we won’t need our own personal Mineserver — we’ll just use servers in the cloud?
.
Glad you cleared that up; guess we don’t really need those Mineservers after all.
You don’t think kids have Mineservers fully on their X-1’s do you? Microsoft already has it, just hasn’t told us yet. Wait until Christmas, if not this year then next.
@FormerTXIBMer “Wait until Christmas, if not this year then next.”
.
No truer words; emphasis on the “if not this year then next”. It’s a broken record around these parts of town. You’ll excuse me if I don’t hold my breath, for Cringely or the technology.
.
Also Microsoft already has servers built into the game but they charge a monthly fee (they’re called Realms). It’s in Microsoft’s best interest to keep things in their court as long as possible to make bank from lazy kids/parents.
“You don’t think kids have Mineservers fully on their X-1’s do you? Microsoft already has it, just hasn’t told us yet.”
.
I’m not sure what an “X-1” is, but my kid (the one that still plays Minecraft) connects to some online server for which we paid something like $50 for a year of VIP access (birthday gift). Then there’s this:
.
“You can’t play true Minecraft without connecting to a server. There are thousands of Minecraft servers on the Internet but most of them are in some way commercial, typically ad-based.”
.
[ . . . ]
.
“But there’s a better way — run your own ad-free Minecraft server.”
.
That’s from the Mineserver campaign on Kickstarter. Those are Crookely’s words, not mine. Seemed like a good idea at the time; I had three kids who were playing Minecraft back then.
.
@Roger I’m pretty sure he’s saying “X-1” to mean the Xbox One, but no one calls it an X-1. Either Xbox One or Xbone, so he’s just showing how uninformed he is while trying to sound hip. “FormerTXIBMer, stop trying to make fetch happen; It’s not going to happen.”
“The desktop PC itself is almost dead except for gamers”
Perhaps for the BYOD crowd and consumer smartphone addicts.
I don’t see it happening anytime soon in any of the segments I am called in to maintain.
Good luck on Thursday.
Bob: Someone has to be the winner.
.
Me: Can I be the winner? *holds hand out for monies*
I don’t see anything in this article that required buying a computer for the purpose of posting it. These claims are looking more and more sketchy.
Nor any confirmation that his home truly burned down to the ground, in this post or the last. He likes to leave an air of mystery around his carefully selected vague word usage. He has misdirection down to an art form.
.
Announcer: “annnnnd for his next act, the great Bob the Cringilicious will attempt to discredit all mineserver backers. But first, a word from our sponsors…”
He also likes to make promises about what’s coming in a few days/weeks/etc.. and then not follow through with that timeline.
.
Stop making empty promises and just do it. If you look up the word procrastination, there is a picture of Bob in all his glory: Definition of Procrastination
It’s gotten to the point where you can’t promise “virtually all development work is already done so risks are minimized. If we had cases we could start shipping tomorrow.”, then miss a few Christmases, disappointing your backers’ kids (again), while doing everything you can to avoid the subject, without provoking a bunch of whiny people whose money you accepted asking uncomfortable questions about what happened to that Mineserver you and I promised my kid so long ago.
But Bob would never try to deflect blame for that onto his victims by referring to their attempts to break through his months of recalcitrance in refusing to answer their inquiries as a “Jihad”. No, that’s not Bob, that’s just some friend of his who is also so blind he missed which side comments like “Take the gas pipe already and spare us any more of your ridiculous kvetching“ were coming from.
To continue taking orders at mineserver.com is ridiculous.
BTW, who is backing up all of the data on these cloud devices ? Are they getting backed up ?
“hail, sir, it’s CLOUD! they gots them own backups!”
until MegaCloud goes bust. or they are backing up to /dev/null. or nobody told the intern the shiny side of the tape faces the outside edges of the drive. or a plane falls on HQ and the router tables are not replicated. or the beancounters decide that MonstraCloud is cheaper, and they don’t renew, but the word never gets down to the three IT guys who are depending on MegaCloud, and the data is never ported over.
So glad to hear that you have the surgery scheduled. Did the doc offer you a choice of what distance to focus the lenses at? I chose arm’s length plus 6 inches cuz I am a competitive pistol shooter.
BUT there is a rational reason too:
1. they can’t necessarily get them right, so off too far and you have distance; off too near and you have reading. It worked perfectly as requested, AND by golly I can both legally drive AND compute with no glasses.
Reading, and computing too, are more comfortable with glasses, but not required.
Happy Seeing
$240 per year PLUS $1-$4 an hour to do my computing?
Making some assumptions on computer cost and monthly usage, it looks like my break-even is somewhere between 3 and 16 months. Not gonna do it. Assuming I don’t need a serious system upgrade because of some yet-unknown killer resource-hog app, my computer will last years.
If Frame was smart, they would make the bare bones option Free All-You-Can-Eat Usage. The computing overhead for Office has to be fairly small, and that would help them get over the consumer’s mental hump of a monthly bill. Frame is probably targeting the Enterprise customer so I’m way outside their target segment.
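For anyone who wants to check my math, here is a minimal sketch of that break-even calculation; the PC price and monthly hours are assumptions I made up for illustration:

    # Minimal sketch: cloud desktop vs. buying a PC outright.
    # The PC price and monthly hours below are illustrative assumptions only.
    PC_PRICE = 800.0            # dollars, assumed one-time cost of a desktop
    CLOUD_FIXED = 240.0 / 12    # the $240/year subscription, per month
    HOURS_PER_MONTH = 80        # assumed usage

    for hourly in (1.0, 4.0):   # the quoted $1-$4/hour range
        monthly = CLOUD_FIXED + hourly * HOURS_PER_MONTH
        print(f"${hourly:.0f}/hr -> cloud costs ${monthly:.0f}/mo, "
              f"break-even in {PC_PRICE / monthly:.1f} months")
    # $1/hr -> cloud costs $100/mo, break-even in 8.0 months
    # $4/hr -> cloud costs $340/mo, break-even in 2.4 months

Tweak the PC price or the hours of use and you can land almost anywhere in that 3-to-16-month window.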
In the early 1940s, IBM’s president, Thomas J. Watson, reputedly said there was a world market for maybe five computers; decades later the same idea was declared on a blog as “The world needs only five computers.”
Now, nearly 80 years later, he was right. The computers are AWS, Azure, Google, Alibaba, and Oracle or IBM Cloud.
Mature markets tend towards commodities with a few big players. Computing is no different.
Rule of 2.5: a sufficiently competitive market, over time, will settle down to 2.5 competitors. That is, 2 big ones of roughly the same size, and a crowd of others that add up to about 1/2 of one of the big ones.
The Cloud business seems to be trending that way. AWS and Azure have huge backing and are sucking up a lot of business. The others probably still add up to a whole big one, so there’s room for fallout and mergers among them – hard to say which ones will die or merge away. The clouds can also repurpose: instead of being big general-access clouds, they will be focused on specific businesses – like Oracle and Adobe.
FWIW I agree with those who equate The Cloud with the former timesharing business. Ignoring the backing hardware, which is in fact very different, the business approach is very similar. My first paying computer job used the GE (business, not GEnie) timesharing service, and it was very like using the cloud now – various services available (for various fees) in addition to the overall service (charged by I/Os, CPU, and storage used, plus a “subscription” fee). Use their canned stuff or write your own (we did both). And when the budget cuts nailed a project or even use of the service, good luck getting your data back out (I printed stuff every way from Sunday during my layoff notice time – which was the only way to get the data back, short of keying it into some new system later).
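To show how similar the billing shapes really are, here is a minimal sketch of that old metered timesharing bill; every rate in it is invented for illustration, but swap the labels and it reads like a modern cloud invoice:

    # Minimal sketch: a 1970s timesharing bill and a modern metered cloud bill
    # have the same shape: a subscription plus charges for CPU, I/O and storage.
    # All of the rates below are invented for illustration.
    def monthly_bill(cpu_hours, io_thousands, storage_gb,
                     subscription=50.0, cpu_rate=0.10, io_rate=0.02, gb_rate=0.05):
        return (subscription
                + cpu_hours * cpu_rate
                + io_thousands * io_rate
                + storage_gb * gb_rate)

    print(f"${monthly_bill(cpu_hours=200, io_thousands=1500, storage_gb=400):.2f}")
    # 50 + 20 + 30 + 20 = $120.00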
Well I’ve mostly switched to using a chromebook so I should be ready for this brave new world, right? This year I got rid of the last of my windows pcs, but I’m keeping several Mac Minis, mostly for logic games that can’t be loaded into the chromebook.
I’m retired now so I don’t have any business need for this new cloud environment but I would like to see some sort of PC DOS app in the cloud. (Stop laughing at me!) I would love to recreate my old DOS setup in a cloud system I could access via my chromebook. It should be relatively simple; it’s mostly all text and some simple graphics. In there I could also run the desktop version of the HP 200LX palmtop apps. I only hope this comes true before my real palmtops go away…
I now return you to your boring reality.
So much for VMware Horizon desktop. Write off that “multi-billion, difficult-to-configure and expensive-to-run virtualisation software.”
“The desktop PC itself is almost dead except for gamers.”
And music makers, and video makers, and basically everyone who creates things. You may think this is a “small market” but it also is cheap enough to include those who intend to make things but never do, or do so poorly.
Basically everyone I know uses the cloud to some extent, where it makes sense. My email provider almost definitely stores my unread email on the cloud… until I collect it. Files I’m collaborating with others on are stored in the cloud for so long as they’re actively being referenced. Several apps I use surely use the cloud to sync data between devices.
As far as the “PC market,” the people I’ve mentioned may be a niche market, but Bob you yourself pointed this out a while ago: the upgrade cycle that the industry relied on to pump out fat profits died. As the technology powering PCs became more mature, the industry relied on leaps in hardware speed and capability to sell new boxes. That’s no longer the case. I’m a scavenger that has always relied on buying up 3- to 4-year-old technology for use because the prices usually dropped dramatically by then. I was recently in the market for some new boxes and I was surprised to see how different models had held their prices despite the passage of time. There just isn’t enough difference between 2017 and 2013 hardware to make much of a difference to people.
Now as far as the “Mineserver jihad”… I wouldn’t be surprised if you can’t tell this from your own POV but those people seem pretty resigned to an unhappy conclusion now. I’d advise you to tread lightly but you haven’t taken any practical advice from anyone on the proper way to address this so I don’t expect you’d start now.
Hey Faecal Encephalopathy,
The “Mineserver Jihad” is ENTIRELY due to YOUR OWN SCUMBAGGERY!
You are a giant Fucking Crook
You giant Asshole
Wow.. You people really take your Minecraft video games seriously, eh?
Good luck with your surgery and I wish you a speedy recovery!
Good luck on the cataract surgery. I had it done over 10 years ago and I did not consider it a big deal. If you wear glasses today the new artificial lenses correct astigmatism as well as vision correction so you may not need the glasses going forward except maybe for reading. Thanks for your articles – much appreciated.
I had both eyes done a few years ago and had no problems at all. My surgeon was a top surgeon for cataracts surgery. He was so good that he was recruited by another hospital and left the area.
Who?
Adobe made it clear about 2 years ago: it’s “cloud or die” with “CC” applications. So why waste time or money on a personal drive that will eventually fail, or software that has to have a serial number to activate? As far as software goes, Adobe has already won the war. Subscribe or die. That’s it. There is no going back. Live with it!
Screw Adobe, there are plenty of alternatives. Nobody needs to “subscribe or die”. Only an idiot would pay extra just for the ability to rotate a pdf, for instance, but Adobe seems to think it’s a good idea. Sure, there are plenty of idiots to take advantage of, but the rest of us laugh at the idea that we have to “live with it”.
No-Cloud discussion: https://forums.adobe.com/message/5530682#5530682
The main reasons for Adobe’s shift to the CC were financial, not technology and definitely not because the technology of cloud computing opened up a whole world of new possibilities that weren’t available in CS6. First, I think it encouraged the (let’s call them) “gray market” users to enter into the fold as a small monthly fee was more manageable to people who didn’t want to pirate products but didn’t have a spare $700 lying around. But way more importantly is that Adobe’s revenues used to swing wildly: when they had an upgrade for sale it was a feast, when they didn’t (or people skipped them, as many professionals did) it was a famine. CC essentially took these massive biyearly spikes in revenue and flattened them out. And third, unlike Microsoft or many other companies, you used to only be an Adobe customer for so long as it took you to evaluate and buy the product. Now they’re in touch with you (and pitching other services to you) all year long.
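Here is a minimal sketch of that revenue-smoothing effect, with made-up prices rather than Adobe’s actual price list:

    # Minimal sketch: why subscriptions smooth (and slightly raise) the revenue.
    # Both prices below are made up for illustration.
    perpetual_upgrade = 700.0      # assumed upgrade price, bought every 2 years
    subscription = 30.0            # assumed monthly subscription fee

    quarters = 8                   # two years, by quarter
    spiky = [perpetual_upgrade if q == 0 else 0.0 for q in range(quarters)]
    flat = [subscription * 3 for _ in range(quarters)]

    print("upgrade model     :", spiky, "total", sum(spiky))
    print("subscription model:", flat, "total", sum(flat))
    # Same customer, same two years: $700 in one lump versus $720 spread evenly,
    # and the subscriber never gets to skip an upgrade cycle.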
screw Adobe, I went with open source apps. resident open source apps.
Bob, your background is so workstation and still hardware-centric. I should write your next article for you. While I agree that engineering workstations may be a good explanation of what “the Cloud” can do, it is hardly the center of the Cloud universe. Look at Salesforce, they are Cloud’s future. Services provided to customers through Web and APIs, literally connecting up Coca Cola’s refrigerators in stores to their service cloud through IoT. Using mobile push to tell a customer that the temperature is too high in a cooler, or sending a service person to fix it.
You are trying to tell me that hardware doesn’t matter anymore, while I try to find the best Windows laptop to buy my soon-to-be-graduated high school student. Graphics connecting to the cloud now have to drive 4K graphics. That is what I’m shopping for.
But, back to Cloud. I cut my teeth on SABRE, AA’s “private network reservations Cloud”, for 11 years, starting in 1986. I literally helped bring Ethernet and TCP/IP to SABRE in the 1990’s. I worked for BEA Systems, helping build middleware for early Internet banking systems, helping build their early “clouds”. I worked on EDI systems bringing cloud to the exchange of business documents like purchase orders and invoices. It’s funny because I was building API interfaces to EDI long before Mulesoft.
My final point: customers want SERVICES, not Cloud. They want their systems to quietly run their businesses and notify them when things aren’t right. They want them to fix themselves, so they can go have a real round of golf, not three holes now and then. They want to be out selling and marketing while everything works in their network, in their connected CLOUDS of services.
Good luck on your surgery, but make sure you LISTEN to your doctor and his post-surgery recommendations. I can almost guarantee that you won’t be flying your plane, or even flying in any plane for a while. My brother-in-law recently had cataract surgery, and he had to go over some higher elevations to go home, which was a bit of a problem.
One of the possible things that can go wrong after cataract surgery is retinal detachment, which is no picnic. Be careful.
@phred14 On the bright side, Bob isn’t really looking to go flying right away, he just saw an opportunity to take a jab at the mineserver backers and took it. It’s kind of become his schtick. Expect more on that “in the next few days.”
He said sometime before Thursday. He didn’t say which Thursday the surgery is scheduled. Presumably it is the same place from which he fled.
Someone posted on Adobe why Cloud is so popular. It’s about MONEY, for both the users of the software and the providers.
On the provider side, sales of hardware servers (especially mainframes) and server software are “spikey,” meaning that as a new generation is released, there’s a positive spike in revenue and a huge push to upgrade, paying all the providers in the ecosystem. Then there’s a lull between the last generation and the next generation. Sometimes customers skip generations. Customers wanted 5-year upgrade cycles (or longer) to get their ROI payback, but providers want 2-year cycles (or less) because it pretty much pays them all the time. In the latest Cloud Services models, the customer doesn’t get to skip an upgrade cycle; they are upgraded whether they want it or not. Providers try to keep compatibility with older services in the model, but in the end, if something breaks, there’s a rush to “fix” it rather than the option to sit back on the old version, or skip a release. The bean counters at IBM, Salesforce, Microsoft, and Oracle LOVE the cloud services model because there’s a subscription revenue stream that is very predictable for Wall Street. Part of the subscription is in “future revenue” that they can show to investors to show that either they have that money (pre-paid) or the customer is “committed” for that money until the subscription is complete. It makes all the money people feel really good.
On the customer side, users just want SERVICE, like the power company. They want to use the service whenever they want, in the manner prescribed by the provider. If the service is not mission critical, then the infrastructure can be shared with other companies (like we did in EDI services), reducing the cost dramatically for everyone. One software product being used in 10’s and 100’s of computing grids or pods, over and over again. A cost-sharing dream, as long as you don’t cross the data streams, to use a Ghostbusters analogy. When it works, it works very well. When it doesn’t, it’s a freaking nightmare!
One more point. If you think that customers are going to access their clouds through Chromebooks, that may be true for low-level office worker types. But I would argue that a lot of those folks may be eliminated by what IBM and Salesforce and everyone want to call AI (Artificial Intelligence) now. Low-level, low-value jobs that can be done with XML, EDI, JSON, and automation will automatically be done through AI-enabled rules engines which check for fraud, funds, restrictions, and justification, and Sarah, who used to review and approve that PO, will be sitting on the unemployment line, as the work she used to do can all be automated now. Same thing with the invoices and shipping information. The next generation will be higher-end workstations for the people who design the rules and processes in the AI system, and who will need to see service aggregations and rules-engine rulesets, which will need larger screens and lots of UI interaction. Kind of like the 15″ Macbook I use now with two additional 27″ HD monitors. And I’ve done it with a 13.3″ Macbook Air as well. Enough computing power to drive the large screen monitors to visualize large rulesets and service workflows.
This is the first comment that actually makes me sort of agree with Bob’s prediction. Your super smart AI is going to be proprietary to the Cloud Keepers. It won’t be available to us desktoppers.
This is Bob’s prediction: “This seventh generation of computing will, within 3-5 years, absorb the vast majority of the approximately $1 trillion we spend in the USA each year on IT.” As you say, it makes sense for super smart AI, since desktop computers aren’t powerful enough. But if the cloud becomes so expensive, perhaps that will increase the market, and the technology available, for local processing and software, bringing back the desktop to save money.
In the end, advances in computing and networking are about providing more power and functionality at the hands of users and customers for less money. Open Standards help immensely, because they level the playing field and force providers to compete on price and quality, while providing mostly the same services. It gets bad when a few companies dominate an industry, when there are less than 5 competitors. It gets bad when providers collude to “stay out of each other’s way” like has been done with broadband Internet to homes. Then you’re left with two providers (like my neighborhood), or a monopoly. To the extent that Cloud Computing can provide those advances, and provide the “abilities” (reliability, predictability, availability, stability), and protect the customer’s data, at a lower cost, it will grow and flourish. But, fail just one of those points, and the customer’s trust is broken. And then they are rushing to put it all back into their own data center, and run it themselves. So, the providers are on the hook here. Can they really do what they promised? Or is it just another bag of hype, well marketed and thrown at “sheeple” (people who are sheep, just following the crowd) to get them to buy the next thing?
Really happy to see you’re back. Thank you for the column (as always, really intelligent stuff I can’t find anywhere else) but especially given the many misfortunes. Thank you for your perseverance also.
That being said, I have now read your article and this is the only time where being an ignorant moron who doesn’t understand the article is a plus. Yeah, you’re totally/slightly wrong on this.
I may have misunderstood, but is there something VM is planning that Microsoft can’t or won’t? ‘Cause where I sit as an ignorant moron, seems to me that Microsoft comes in, takes some cloud resources, offers us full out Microsoft Office Everything for the same per month fee as Hulu and All Us Ignorant Morons Sign up for that and nothing else.
Maybe not even Hulu.
From one moron to another, I agree, it’s very confusing. Perhaps the key phrase here is “cloud-native”. Microsoft Office claims it’s cloud based, but actually installs Office on your PC, running it there, because it makes more sense to do that. That way they get to charge people for an ongoing subscription, without hurting performance too much. On the other hand, “cloud native” apps actually run in the cloud, not on your PC, regardless of whether it makes sense. Think “time-share” from 1975.
The end of the cloud is coming
https://venturebeat.com/2017/11/04/the-end-of-the-cloud-is-coming/
You beat me to it echo ! I suspect just as we get used to centralising everything on the Cloud, the next-big-thing will be de-centralising away from the Cloud … like ZeroNet
Thanks. It includes the statement “the design of the internet where customers send data to programs owned by businesses is backwards and that we should instead send programs to customers to execute on their privately held data that is never directly shared”.
Small correction: Desktops *today* (and laptops) are basically dead except for those that require massive computing – gamers, CAD, Video/Audio Editors, and (to some degree) Developers.
Most everyone else will do just fine with a Chromebook + Cloud Services.
Desktop is just a form factor. For example, my first computer, running Windows 95, was a laptop. I’ve been using portable PCs that can be docked to a larger monitor/keyboard/mouse for the past 10 years. But that doesn’t mean we don’t need the power and screen size of a desktop. Websites are all getting increasingly complex, including and especially Amazon, requiring an i5 processor and a big screen for a comfortable experience. The OS running on those powerful chips depends on the apps you’ve become accustomed to using. I would find it very hard to replace the functionality of a few Autohotkey apps I use every day, but Autohotkey only runs on Windows. I’m not a programmer, so I find it hard to write my own Autohotkey apps, but I got a hold of a few, which I find indispensable, and miss having them on my Android phone.
As someone who started out in the late ’70s with IBM (in the hardware segment) and migrated to the software side of things, bouncing back and forth between big iron and micros until finally burning out and pulling the ripcord… I can only speak today from what I now see as a user on the sidelines vs. where I once was, eyeball deep in the mud with the latchkey geeks. (I’m now one of the ‘users’ that I used to complain about LOL)
That being said, it has always occurred to me that this industry is just like everything else in nature – cyclical.
First it was mainframes with terminals, then it was PCs, then it was PCs as mainframe terminals, then it was workstations, then came thin clients, then back to the desktop (’cause thin client “wasn’t ready”), then came the workstation in earnest, then distributed computing and even mainframes evolved into basically a glorified collection of PCs connected in a rack that looked like a mainframe form factor, then back to the desktop but leveraging local nets (real distributed computing?) ’cause Internet connectivity didn’t have a big enough pipe for all but the biggest wallets, etc. etc. Now the mobile revolution has gotten seriously started and the same phenom is cycling there as processor and connectivity bandwidth has become more accessible on a personal device level to support it…
And it all leads us to Cloud computing (hey – look Thin Client is back! sortof…) as we know it today. It’s a freakin’ wave function, constantly cycling. Only the periodicity/frequency has changed.
Personally, I don’t trust the Cloud for many of the reasons listed in these replies – but I’m still a user where it makes sense – for me – because at many levels, we’re all being forced to adapt in that direction by the industry. And the way I see it (again – ignorant and from the sidelines these days) – industry will continue to push users into the Cloud for now because that’s where they SEE THE MONEY is (subscription based predictability and revolving revenue models, etc. etc.) AND shareholders are smitten with it as the latest thing right now because… that’s where they SEE THE MONEY is right now.
But the first time a truly large Cloud service has a serious disaster, say in the area of security – take Equifax as an example of scale, more than a brass-tacks cloud tech example (can you say “too big to fail” – again…) – then the public will sour on the idea and run away for a while, their trust being broken. And shareholders will then say “Now get us out of this before we lose more of our capital” and it’ll be back to a more localized model. Until AI-driven distributed data storage and pathways to it, along with sustainable compression models (because data just keeps getting bigger and local won’t make as much sense – at least it’ll likely be marketed and sold as such, true or not), light a new fire and the cycle will continue… again.
IMHO, the desktop won’t truly be dead for some time – it’ll just fall out of favor, until it’s not.
Even now, in the age of “big data,” the workload is often being pushed out to the edge points (desktop, for all practical purposes, albeit in a mutated form) to do the heavy lifting, because currently neither the bandwidth nor the affordable processing power to sustain a shared (can you say “timeshared”) model for many users and varied applications simultaneously practically exists at scale… yet.
Until we have *affordable*, predictable, “room-temp”, & understood quantum computing tech in scale that also uses quantum-driven security for both data AND the pathway to it, along with palatable wireless access to it all, the distant vs. local computing cycle as we know it will most likely continue. And even then, who’s to say what reasons we’ll have (or be sold) to continue to propagate the wave function?
Most likely, just as today and in the most of the past; it will ultimately be driven more by MONEY than actual technological reasons… And because the (gr/)need for “more” will always be changing the landscape, that most likely means there will continue to be a cyclical component.
But hey, I’m just a “user” now… I could be wrong.
The business driver for the cloud (or for any other form of outsourcing) is not to save money; it is to obscure accountability.
Bob,
Do you think there’s a limit on Moore’s Law that we might be approaching with CMOS technology? Do you have any thoughts on potential alternatives like the application of generalized reversible computing? Michael Frank gave a fascinating talk recently published by Stanford Online about this alternative computing approach. (https://www.youtube.com/watch?v=IQZ_bQbxSXk)
I think the laws are falling away in favor of “communities”. When we got to about 1.8GHz with 4 cores on a CPU, Moore’s law did not matter as much as power consumption, except in specialized areas. Even now, the growth is in multi-processing by increasing cores and threads, while holding the power consumption down. Then the network law kicked in (call it Metcalfe’s law?) with the explosion of broadband Internet, where we literally have a billion other people and things to communicate with in realtime. Now those billion endpoints are looking for “communities” to compute in, whether it’s blockchain for a distributed business ledger or cyber-currency, or connecting a company’s IoT devices to a support community, or perhaps even coders using Bitbucket, JIRA and Confluence, or connecting all the points in a supply chain to an AI system to manage inventory and product delivery. The infrastructure is built; the key items now are connecting the right people, bringing new combinations together, and keeping the “bad guys” at bay. I think it’s ironic that the “new” Internet has been around 10-15 years now, with huge IP addresses and new routing rules, and yet somehow we’re still operating most of our businesses on the old IP addresses as Internet Providers provide “magical routing solutions” to allow the old to work with the new. Yes, hardware will advance, but the advances now will be less about more circuits in a square inch, and more about how everyone and everything can find each other and communicate in secure ways.
Moore’s Law hit the wall back in 2007. Processors since then haven’t gotten any faster. It was a question of the physical limits of silicon. Since then, you’ve simply gotten more processors on the same chip.
The limit they hit was heat transfer. Running faster than about 2.7 GHz, the silicon of the chip melted. For physical reasons, the heat lost on a chip is a function of both the current density and the frequency. Making things smaller reduced the current density and allowed for a higher frequency. There is, though, a smallest allowable size for the wiring. There are quantum reasons for that: tunneling, the Hall effect, things like that. That in turn limited the frequency that could be used without melting the chip substrate and destroying the processor.
The practical limit was reached then for a single processor. The absolute limit is still ahead of us a few more years. That limit is set by atomic structure. It’s less than ten years out. IBM established in the 1990’s that the smallest possible computer element takes about seven atoms. Such structures can’t be mass-produced. Yet…
The limit on cooling is what is driving the current attempts to abandon silicon for carbon. Silicon melts at a measly six hundred degrees or so. Graphite or diamond melts at something more like 2000 degrees. There are problems though.
So, when do we reach the end of Moore’s Law? Sometime between ten years ago and ten years from now, depending on how you measure it.
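If you want to see the arithmetic behind that heat wall, here is a minimal sketch using the standard dynamic-power relation (switching power scales roughly as capacitance times voltage squared times frequency); the capacitance and voltage figures are illustrative, not measurements:

    # Minimal sketch: why clock speed stalled. Switching power scales roughly as
    # C * V^2 * f, and higher clocks generally need higher voltage, so the heat
    # to be removed climbs much faster than the frequency. Numbers are illustrative.
    def dynamic_power(cap_farads, volts, freq_hz):
        return cap_farads * volts ** 2 * freq_hz   # watts

    C = 1e-9   # assumed effective switched capacitance, farads
    for volts, ghz in [(1.0, 2.0), (1.1, 3.0), (1.3, 4.0)]:
        watts = dynamic_power(C, volts, ghz * 1e9)
        print(f"{ghz} GHz at {volts} V -> about {watts:.1f} W of switching power")
    # Doubling the clock (and raising the voltage to get there) more than triples
    # the heat coming off the same few square centimeters of silicon.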
I know it won’t matter to most, but perhaps all of the Mineserver backer haters can start to understand some of our frustration when Bob sets a timeline and says he’ll do something:
.
“The second column will appear sometime before Thursday when I have scheduled cataract surgery on both eyes. Provided nothing goes horribly wrong I should be back driving and flying next week and will address at that time what my old friend Adam Dorrell calls “the Mineserver Jihad.””
.
And then just flat out misses the deadline (usually by weeks mind you, sometimes by months) and then when he finally posts, acts as though nothing happened and we’re being impatient. Perhaps time works differently for Bob, but he needs to stop making promises with deadlines if he doesn’t plan on keeping them.
.
I know he can’t see and might have had his home burned down; Totally understandable if he doesn’t write here often, but then don’t set the expectation saying you’ll do something if you don’t plan on keeping it. If he instead said “things are a little hectic now, but expect a second column about this when I am able.” that would convey the same message with a more flexible deadline. When he keeps making promises with hard deadlines and breaking them, you can understand why our faith in his word is continually diminishing.
.
(and for the devil’s advocates who may say “he never said THIS Thursday. Chill out!” I’ll just tell you and/or Bob to stop being a troll if that’s truly how you think anyone would interpret “before Thursday”. If I told you I’d deliver your pizza by 6pm but didn’t mean until 6pm weeks from now, I guarantee you’d also have a problem with that. Stop defending him and let him speak up for himself. He’s a big boy.)
He didn’t say Thursday, he said “Thursday when I have my cataracts scheduled.” He didn’t say which Thursday that was. My guess is he was setting up his excuse ahead of time.
@MikeN Yes, I agreed that could be the case, which was addressed in my devil’s advocate disclaimer at the bottom. I would argue, however, that in general people don’t talk that way. When you say “Thursday” it implies this upcoming Thursday; If not, you say “next Thursday”, which implies Thursday of the following week. If it’s beyond that, people include a date because at that point, when you add a day of the week, it’s too specific to be generalized indefinitely, which it sounds like you’re arguing Bob may have done.
.
If his cataract surgery isn’t for a few weeks, then why would he include the day of the week? We are just supposed to check back on his blog every Thursday until something shows up? Most people over the past few months have commented that they were amazed he still hadn’t had his cataract surgery so waiting indefinitely may be some time.
.
I’m pretty sure the more reasonable case here is that he, once again, promised and missed a deadline, which has become his MO, but I guess we’ll see whenever Bob manages to get around to updating us.
You know what they say about doing the same thing and expecting different results? Perhaps this is one, long psychology experiment.
I agree it would have been MUCH BETTER to promise nothing and then surprise everyone. An honest note saying “Check back in 2-3 months” would have done a world of good. I’m about to give up on this guy. Maybe I’ll check back in 3-4 months when, possibly, he has had his surgery and has started putting his life back together. He’s beginning to act like Cameron Howe (disappears for five months and doesn’t build the browser) or Joe McMillan (disappears for more than a year after flooding an IBM mainframe room and trashing $2 million in mainframes) from Halt and Catch Fire.
I don’t think having the capability to run desktop apps in the cloud means that is a signal that there’s a new tipping point. Running a DOS terminal inside a GUI isn’t a tipping point for GUIs and being able to RDP in to a desktop from your iPhone isn’t a tipping point for mobile computing. Cloud apps/services have fundamentally different advantages, not just a layer of cloud sitting around a traditional desktop app.
If anything, saying there is a strong requirement for access to desktop apps even when we have today’s decade+ old cloud means the cloud hasn’t replaced ALL desktop apps and desktop apps are still providing some unique value.
The Cloud rests on a belief in the stability and dependability of our various infrastructures: the electrical grid, the internet, corporate capitalism, non-totalitarian government. I think it is naive.
RE: “The Cloud rests on a belief in the stability and dependability of our various infrastructures” So does civilization.