Now, finally, to my predictions for 2019. This is, I believe, my 22nd and possibly last year of looking ahead, so I want to do something different and potentially bigger. Our old format works fine but I’ve been pondering this and I really think we’re at a sea-change in technology. It’s not just that new tech is coming but we as consumers of that tech are in major transitions of our own. It has as much to do with demographics as technology. So while I’ll be looking ahead all this week, coming up with the usual 10 predictions, I want to make sure we all understand that this isn’t business as usual. This time it really IS different.
I’ve been thinking about 50 year cycles. The year 1968, which was 50 years ago when I started writing this column back in November, saw huge social and political upheaval, with student riots all over the world, the rise of the hippy movement in the USA, the Summer of Love and the founding of Intel. Most of the technical progress we have seen since 1968 has been driven by microprocessors, which were largely the work of Intel. And it took 50 years, but we’re now approaching the Internet of Things, where processors will be in everything and everything will be linked or monitored, which is either good or bad depending on who you are.
Fifty years before 1968 was 1918, when America became a world power by participating in World War One. The American Century was assured by the end of 1918, because we were on the winning side yet suffered no battlefield carnage. It’s not by accident that IBM, for example, emerged from that war as a maker of something we’d not really seen before — business machines. Between IBM and National Cash Register (NCR, where IBM’s Tom Watson Sr. had started his career) the information technology industry was born, but we just didn’t know it.
1968 was another year of transition, where political and social institutions were challenged in a terrain that was increasingly technology-driven. The kids who were protesting in 1968 (at least the American kids) were the children of World War Two veterans who were, themselves, the most educated in history because of the GI Bill. The kids of 1968 were smart and educated and solidly middle class where their parents had been wizened children of the Great Depression.
The kids in 1968 were the first young people since the 1920s who could afford to be radicals.
1968 saw the last newspapers set using Linotype machines. Offset printing was high tech after almost a century of Linotype.
1968 was also the year when Doug Engelbart — handsome as any leading man — gave his famous Mother of All Demos at the Fall Joint Computer Conference in San Francisco, showing us 50 years ago most of the technology we use today.
And 1968 was the year the Arpanet (commissioned by Bob Taylor in 1966) went out to bid; its first nodes came alive the following year, and what we now call the Internet was born.
Signs of change are all around today. We have an American President who attempts to rule through direct social media communication — the ultimate extension of FDR’s Fireside Chats during the Great Depression — yet he’s also a President who doesn’t use a computer and never has. Somehow that’s fitting, because technology news in 2019 is mainly about mobile phones and video gaming — entertainment technologies. Between games and phones, the personal computer has been made pretty much irrelevant in the news and probably in life as well.
And here’s why: The MacBook Pro on which I am writing this column was built in 2010 — nearly nine years ago. Who would have thought any computer would be remotely useful 3+ technical generations later? Yet the only reason I’ll replace it, even now, is that my hardware (and especially my GPU) isn’t supported by the newest version of MacOS. My notebook crashes a couple of times a day because of software incompatibilities, but otherwise it runs just like a new one.
I’m interested in what’s going to be the outcome of this transition and I’d really like to hear your thoughts, too.
AI is going to be a huge factor, changing the way we work, but also the way we do everything else. Look at the rise of e-sports. When will e-sports transcend real life sports? The value of a good fantasy football quarterback is already higher than the same quarterback’s value in the NFL.
These are the sorts of things I am thinking about. I hope to present them in a series of columns this week that look to 2019, sure, but this time choosing trends that I think will have legs beyond the coming year.
Everything is changing and nothing — nothing — will ever be the same again. I hope that’s a good thing.
“Between games and phones, the personal computer has been made pretty much irrelevant in the news and probably in life as well.”
Actually PC sales have remained pretty steady for the past three years, despite phones and tablets, so this is not correct.
Quarterly personal computer (PC) vendor shipments worldwide, from 2009 to 2018
It’s fine to browse the internet and do social media on phones, but for real work of any kind you still need PCs.
I’m also highly dubious about ’50 year cycles’, and I don’t see any reason to believe there will be major technology changes in 2019. Things will be about the same at the end of this year, except for incremental improvements.
So, technological stagnation (this should be obvious by now; you just have to look at the CES show last month – no “Next Big Thing” was even hyped), and more noise and garbage in the so-called tech world (I prefer to refer to it as “Just another database with just another front end”).
When people come to realize that ML (when you say AI, I assume you refer to Machine Learning) is a scam that, even if it were to work, creates many more problems than it solves (fake news writing, deep fakes, unreliable surveillance, etc…), we are going to have a crash that will make the 2000 one look like nothing. And this time it’s not just the VCs betting their money in the bubble; pension funds are big into it as well…
“Everything is changing and nothing — nothing — will ever be the same again. I hope that’s a good thing.”
You mean that you are going to tell the truth about those M1neServers?
Mineserver: nothing will ever be the same again
Looking forward to all these new predictions, Bob! Hoping you are able to keep to the schedule you’ve posted for yourself.
Ask a 20 year old today about IBM or what it is/was and most likely you get a blank stare or a simple “I don’t know.”
Which is odd – IBM does have state of the art hardware (with Nvidia) and (probably?) software for machine learning (…and maybe AI as well? Bob, would you consider Watson to be ML or AI?). However, all the projects they were developing over the last few years to take advantage of that expertise seem to have failed to make much of an impact (Watson oncology, several medical image analysis projects with ML, that automated analysis of research data in the cloud — “put your data into our cloud and our system will find patterns in it for you without you having to do anything” — etc…). So, what is not working?
Some of what you say is very true; I just recently replaced my Dell Latitude E6430 i5 with a newer – but refurbished – E7470 i7. Not because I needed to, but because I liked the E7470 I just got at work.
I do probably 85% of my personal everyday computing on an HP x2 Pro G1 I bought three years ago off of Woot.
I’d like to buy a new 75″ Sony 4K TV, not because my current 55″ FHD model doesn’t work any more, but because I want a bigger screen.
The things that were driving technology change just aren’t any more.
However, I now have a FireTV Cube, which I use a lot more than the FireTV Stick I’ve had for a couple of years. I have a Ring Video Doorbell 2, and I just ordered a Nest Thermostat, along with a couple of smoke/CO2 detectors. All of these are newer, connected devices that kinda sorta work together, in that I can tell Alexa to do something with the other devices, such as turn the heat up, or show me the live video from my front door.
These devices are what is interesting about technology these days. They aren’t particularly ‘smart’, but they can do rudimentary things for us, after we spend some time to set them up properly. They show a glimpse of where we are headed, and the more devices that come out, and the better they work together, the more they will become essential, natural to have around. For some that might be scary. For me, the only scary thing is that some of these require a mobile device in order to set them up: no Windows (or even Mac) PC allowed, not simply not required. So I found a use for the nearly ‘free’ iPad Mini I got when I bought a new iPhone for my wife a couple of years ago. Me, I’m still rocking my Lumia 950XL, but I’m about to get on the Galaxy S10 bandwagon…
I agree with most of the previous comments. You remind me of Ohio (4 dead, 1970, Nixon) (I’m NOT a commie), and of your wonderful book “Accidental Empires” and movie(s). Computers – as we know them – WILL prevail (we need them to write programs, or I will write programs on my Raspi to run on …(?), and on my Arduino, to drive something to make a living). IoT bullshit will continue, bitcoins will die?
Keep writing and be good. Best wishes to you !
The US lost over 50,000 soldiers on WW I battlefields. Far fewer than the major European combatants, of course, but hardly ‘no battlefield carnage’. Roughly as many US battlefield losses as in the Vietnam war.
50,000 … we lose close to that many people every year to overdoses, shootings, transportation accidents, etc., and we don’t see that as significant these days – Bob is right, our view has changed.
> The US lost over 50,000 soldiers in WW I battlefields
First I read it the way you did, then realized the important part was that none of the battlefields were in N. America. The USA didn’t have civilian casualties, masses of homeless/displaced people on our hands, and we didn’t have cities to rebuild.
I think the changes in the next n years are going to be far more sociological than technical. The computing will become less and less obvious and far less trustworthy, and the result will be a world where you not only can believe what you want, but you kind of have to. People will have their own facts because the video on the news is not going to be discernibly real or faked, and the output of dozens of sensors being compiled into a positive ID of a suspect might not be visual (phone location + Fitbit + heart monitor + Nike shoe-ware + Costco card RFID + sidewalk traffic monitor + readings from nearby car security IR sensors + doorbell night vision camera green screen + God knows what by 2025.) We’ll need courts to keep up, news organizations to keep up, business models for anything other than huge organizations to survive, and our government must develop some way to represent the people instead of the giant organizations and swarms of lobbyists that are driving its agenda today. Otherwise today’s malaise will be tomorrow’s fond memories.
Looking beyond some new but ultimately insignificant gizmo that might be introduced tomorrow and instead envisioning 50 years out, it feels obvious that straight-line extensions of existing technologies will not be the center of technological progress. For example, will a few more x86 cores in a CPU change the world?
The challenges of the future, and hence technical focus, probably won’t be in the areas we’ve become accustomed to looking. Consider climate change: Either life will change dramatically, or totally new technologies will evolve to stop changes already underway, or both. I suspect climate change will impact society at least as much in the next 50 years as the microprocessor revolution impacted the last 50 years of society.
Similarly, North Korea has demonstrated that a small, poor country can develop weaponry previously associated with superpowers. And 3D printing makes extremely inexpensive, hard-to-detect handguns possible. Will homo sapiens over the next 50 years develop new neuroses based on fear of destruction, or will an improved social morality develop that allows disagreements to be expressed in more agreeable ways?
Technology will continue to change the world; looking 50 years out, changes will be in areas far beyond those limited to silicon-based tech toys.
But there were no battlefields in the U.S., so no loss of infrastructure, monuments, production capacity, etc. No damage to production capacity is particularly relevant to the beginning of the “American Century” because the US could produce goods that rebuilding European countries would need, a situation which would be repeated around thirty years later.
And regarding 50,000 casualties, yes that was significant, but that same year there were over 500,000 deaths from influenza in the U.S. (by some estimates over 650,000 US deaths), so battlefield casualties were far from the majority of deaths in those years.
I think improved health will be a big thing in next year and decade, enabled by the information that can be collected by improved technology. Continuous Glucose Monitors (CGMs) and other instantaneous, continuous health monitors will give people a much better understanding of how diet and exercise actually affect their health.
Bob,
One false assumption I believe you are making is that the future will look like the past. In other words, ongoing innovation and, most importantly, ongoing growth: population growth and economic growth. These assumptions are probably right for the short term, but probably wrong for the medium to long term.
The fly in the ointment is of course global warming.
Global warming will negatively impact economic growth, whether it be by diversion of capital or by direct impacts of a hotter world.
No system with finite resources can sustain unlimited expansion and growth.
You can “fix” your 2010 Macbook Pro’s GPU problem by installing gfxCardStatus:
https://github.com/steveschow/gfxCardStatus
But there is a trick to stopping the GPU crashes with it: upon starting up, you have to kill any programs that are using the GPU, such as Chrome, before you can click the gfxCardStatus icon and (successfully) select “Integrated Only” from the drop-down.
As for 1968, the tectonic force splitting the body politic asunder throughout the West is immigration which, coincidentally, was unleashed on the West by the Immigration and Nationality Act which went into effect June 30, 1968. Over 90% of the public opposed increasing immigration at that time and forward for decades. During those decades immigration _rates_ continually increased to the point that it changed the character of “the public.” The voting patterns are forever greater centralization of social policy. This means that everyone must, as a matter of self-defense, become supremacists. If they “tolerate” other social preferences, they will find themselves human subjects in social experiments to which they did not consent. This means war. The geographic pattern of voting shows a clear urban vs rural divide, so the war — when it goes hot — will entail taking down the physical infrastructure of the urban areas.
Those who are stuffing the ballot boxes with _people_ to gain political power are guilty of their coming genocide.
I think the bright future is in evidence-based decision making. A surprising number of ‘best ways to do something’ are just established ways of doing things, not backed up by evidence.
Evidence-based approaches have failed because of the cost of collecting the evidence.
The internet enables virtually zero cost distribution of data, search engines allow people to find the data.
Powerful computers make it possible to analyse lots of data.
We already have the evidence-based movement in medicine.
My own contribution is evidence-based software engineering: http://www.knosof.co.uk/ESEUR
I took part in a hackathon at the weekend working towards evidence-based campaigning: http://shape-of-code.coding-guidelines.com/2019/02/24/evidence-based-election-campaigning/
“Actually PC sales have remained pretty steady for the past three years, despite phones and tablets, so this is not correct.”
A lot of businesses (and education) buy computers on a three-year depreciation schedule, though, so steady sales may just mean computers are being replaced because of accounting, not because they are useless.
We’re thinking (for the first time) of replacing our laptops at work with Surface Pros, and I don’t think we’ll be the last to consider that (if iPad Pros could handle big displays and keyboard shortcuts I’d seriously consider one too, as I already use mine while traveling instead of a laptop).
My biggest question is can democracy survive the commoditization of attention. Readily available media devices will draw ever more attention – think of AR glasses. Content is driven by commercial interest. Big brother is between the frames and independent, fact based thinking and judgement will become ever more challenging.
Meanwhile we are facing the greatest threat our species has known, Global Climate Change. This will exceed World, Nuclear, Chemical, and Information wars in its threat level to human survival. It is slow in relation to human perception and its symptoms are contradictory and subtle. Emotionally, humans are poorly equipped to deal with it.
A way out would be leadership, but our leadership models are broken, as is the population’s trust in scientists and other experts.
Economic displacement will increase, people will not understand, and turn to the readymade answers of populism (fascism) to turn their frustration into hate. Populism works better in media than rational liberalism, emotions work better in the competition for attention.
This will not be resolved in a positive manner by waiting for ideas to evolve. We need organized thinking and action to work our way through this.
Looks like there is no way to edit all my typos. Sheesh…
“I hope to present them in a series of columns this week that look to 2019″
.
So when you say that do you really mean “this week” or whenever the hell you feel like it in the coming 1-2 months per usual? I just want to make sure we’re on the same page since we typically are not…
Bob,
Please say something insightful about AI.
Most of what I read is repeating drivel, written by people with no understanding of the tech or the limitations. One more fluff piece on how “AI changes everything” or “AI takes all jobs” and I will run screaming around the cubicle farm with my underwear on my head.
Your readers are technical enough to understand what ML and AI are and how they are implemented. Talk some engineering and deliver. Moore’s law is slowing, and that will slow AI too. Please!!
“Between games and phones, the personal computer has been made pretty much irrelevant in the news and probably in life as well.”
.
“Everything is changing and nothing — nothing — will ever be the same again. I hope that’s a good thing.”
.
So, is all that really just a long-winded way of saying you’re not going to bother doing anything about the Mineservers?
On WWI: the carnage of soldiers is unmistakable… but unlike Europe, we had no destruction on our own soil. That ended with 9/11.
Next time around, and it could start with the IoT hacks, there won’t be anybody anywhere to count the damage after a few months of the radiation clouds circling the globe. And we all have leaks in our homes, primarily TV streamers and security devices. The irony abounds… security devices have crap security on the net.
“Between games and phones, the personal computer has been made pretty much irrelevant in the news and probably in life as well.”
I have shifted to using my phone perhaps 50% of the time for the trivial – texting, checking email, RSS feeds, quick web searches, etc. – and I see that shift around me, especially among the youngsters (I’m 72 so that includes almost everyone). One factor that will keep a PC/laptop/convertible around is simple ergonomics.
Anyone who needs to do serious work on a computing device for long periods regularly will either stick with a PC-ish device for those times or they will come back to it when the physical problems – carpal tunnel, etc. – show up. I don’t see the human form changing any time soon.
…ken…
To help ruminate with a wider view, consider this chart [1]:
.
Data is from mid-2014, but still very insightful. As you can see, the US population appeared as follows:
22% gen-z
23% millennial
21% gen-x
24% baby boomer
9% silent generation
1% greatest generation
.
The younger generations are showing signs of being more thrifty [2] and more entrepreneurial … some people oversimplify this to say they are more fiscally conservative, and there may be some truth to that. Health and wellness are a daily consideration, such that their lifestyle itself reflects a desire to take care of their bodies and minds [3]. While they put off major milestones for later, they still want marriage, children, home ownership, etc. Debt is not popular, and student loan debt is an ever-present source of anxiety.
.
Their expectations of life are well-writ, and technology has played a role everywhere: any business that wants their money will have its wares fully described online with a publicized price and for public review (and peer reviews are a big part of the purchase process for items large and small); services like Uber and Door Dash will help them go about their day without the requirement of owning a car if they don’t want to; quality is an expectation and will be carefully considered along with price when making buying decisions.
.
… this chiefly describes where we *are* … it’s a necessary starting place for where we’re *going*, but it’s just that: a starting place.
.
——————————————————————————–
.
[1] https://www.businessinsider.com/goldman-sachs-chart-of-the-generations-and-gen-z-2015-12
.
[2] https://www.vice.com/en_us/article/ppveam/millennials-have-discovered-going-out-sucks
.
[3] https://www.goldmansachs.com/insights/archive/millennials/
Bob, you were at some point a great tech writer. That was when you were younger, when you got all the tech info firsthand and had the time to read and follow everything. Now you don’t, and I can see that. You just don’t stay informed any more and you are irrelevant. It is time to hang up the hat while people still assume you are a smart and relevant guy.
I told you I can see that you do not know much about health care, AI or Apple, and now I can see you know even less about history.
– WW I – World War I started on July 28, 1914, after the assassination of Archduke Franz Ferdinand, and ended on November 11, 1918. The US entered WW I on April 6, 1917, more than two and a half years after it started. The US was NO FACTOR whatsoever in WW I. WW I finished because the Germans were unable to supply their troops on the western front, and the US had nothing to do with that.
– AI – Did you ever hear about DeepMind, Bob? That is an AI company owned by Alphabet (Google).
Demis Hassabis, one of the founders (PhD, University College London; Cambridge before that), does not believe in AI the way you do, Bob. Link is here:
https://venturebeat.com/2018/12/17/geoffrey-hinton-and-demis-hassabis-agi-is-nowhere-close-to-being-a-reality/
Did you ever ask yourself, Bob, why only PhDs from Carnegie Mellon University are bullish on AI?
Have you ever seen PhDs from Stanford, Princeton or Berkeley giving such bullish statements about AI?
For your info, Bob, Carnegie Mellon University is the 37th-ranked university in the United States. It cannot make the top 100 in the world.
Vaporware for guys like you Bob.
My personal opinion about all those PhDs in AI from Carnegie Mellon (I could be wrong) is that some hedge fund had a lot of money and, instead of paying taxes, used it to fund so-called AI at Carnegie Mellon. They now start companies that will go public or be sold, so they will make even more money although there is just nothing there, and they are using people like you, Bob, to get good vibes in public.
Old Romans would say Quid Pro Quo.
Bob,
I always get excited (and scared) at the same time when you say, “Everything is changing and nothing — nothing — will ever be the same again. I hope that’s a good thing.” Why? Because every time you’ve said it, it’s come true.
“I hope to present them in a series of columns this week that look to 2019″
@Gerard beat me to it….but based on Bob’s previous estimates, I figure we’ll see these columns closer to July 2019.
One-year predictions are next to useless. They exist solely as an excuse for pundits to produce words and snare attention. Nothing changes enough in one year to make such prognostications even remotely interesting or useful. “It takes 10 years to become an overnight sensation.” I would be much more interested in 5 and 10-year predictions.
@Ross — “I would be much more interested in 5 and 10-year predictions.”
.
Here ya go: In 5 years, Crookely still won’t have posted an update to the Mineserver project on Kickstarter.
.
In 10 years, more of the same.
@Roger “In 5 years, Crookely still won’t have posted an update to the Mineserver project on Kickstarter. In 10 years, more of the same.”
.
I was about to say “Well now Bob may post to KS just to spite you” but then I realized you created a win-win situation. He either does exactly as you say, and shows he’s an ass (both to his loyal readers and the backers), or he posts there and everyone gets what they want, ending all of this nonsense. You can’t lose! Well played…
Here’s a prediction:
.
In 5 years, Roger will have posted an additional 250 comments bitching about Mineserver. Several hours of his waking life spent fruitlessly shouting into the void.
.
In 10 years, Cringely will have moved on. A new business venture that we in 2019 wouldn’t even understand will dawn on him in the spring of 2022. This time, he’ll wisely not put himself in charge, but recruit a powerful team to make it happen (no mention of this in the above 5-year time frame because it’ll still be in stealth mode then). Cringely’s sons will be independent of this, pursuing their own ventures – one in electric aircraft, the other in a wholly different business venture nobody in 2019 would think of. Cringely himself will chuckle from time to time about the Mineserver mess, and aside from that not give it another thought. An embarrassment from the past … perhaps like a Virginia Democrat’s yearbook photos.
.
… and Roger and his friends will *still* be bitching to anybody who will listen (a small number indeed). Hell, by 2029, few people will remember there once was a game called Minecraft which people played on archaic personal pocket computers called phones … but Roger will remember: bravely carrying on the torch of ineffectual, impotent pointlessness. And I’ll be there next to him, counter-complaining.
@Howard — You and me forever, partner!
“recruit a powerful team”…
Were I one of these “powerful”, my first question to Bob would be “Can I trust you?” Because, based on the Mineserver fiasco, it appears you are not a man of your word.
One more time: The Mineserver posters are not asking for a product, or for their money back. They just want what every Kickstarter project promises: transparency, and an occasional update.
Bob could end this in 15 minutes, by posting an update on the Kickstarter site. For reasons unknown, he doesn’t, and his reputation continues to degrade.
I’ve enjoyed his writing for many years, but the way he’s (not) handled this is a real head-scratcher.
Well… I’m writing this comment on a Windows PC I built in March of 2012. I guess I will upgrade when HDMI 2.1 and DisplayPort 1.5 become available to support a new 40″ 3840×2160 HDR10 monitor at 120fps. Otherwise, there really won’t be much difference between the Windows PC I built in 2012 and a newer PC built with 2019 PC technology.
I also have an iPhone, an Apple Watch, an iPad and an Apple HomePod… and an HP 17″ Windows 10 notebook PC I purchased in December of 2017. I really use all of this stuff quite a bit. I guess I use the HomePod and the iPad the least. I use the 2012 Windows PC the most because that is what I use for “real” work and it is the most effective tool I have. But the iPhone is perhaps almost as important because it provides connectivity when I am out and about – but I can’t do “real” work on an iPhone. My family has 6 iPhones, 7 iPads, 2 android phones, a landline, 6 cable connected TVs, 6 Windows notebook PCs and 2 desktop Windows PCs, etc, etc.
I know most families would not have this much technology around. I guess desktop PCs are kind-of going away… I’m not sure – but, almost everyone I know has a Windows notebook PC or an Apple notebook PC to do “real work” because they don’t have space for a desktop PC.
So, I know there are a lot of people who do not have the wherewithal to keep so much technology around.
I have to agree with your assessment, Bob, looking at it from the perspective of a technologist outside of the computer biz. Things definitely run in cycles; how long a cycle is, is arguable, but cyclic it is. When I look back over the history of technology, things always seem to enter a stagnant period before they break loose again. I know my own area of work is definitely in a stagnant phase.
One thing that is for sure about these transition periods is that what we think now is the likely driver is very unlikely to be the driver in retrospect. Another thing we need to keep in mind is that it will seem slow while we live through it, but blindingly fast in retrospect. So get ready for the ride!
Having said all that…. I will prognosticate that typewriters and burn barrels will be the next big thing as organizations strive to reduce their exposure to cyber warfare and IP theft. 😉
“…but for real work of any kind, you still need PCs.”
Golly, that sounds familiar. Possibly because I’ve been hearing that “real work” purity test for decades.
Sorry, but that’s changing, too. “Real work” also gets done on phones and iPads now. You might not consider it “real,” but the younger people relying upon it might beg to differ.
Hey Bob, are you missing the human rejuvenation revolution?
https://www.undoing-aging.org
🎁 = https://www.youtube.com/playlist?list=PLEaHTFl1r192vkXHV-sj7EX_-gx1QKOqW
🎁 = https://www.youtube.com/playlist?list=PLE8svqV6H4wb4GLyjGOuAdLWX1V4d16hW
🎁 = https://youtu.be/5Z1gfgM7kzo
(1:11 – end 😂)
👍 🌟
💀 = https://www.youtube.com/watch?v=cZ7i2_kn_zE 😢
☀️ = https://youtu.be/7MmEVWI8Ieo
So… is the Windows style desktop PC as we know it going to be significant in the future?
I use the term “real work” not as a pejorative reference to other tech alternatives to Windows style desktop PCs – but, as a shorthand term that most people would understand.
Lots of individuals are thinking about how a Windows style desktop PC is going to be a part of the future.
At this point, in my opinion, if you need to interface for several hours with various kinds of information that requires a significant amount of input from the user – it is almost impossible to find a competitive alternative to a keyboard and mouse with a graphical-user-interface.
But, I can recall individuals showing how efficient slide-rules are for calculations – yet, I knew that slide-rules were going to be non-existent the moment I tried out an HP handheld calculator.
Often while reading the news on my iPad, I accidentally end up at a video of the news – which with great annoyance I quickly shut down. Video is just such a very slow way to gather information.
I really do think that reading and writing is going to remain a primary way to communicate for the indefinite future… am I wrong?
How long will keyboard-mouse last as a primary tool in the business world? I’m guessing at a minimum at least 10 years… maybe even 100 years. And, therefore, I’m guessing Windows desktop style PCs are going to be important in the future.
Granted the market for Windows desktop style PCs is mature and we don’t expect it to be too much different in growth and development than – for example – something like kitchen refrigerators.
I’ll say that “real work” happens on a machine that can handle spreadsheets, slides, and WYSIWYG word processing … i.e. Excel, PowerPoint, Word (Microsoft) … or Sheets, Docs, Slides (Google) … or similar. Phones, tablets, and phablets just aren’t up to that (and for the rare tablet that has a keyboard and mouse attached: you’re proving my point).
That’s a long way of saying: PCs aren’t dead. Laptops are gaining PC market share, of course, and towers are far less useful or necessary in comparison to laptops than they used to be. That’s certainly a consequence to be aware of.
@True Rock: Often while reading the news on my iPad, I accidentally end up at a video of the news – which with great annoyance I quickly shut down. Video is just such a very slow way to gather information.
Video is a very effective way to deliver advertisements, which is why the @$#! things auto-load in the first place. Remember in the context of news you’re less the consumer than the product being sold, and the actual customer is Pepsi, Nike, etc.
I’m imagining all of the code I sling for work and cannot comprehend ever doing this on anything other than a PC in any short term with what’s available as an alternative. I feel “cramped” without 2 monitors so I don’t see myself (or others in my field) making that transition to something more compact to replace our PCs.
Perhaps we aren’t the target audience you refer to and you’re thinking more the business people of the world and my profession is the exception, not the rule.
Rob X (what does the X stand for?), you’re out by 5 years: 1963 is the seminal year.
1. Kennedy dead — new direction > Nixon
2. SDS rises — kids take control, leading to S. Jobs’ insights
3. LSD use — depression’s best medicine, see https://www.abc.net.au/radionational/programs/allinthemind/david-nutt/10829146
4. The pill liberates women
Not a prediction, but a wish: real competition for Facebook, in the form of a low cost subscription social network. Apple?
Any paid-for site is not likely to draw users away from Facebook when Facebook is free. It’s not like Facebook is doing anything particularly high-tech; with the skills learned in a high-school HTML class, you can make a website with some pictures on it, and attach a text box to each picture with a PHP script behind it that lets people post comments. In 5 minutes, you can develop a website that does what Facebook does. People have absolutely no reason to pay for something that can be trivially provided for free; when was the last time you paid for a web browser or a picture viewer?
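For what it’s worth, the bare mechanic this comment describes (a text box attached to each picture) really is simple to sketch, though calling it “what Facebook does” glosses over identity, moderation, persistence, and scale. Here is a toy in-memory version, in Python rather than the PHP the comment mentions; all names are made up for illustration:

```python
# Toy in-memory "photo comments" board: illustrates only the comment-posting
# mechanic the commenter describes. No users, no auth, no persistence,
# no feed ranking, no scale -- all the things Facebook actually provides.
from collections import defaultdict

class PhotoBoard:
    def __init__(self):
        # photo id -> list of (author, text) comment tuples
        self._comments = defaultdict(list)

    def post_comment(self, photo_id, author, text):
        """Attach a comment to a picture."""
        self._comments[photo_id].append((author, text))

    def comments_for(self, photo_id):
        """Return all comments on a picture, oldest first."""
        return list(self._comments[photo_id])

board = PhotoBoard()
board.post_comment("cat.jpg", "alice", "Cute cat!")
board.post_comment("cat.jpg", "bob", "Agreed.")
print(board.comments_for("cat.jpg"))
```

Everything this sketch omits — identity, abuse handling, serving billions of users — is exactly the part that is not trivial, whatever one thinks of the “5 minutes” claim.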
I think medical advances will be big in 2019 and going forward. Bioelectronics – the integration of electronics with people – will spread. Think cochlear implants, neural stimulators to work around nerve damage, etc. And I think 2019 will be a watershed year for energy, with the first net-energy-producing nuclear fusion. Just knowing that nuclear fusion is feasible will produce big political changes too: it will take the wind (sorry for the pun) out of the Green New Deal, and hopeful US socialists will take a big hit.
Bob, I used to be your biggest fan. Now I agree with many on this board that you are “sucking wind” in terms of your understanding of technical futures in the “soon to be” 2020s. This article really had nothing to do with predictions and everything to do with “remember when”. And that is what people who aren’t active in the economy write about.
Thanks for the 1968 history lesson. And I disagree with you. I think things are actually “slowing down” in tech, and quite rapidly. This is a watershed year, not because of any new tech, but because it’s 1928 all over again. 50% of the companies we work with now will be bankrupt in 5 years. Just like with the railroads and automobiles, hardware and software are approaching “maturity” or “already mature”. I think there are plenty of ideas left. I just think there’s no money for ideas, and the money that is out there is chasing the wrong things and is getting concentrated in the hands of a few large companies buying up the tech landscape as ideas approach their Series B and Series C. (SAP, Oracle, Salesforce, Microsoft, Amazon, Google, IBM)
I also think we missed the 80 year “economic supercycle” depression in 2008, and we’ll make up for it in 2019/2020. A few trillion dollars “can” delay it 10 years, but it cannot ultimately stop it from happening. So I think the big changes are still 1-2 years out. We are already “fully saturated” in the debt department, and much of the money was spent on stupid things like companies buying back their own stock instead of tech innovation, and buying companies whose ideas are “allegedly” already working (like Uber and Lyft, as governments try to crush their “innovations”). At the individual level, we bought houses which were too expensive for our incomes, we bought $1000 smartphones and $100,000-200,000 college educations. AI was sold as the “answer to everything” (just like the last 3 times), and anyone who has been in the industry 10 years or more is smart enough to parse through the winners and losers. After the depression hits, if we don’t succumb to socialism, you will see some really amazing companies, well run, lean, and efficient, emerge out of the ashes of the 2019 economy, probably 2022 or 2023. Their costs will be significantly lower than today, and they will be customer-driven and market-driven, and I don’t mean the stock market. If we succumb to socialism, I’ll just go find a free country to live in, because like Venezuela, the US will be super poor, super fast.
You cannot let people vote themselves free stuff from government money. And you can’t tax the rich unreasonably. The rich will find the lowest tax place to operate with freedom and infrastructure, and relocate there. The US economy is prospering in early 2019 because Europe is sucking wind with the unmanageable debts of Italy, Greece, Spain, Portugal, France and now Germany. Many German and European banks are already broke. But we are right behind them, #5 in the debt to GDP ratio of developed countries.
Loans have to be for investment, not to let someone live their “lifestyle” to the next week. Pensions have to be saved, and if the money isn’t there, it won’t be. Success takes discipline in life, and keeping your word, and keeping up a discipline with EVERYTHING, your health, your work, your business, and your parenting. You can still love, but you have to TEACH too. And teaching isn’t easy sometimes, as it involves allowing negative consequences to happen, or possibly even providing them in some cases.
My guess is that you won’t have your future trends stuff done by July. I strongly recommend that you go ahead and write your final goodbyes and close this out. I will continue to check in monthly out of morbid curiosity, but it’s time to find your “retirement job” and take a nap every afternoon. I’m not far behind you either.
Take a good look at Kondratieff cycles, especially how they tie in with technological development. The microelectronic revolution is just taking off. The development of 3D circuitry will change the Moore’s law exponent from 2 to at least 3. A switch to something like self-assembling virus-scale components could cause a discontinuity in the curve as it jumps by a factor of millions or billions and then settles back down.
Change how you look at Moore’s law. I like to look at it as being a 1000 fold increase in transistors/device every 20 years. If you start at 1949-1950 as year zero you get to 1000 transistors/device in 1970. Yup, the start of the microprocessor.
1950 – 1 transistor/device
1970 – 1 000 tpd
1990 – 1 000 000 tpd
2010 – 1 000 000 000 tpd
2030 – 1 000 000 000 000 tpd
I first worked that out in the early ’80s when I was in grad school. I did not believe it then. Now I have no doubt that we ain’t seen nothing yet.
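The rule of thumb above is easy to express as a formula: roughly a 1000-fold increase in transistors per device every 20 years from a 1950 baseline. A quick sketch reproducing the commenter’s own table:

```python
# The comment's rule of thumb: transistors per device grow ~1000x every
# 20 years, starting from 1 transistor per device around 1950.
def transistors_per_device(year, base_year=1950, factor=1000, period=20):
    return round(factor ** ((year - base_year) / period))

# Reproduce the table from the comment
for y in range(1950, 2031, 20):
    print(y, transistors_per_device(y))
```

Since 1000 is approximately 2^10, a thousandfold every 20 years is the same as doubling roughly every 2 years, which is the familiar statement of Moore’s law.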
Ummm… someone will come along and say the Moore’s law ended years ago when processors stopped getting twice as fast every 2 years… Please read the original paper before you show your gullibility in public.
Re: Mitch Wright “…a low cost subscription social network. Apple?” Oxymoron.
Re: Chris too, February 27, 2019 at 8:17 am “… I think 2019 will be a watershed year for energy, with the first net energy producing nuclear fusion.” The H-bomb has existed for decades. Other than that, I’m curious about the basis of that prediction.
Ronc, make it an app for iThingies only…
The only really bad things I see happening are
1.) The rise of semi-totalitarian societies: not going full George Orwell, more like Aldous Huxley instead. Most people will be too distracted by social media, reality programming, online gaming, and soft entertainment that requires little to no deep thinking. It is happening right now all over the globe in real time, seemingly perpetrated by both Russia and China to weaken and keep in check Western democracies, many of which are currently weak, corrupt, and susceptible, to further their own economic and political agendas.
2.) An actual nuclear conflict, or several nuclear terrorist events. While I doubt the U.S./Russia/China will use any devices against each other, other nuclear players may feel they have nothing to lose. The big likelihood is Pakistan v. India, with Pakistan risking being first to strike. How it strikes depends on the situation and the reliability of its intelligence. If you’re going to attack another nuclear power, the best and first target would be where the missile silos are. The devices both countries have are not Hiroshima/Nagasaki devices, and the damage they can wreak will last decades, maybe centuries, not to mention the smoke and fallout drifting across the continent and the rest of the world, affecting growing seasons, crop yields, and the already extreme global climate crisis.
If not a nuclear conflict, then a series of terrorist attacks employing nuclear devices: either dirty bombs, which are ordinary explosives laced with radioactive material, affecting people in the immediate area of the explosion, or the detonation of one or more full nuclear bombs in major population centers or near military or industrial areas.
3.) The most inevitable is of course the extreme global climate crisis, which is happening now and must be arrested ASAP, else the next fifty years are going to be rough going for human beings ALL over, with water, food, and energy shortages, and that stuff is going to make it easier for the first two things on my list to follow. Unfortunately, the ultra-conservatives are either real blind idiots or perhaps they actually do believe the science and want the climate to get worse. When food and water and housing are scarce and order is upset, what better time for despots to rise and offer simplistic solutions to sway public opinion and take power. And that is when things shift from being Huxleyish to being Orwellian.
Re: Kevin Kunreuther, March 1, 2019 at 3:05 am: “offer simplistic solutions to sway public opinion and take power”. Any solution should be no more complex or simple than necessary to do the job. A simplistic solution to the scarcity of a necessity would be to legislate its price to make it “affordable”. It’s the reason for the failure of communism.
Re: Ronc – In 2014, Lockheed Martin said they would demo a nuclear fusion generator by the end of 2019. In 2018, when they published articles revealing patents they had filed for nuclear fusion generation, they repeated the claim. AFAIK, they have never slipped the date. LM is not the sort of company that makes vaporware claims.
“Evidence-based decision making” would be wonderful – if it ever happens in medicine and elsewhere! The medical industry (medical doctors) sit in their offices “seeing” several hundred “patients” a day – making decisions (making guesses) about what to do with Mr. X, Y, and Z’s condition. I have experienced this guessing very recently. I was using my cellphone, and suddenly there was a loud noise in my right ear (the ear the phone was against), which made my right ear useless for hearing – forever! I have been “examined” by 5 Ear, Nose, and Throat specialists who are also hearing specialists. All 5 gave absolutely different diagnoses!! I went to Google – put in my symptoms and voilà: Tensor Tympani Syndrome caused by acoustic shock! It’s a common condition of telephone receptionists and call centre workers! In my case, the noise appears to originate from the new telephone system installed by the company I was calling, which is interconnected to the internet. Noises, clicks, and growls are a common outcome of using any phone system hooked into the internet; how dumb (uncaring) does one have to be to misdesign these communication systems so badly that they are transformed into a serious permanent health hazard!
@Chris too
Lockheed Compact Reactor design is 100 times worse than initial claims
Something to consider in your predictions http://3seas.org/EAD-RFI-response.pdf
The tech software industry really doesn’t honor the Hacker Ethic today, and it’s gonna bite them in the butt with what they do with AI.
@GreenWyvern – I guess time will tell. If a net energy producing, sustainable reaction can be achieved, even if only in lab sized equipment, it will be a watershed moment for the energy industry. The aircraft industry went from a Wright flyer to a passenger jet in 51 years. Commercial fusion will make similar gains in the next half century.
“Ummm… someone will come along and say the Moore’s law ended years ago when processors stopped getting twice as fast every 2 years… Please read the original paper before you show your gullibility in public.”
This may be true and is often mentioned. However, whether it is Moore’s law or not, something definitely has changed, because CPUs _did_ tend to double in speed every 18-24 months over a very, very long time period, and this is definitely no longer the case. The first PC my family got was a 486SX 25MHz with 4 MB RAM for Christmas 1993 – a very common household PC at that time. Less than 5 years later – in 1998 – I bought my first own PC, a P2 300 MHz. Not only was the clock speed 12 times faster, but it was now a 3-way issue machine with many other performance improvements and 16 times the RAM. It’s an insane speed-up. The P2 was probably a bit higher-end than the 486 was when we bought it, so a fairer comparison might have been with a Pentium 233 MHz or something, but still you get the idea! And this trend continued on and on – in 2002 I got an Athlon XP 1700+ for much cheaper than the P2. Then in 2005 I got an AMD 3000+ machine, etc. etc. The increase was so incredible, and software kept pace, that a PC was almost completely obsoleted in 3-4 years. Of course it could run the software made when it was released, but forget about running the version of Windows or Office that came out 3-4 years after its release. But this trend has surely ended – compare a 5-year-old PC to one today. In practice, not much difference. You can run the same software and do most tasks at what feels like the same speed. There have been some improvements, and there’s now a shift to higher core counts. But remember, it was 10 years ago we got quad-core. A slow move towards 6-core and 8-core at the high end (and at the cost of reduced clock frequencies) isn’t that amazing.
So something has definitely changed, whether you call it Moore’s law or not.
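Taking the comment’s own data points (25 MHz at Christmas 1993, 300 MHz in 1998, roughly 4.5 years apart), the implied clock-speed doubling time is easy to work out. A rough sketch only, since clock speed is a crude proxy for overall performance:

```python
import math

# The comment's own data points: a 486SX at 25 MHz (late 1993) and a
# Pentium II at 300 MHz (1998), roughly 4.5 years apart.
old_mhz, new_mhz, years = 25.0, 300.0, 4.5

speedup = new_mhz / old_mhz        # 12x in clock speed alone
doublings = math.log2(speedup)     # ~3.58 doublings in that span
doubling_time = years / doublings  # years per clock-speed doubling

print(f"{speedup:.0f}x speedup, doubling every {doubling_time:.2f} years")
```

That works out to a clock doubling roughly every 15 months in the mid-90s, versus essentially flat clock speeds over the last decade — which is exactly the change this comment is pointing at, whatever we choose to call Moore’s law.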
I enjoy day dreaming about the future, so a few years ago I started a short survey of ~30 questions among friends, to guess the year of some future change. I won’t bore you with the details, but here are some long term changes that I think are possible, or even if they never happen will impact society:
– quantum computing (interpret this with a dose of realism)
– continued slow progress on fusion power
– health advancements towards artificial blood
– health advancements towards stem cell grown organs (towards a self organ transplant)
– further decline in the use of physical cash. (will the US ever go fully cashless, using a tech replacement of some kind?)
– further increase in assisted driving technologies (lane assist, collision warning, automatic parallel parking)
– increase in passive surveillance (facial recognition at airports and other travel hubs)
– increasing use of algorithms to determine outcomes in the justice system
– hopefully decrease in amount of bias in the above algorithms
I don’t know how much of the above will ever fully materialize, but I think this is the direction the future is headed.
(side note: esports is more than fantasy football, see: Dota, LoL, CS:GO, Fortnite, Overwatch)
“1968 saw the last newspapers set using Linotype machines. Offset printing was high tech after almost a century of Linotype.”
====
Actually, it was closer to the late 70s/80s. https://vimeo.com/127605643 The documentary “Farewell – ETAOIN SHRDLU” shows NYT converting over in ’78…but I suspect smaller printers had several more years of that into the 80s, due to high costs of printing equipment.
Interesting. I have been thinking more around a 60-year cycle. I have been predicting the coming decade (2020-2030) is going to bring more dramatic change than the decade of the 1960s (1960-1970).
You can widen the time frame a bit to see the larger effect, the edges of the normal distribution. If we look starting from late 1957, we see the first satellites and the rise of jet airplanes. By the end of the decade man walked on the Moon and the prototype Concorde SST and Boeing 747 Jumbo Jet had flown. The 747 entered passenger service in January of 1970, just 11 years and 9 months after Boeing’s first jetliner, the 707. Apollo 11 landed on the Moon just 11 years and 9 months after Sputnik 1. What is it about 11 years and 9 months?
Like the late 1950s, the late 2010s foreshadow the decade to come. Things like self-driving cars, AI, etc. are on the fringe now. Where will they be 11 years and 9 months from now?
Transportation could be radically different 11 years from now. Higher education could be radically different 11 years from now. Medical treatments could be radically different 11 years from now.
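The Sputnik-to-Apollo half of the “11 years and 9 months” coincidence is easy to check with simple date arithmetic, using the well-documented dates (Sputnik 1 launched October 4, 1957; Apollo 11 landed July 20, 1969):

```python
from datetime import date

sputnik_1 = date(1957, 10, 4)   # Sputnik 1 launch
apollo_11 = date(1969, 7, 20)   # Apollo 11 Moon landing

delta_days = (apollo_11 - sputnik_1).days
print(f"{delta_days} days, i.e. about {delta_days / 365.25:.1f} years")
```

That comes out to 11 years, 9 months, and 16 days — close enough to support that half of the parallel, though the 707-to-747 interval is a looser fit.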
Technology increases the speed of actions but does not change the speed of reception. As humans, our receptors – ears, eyes, etc. – are limited by human capacity. This is why AI and robots are corporate endeavors. Corporations were once concerned with customers and employees, but now profits must increase each year. How many of us talk to a machine on the phone? Such overreach will result in economic misery as the amount of money held by buyers and workers is reduced. An unintended consequence of technology is the invasion of privacy. Burn the books! Outrageous, right? Bookstores are closing everywhere. What about history? I don’t like the way that sounds! Edit it! So much for history! We will adapt, but at what price?