Their Ninebot e-scooters are pretty damn good, far better than most random brands.
I spent most of Covid in VRChat and met my current live-in gf, so the metaverse was real for me too.
I also made decent money selling crypto, so that part was real for me too.
And AI coding, as dumb as even the best models are, still enabled me to create things I wanted to but wouldn't have had the time for, or gotten nearly as far with, without it.
I dunno if the author realizes, but all the things they mentioned did materialize in one way or another, just not exactly how the hype described it.
Maybe if they could let go of some of the cynicism, they could find something to be optimistic about. Nothing ever goes exactly as planned, but that doesn't mean nothing is good.
I get that everyone has a strong opinion on whats-going-to-happen-with-AI, but I really think nobody knows.
We're in that part of turbulence where we don't know if the floating leaf is going to go left or right.
The people who will have the hardest time with this transition are those who go all in on a specific prediction and then discover they were wrong.
If you want to avoid that, you can try very very hard to just not be wrong, but as I said, I don't think that's possible.
Instead, we need to be flexible and surf the wave as it comes. Maybe AI fades away like VR. Or maybe it reshapes the world like the internet/smartphones. The hardest thing to do right now, when everyone is yelling, is to just wait and see what happens. But maybe that's the right thing to do.
[p.s.: None of this means don't try to influence events. If you've got a frontier model you've been working on, please try to steer us safely.]
"All of the above technologies are still chugging along in some form or other (well, OK, not Quibi). Some are vaguely useful and others are propped up by weirdo cultists. I don't doubt that AI will be a part of the future - but it is obviously just going to be one of many technologies in use.
> No enemies had ever taken Ankh-Morpork. Well technically they had, quite often; the city welcomed free-spending barbarian invaders, but somehow the puzzled raiders found, after a few days, that they didn't own their horses any more, and within a couple of months they were just another minority group with its own graffiti and food shops.
- Terry Pratchett's Faust Eric"
By the looks of it, 2026 might be the year where reality and fiction will finally collide with AI and we'll be able to see if all the hype was warranted.
But like all the previous hypes, most of the people who were loudest won't say they were wrong; they'll move on to the next thing, pretending they were never the ones who portrayed AI as the Holy Grail.
> and we'll be able to see if all the hype was warranted.
Umm, what? For the past 3 years, every year I've said something along the lines of "even if models stop improving now, we'll be working on this for years, finding new ways to use it and make cool stuff happen". The hype is already warranted. To have used these tools and not be hyped is simply denial at this point.
I think the point is that AI has to go much further and faster than it has in the past 3 years to justify the investments being made from the hype. The hype did its job; now the AI industry has to execute and create the returns they promised. That is still very much up in the air, and if they can't, then the tech was overhyped.
Maybe AI is useful to you, but the US economy is currently buoyed by promises of AI replacing the workforce across the board.
Most of the Mag-7 are planning to spend over $500B on capex this year alone building out datacenters for AI pipelines that have yet to prove they can generate a sustainable profit. Yes, AI is useful in some environments, but the current pricing is heavily subsidized. So my point stands: the hype is not warranted.
Everything is the same until it's not; good luck predicting when "until it's not" is on the horizon, though. Isn't technology innovation a power-law thing? Everything hums along fairly regularly and then, out of the blue, there's a massive impact. Personally, I think AI has made a pretty large impact in software dev and the overall tech industry, but I don't see AGI any time soon (and that hype has died down), and therefore I don't see the economics working out. The coding tools, API integrations, chatbots - those are great, but I don't see them producing the returns required to keep companies like OpenAI running unless OpenAI takes all the customers and all the ad clicks from everyone else (Anthropic, Alphabet, X, Amazon, Meta, even Microsoft). I just don't see that happening.
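That power-law intuition can be pictured with a toy simulation - purely illustrative, since the Pareto shape parameter below is an assumption, not data about real technologies:

```python
import random

random.seed(0)

# Toy illustration of "innovation impact follows a power law":
# draw 10,000 hypothetical innovations from a heavy-tailed Pareto
# distribution (alpha = 1.2, chosen arbitrarily) and check how much
# of the total impact the top 1% account for.
impacts = sorted((random.paretovariate(1.2) for _ in range(10_000)),
                 reverse=True)
share = sum(impacts[:100]) / sum(impacts)  # top 1% share of total
print(f"Top 1% of innovations hold {share:.0%} of total impact")
```

Under a distribution like this, most draws are unremarkable and a handful dominate the total, which is the "everything hums along, then massive impact" pattern described above.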
Perhaps this is the failure to understand the distinction between a technology and a meta-technology. Upgrading the factory that builds the robots is much different than upgrading the robots.
A technology is a set of methods and tools for achieving the desired results (generally in a reliable and reproducible way). Or, in a broader sense of the word, it's the idea of applying scientific knowledge to solving practical problems, and the process of such application.
Or (taking the other side) failure to notice the distinction between a technology and a pump-and-dump. The technology (attention/diffusion) is awesome. The hype is unbelievable. Literally.
What is the point being made here? Some past technologies were overhyped, therefore AI is overhyped? Well, some past consumer technologies did change the world (smartphones, texting, video streaming, dating apps, online shopping, etc), so where's the argument that AI doesn't belong to this second group?
Also, every single close friend of mine makes some use of LLMs, while none of them used any of the overhyped technologies listed. So you need an especially strong argument to group them together.
OP here. Unless you're still watching Quibi on your curved TV, delivered via WiMAX, then yeah, I'd say it was pretty bloody substantiated.
I like technology. I made a decent living from it. But if I had chased every hyped fad that was promised as the next big thing, I doubt I'd be as happy as I am now.
You're not really saying anything, though. For every tech hype that has failed, there is another that's changed the world. This IS changing the world and our industry, regardless of whether it reaches the heights of the hypers.
I mean, you're just stating that sometimes tech doesn't meet its hype. What's insightful about that? It's a given; cherry-picking examples doesn't prove your case.
The thing is, successful tech rarely gets the excessive hype.
mRNA vaccines. Where are the countless breathless articles about this literally life-saving tech? A few, maybe, but very few dudes pumping out asinine "white papers" and trying to ride the hype train.
Solar and battery. Again, lots of real world impact but remarkably few unhinged blowhards writing endless newsletters about how this changes everything.
I'm struggling to think of a tech from the last 20 years which has lived up to its hype.
Not everything is written to be insightful. Some things are just written to get them out of my head.
I personally see plenty of hype but I've also been following the trends and using the tools "on the ground". At least in terms of software these tools are a substantial shift. Will they replace developers? No idea, but their impacts are likely to be felt for a very long time. Their rate of improvement in programming is growing rapidly.
Do feel AI is overall just hype? When did you last try AI tools and what about their use made you conclude they will likely be forgotten or ignored by the mainstream?
I spent an hour with Gemini this morning trying to get instructions to compile a common open source tool for an uncommon platform.
It was an hour of pasting in error messages and getting back "Aha! Here's the final change you need to make!"
Underwhelming doesn't even begin to describe it.
But, even if I'm wrong, we were told that COBOL would make programming redundant. Then UML was going to accelerate development. Visual programming would mean no more mistakes.
All of them are in the coding mix somewhere, and I suspect LLMs will be.
That's the most recent time. But I've bounced around all the LLMs - they're all superficially amazing. But if you understand their output, they're often wrong in both subtle and catastrophic ways.
As I said, maybe I'm wrong. I hope you have fun using them.
It's not unsubstantiated, though. The claim is "people frequently assert that 'this time is different' and they are almost always wrong", and the post proceeds to provide a reasonable list of analogous manias.
This only doesn't feel like substantiation if you reject the notion that these cases are analogous.
"You shouldn't eat that."
"Why not?"
"Everyone else who's eaten it has either died or gotten really sick."
"But I'm different! Why should I listen to your unsubstantiated claims?"
"(lists names of prior victims)"
"That doesn't mean anything. I'm different. You're just making vague and dismissive unsubstantiated claims."
The claim isn't "AI bad"; the claim is more along the lines of "there's a lot of money changing hands and this has all the earmarks of a classic hype cycle; while attention/diffusion models may amount to something, the claims of their societal impacts are almost certainly being exaggerated by people with a financial stake in keeping the bubble inflated as long as possible, to pull in as many suckers as possible."
If you want another example (which you won't find analogous if you've already drunk the koolaid):
https://theblundervault.substack.com/p/the-segway-delusion-w...
I hoped the article would be a meta-discussion of "time" and perhaps relativity or some other phenomenon. Sigh, it's an investment thesis saying "This Time is Different" is a risky bet.
oh interesting, TIL I can go edit my submission titles! That's useful, I've definitely submitted stuff and gotten a less-good title due to the automated fixes, so I'll have to pay attention to this next time
What is meta-technology?
New things are happening and it's exciting. "AI bad" statements without examples feel very head-in-sand.
Well, no, the ratio is most definitely not 1-to-1.
> usage is copy pasting code back and forth with gemini
the jokes write themselves
> Not everything is written to be insightful. Some things are just written to get them out of my head.
I like that, going to use it as the motivation to get some things out of my own head.
Internet - this time is different
iPhone - this time is different
Love the Sir Terry reference.
Similarly to how titles that start with "how" usually have that word automatically removed.