Maybe with AI we can finally kill user-owned computing, and make almost everyone renters.
It's really wrong that the common people have access to things like PCs. It leaves a lot of money on the table the corporations can extract, and makes control much harder. PCs should cost at least as much as a car, so only the right people can afford them.
Those who earn their living from their labor, and those whose income derives simply from owning things they (often) didn't create themselves and charging for access.
It will be the same two classes there are now and always have been. Those who need to sell their labor and those they sell it to. Class struggle is the only way out. Find some solidarity, you aren't exempt.
PCs are also made by corporations, as are PC parts. The reason computing became so cheap over the last 50 years was competition between those corporations. Competition that is also pushing AI token prices down, and encouraging corporations to come up with models that can run on user hardware.
So what are you ranting against?!
> Own nothing and be happy.
Ah, here it is. Only governments can confiscate our property and force us into that. Governments and politicians that keep telling us how evil corporations are…
The “own nothing and be happy” quote is from a blog post made by the World Economic Forum. I find meta-governmental organisations even more troublesome, and you can’t vote them out.
It isn’t only conspiracy theorists who should be disturbed by whatever politico-corporate freemasonry that goes on in Davos.
AI bros and crypto bros. One and the same thing. Same optimism. Same arguments. Same blind faith. Same zero knowledge of how the economy, society, or even the technology they are evangelizing works under the hood, or of its shortcomings that are impossible to overcome because physics won't allow it.
You sound like a useless-eater manager. Just the kind of role we'll be happy to have in our future Utopia. The people will be happy to be led by visionaries such as yourself.
10-12 months ago I commented here that people are not realising AI is going to price us normal people out of computer hardware, and that we need China to actually reach parity on node size. Sadly, it looks like I was correct in my prediction.
At current prices, Chinese companies could even produce everything possible (~anything but current gen CPUs and GPUs) on slightly older nodes and make a stonking profit while lowering market prices.
After the recent run-up, where are prices on a per-performance basis? Back to 2019?
Computers were incredibly more expensive when I was growing up. People bought them anyway.
Is a computer that lasts 5-8 really productive years (and is still serviceable for another 5-7) and costs $1500 really a deal-breaker just because it was $1000 and on sale for $850 a year ago? Even if it doubled again, it still doesn’t price normal people out, IMO.
Who cares if Qualcomm owns Arduino. It has never been cheaper to get into embedded computing. You can buy Arduino-compatible STM32 Nucleo boards straight from STMicroelectronics for $15-20, and that's first party. If you're willing to buy third party clones there are boards on AliExpress for $10 or less.
Hobbyist equipment is still relatively cheap. You can get previous-gen hardware for formerly current-gen prices, you can run lots of “hobbyist” software on low RAM and no GPU.
Because they own nothing but make-believe stocks, and life works great for them.
The mega-rich are 100% decoupled from physical reality. May as well treat them more like tribal shamans, priests, preachers, and rabbis.
Just parroting memes the likewise idiot politicians believe are the magic chants that keep gravity itself pulling together the Earth.
"Omg he said the thing! Cut his taxes! Give him welfare!"
Our generation of leaders was raised in a pre-science, pre-information world. They rely entirely on cult of personality, as their meat suit never sees itself engage in the labor it relies on to live. It's intuitively well aware of how fucked it is. Must continue to stand in the pulpit!
I think treating them as fae, vampires, or demons is sort of insulting. Those creatures are at least bound by supernatural laws and can be negotiated with in some way.
Nah. The first two thirds of the 20th century was the science and information world. Man gained mastery of the skies, the depths of the sea, the void of space, the atom. We were taming diseases and found a way to end hunger. We started building thinking machines. We were playing with the fire of the gods. Science was working miracles on the daily.
It still is, but nobody gives a shit anymore, we are in the financialization and rent-seeker world now.
But in a way I do agree with you; I doubt it is as organized as you imply. Yes, companies and governments do not want anyone on a General Computing Device at all. They want to see exactly what content you are viewing and responding to.
Microsoft and Apple have been slowly adding various forms of spyware and locking down what applications you can use. And cell phones? Those are the Holy Grail of what Microsoft and Apple want to move your laptop/PC toward.
Right now Linux and the BSDs are the only games in town for non-spyware systems. But the new age-verification laws seem to be a first attempt to lock down even Linux :( Since the Linux Foundation is owned by large corporations, I feel that attempt will succeed. The BSDs? For now they seem to be flying under the radar.
Why do you doubt this when the rich also have Signal? They meet and talk out of view? The insider trading coming out of Washington?
Why, when emails from discovery in the labor disputes between Google and Apple in the 2010s revealed they engage in exactly the sort of manipulation you disbelieve?
The PC is the last major open platform. While other platforms like Android are becoming less open, the PC in general is becoming more open than it's been in a long time: heavy macOS/Android/iOS competition is creating a focus on open standards, and all-time-high Linux support gives people a place to land and tinker/hack to their heart's content.
I think we will see an abandonment of consumer-grade PC components, with individuals either pushed towards closed hardware like PlayStation, MacBooks, and Android devices, or pushed towards server-grade components. I already have a home server rack, and would recommend it to other people.
Not just motherboards. Cases, PC accessories (fans, etc), consumer SSDs, and more. Cases are especially hard hit, apparently, as they're already quite a low margin business.
Personally, I see little reason to upgrade from my AM4 platform. It's never been easier to hold on to aging hardware with the advent of DLSS stretching older cards further, diminishing returns on the newer gen GPUs, and the 'realism' of video games plateauing.
I invested quite a bit in enterprise-level homelab equipment from 2020 to 2025 (about 10k). Happy I made it before the big bang. E.g. my SAS He8 drives will last at least till 2035. But what then? I want my children to be free, too.
Last year I said I should have upgraded my 1060 last year.
I bought it second hand 7 years ago and it is still the same price.
I don't do much gaming, and it runs Immich / etc light inference just fine. One thing I don't regret is getting 32GB of DDR4 when I built the system around the time of the GPU upgrade.
Agreed. I build a system every ten years and I've got 6 years to go. AM4 works great, and I've managed to hoard enough ram and drives to hopefully cover any concerns for the next 4 years. Things work, they are stable, and I feel super lucky for that.
I'm sure the AI shortages are hurting, but also I'm still using my same motherboard from 2020 and I see no reason why I should have to upgrade in the next 2-3 years (whenever I buy my RTX 7070Ti, it might be time, but maybe not even then).
High end resins and epoxies are in a critical supply shortage right now. I suspect that there are going to be some serious resource driven PCB shortages in the very near future.
I assume manufacturers were making enough motherboards in 2025 to fulfill demand, so what happens when the demand is the same but the production is 25% less? Crazy.
I was looking into self-hosting deepseek v4 pro, since frankly cache reads are an absolute scam and they're 90% of the cost. But then I looked at the ROI, and it will never pay off fast enough, because the hardware will become obsolete before it pays for itself even if you were running 10 token-generation streams 24/7.
The napkin math came out to renting being around 27 times cheaper than owning (not including power). I think we're really screwed when it comes to having owned access to AI, unless Intel comes out swinging with a C-series card that has 128GB of VRAM so we can run these models in a 4x128GB configuration. But that seems unlikely, since Nvidia has a large share in them.
This was calculated assuming around 30 tok/s; of course you can get 2-5 tok/s much, much cheaper, but that's unusable for my workflow.
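As a sketch, the kind of napkin math described above might look like this. Every constant is a placeholder assumption (hypothetical hardware cost, useful life, hosted price), not a figure from the comment, so the resulting ratio will differ from the 27x quoted:

```python
# Rent-vs-own napkin math for self-hosted inference.
# All constants below are illustrative assumptions, not real quotes.

HARDWARE_COST_USD = 40_000     # hypothetical rig able to serve the model
USEFUL_LIFE_YEARS = 3          # assume obsolescence before physical failure
TOKENS_PER_SECOND = 30         # per-stream throughput target, as in the comment
STREAMS = 10                   # parallel generation streams, running 24/7
RENTED_PRICE_PER_MTOK = 0.50   # hypothetical hosted price, $ per million tokens

seconds_per_year = 365 * 24 * 3600
lifetime_mtok = TOKENS_PER_SECOND * STREAMS * seconds_per_year * USEFUL_LIFE_YEARS / 1e6

# Amortized cost per million tokens if you own the hardware (power excluded).
own_price_per_mtok = HARDWARE_COST_USD / lifetime_mtok
ratio = own_price_per_mtok / RENTED_PRICE_PER_MTOK

print(f"owning: ${own_price_per_mtok:.2f}/Mtok, {ratio:.1f}x the rented price")
```

With these placeholder numbers owning comes out a few times more expensive than renting; plugging in real hardware and API prices is what produces figures like the 27x above.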
Ironically, the few people not scamming you on cache reads are DeepSeek. Everyone else charges a ridiculous amount, but DeepSeek's API is $0.003625 / M tok.
I'm surprised no one talks about this, given how significant it is. GPT 5.5, for example, costs a ridiculous $0.50 / M tok cached. DeepSeek is literally almost 140 times cheaper, which matters a lot for tool calls.
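Taking the two quoted prices at face value (not verified as current), the ratio checks out:

```python
# Cached-token price comparison, using the figures quoted above.
gpt_cached_per_mtok = 0.50           # $/M tok cached, as quoted
deepseek_cached_per_mtok = 0.003625  # $/M tok, as quoted

ratio = gpt_cached_per_mtok / deepseek_cached_per_mtok
print(f"{ratio:.0f}x")  # ~138x, i.e. "almost 140 times" cheaper
```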
It's not just new hardware; even used hard drives manufactured a decade ago have at least doubled in price. Scam Altman has effectively killed personal computing for all but the most affluent.
I know it's going to be extremely painful, but the sooner this ridiculous unsustainable AI bubble pops the better off we'll be. The more it inflates the more collateral damage it will cause, and we're probably already looking at 2008 levels of financial chaos.
Will demand for computing ever go down from where it is now? Even if the AI bubble temporarily pops, in the long run I think the demand for computers will be practically infinite.
Market forces will probably bring the price of hardware down in the next decade. Whether it is in a form that is useful for regular people/hobbyists is another question. If not, then hopefully the "cloud" starts to look a lot different.
I think it's possible (10-15%) that the AI bubble pops and we all live without 50M token/day OpenClaw installs and running Opus to do things that should have been done by a shell script to the point that it causes a dip in total compute demand. I think it's likely (75% likely if the AI bubble pop causes a dip in compute demand) that this dip extends longer than the median lifespan of the hardware currently being installed in datacenter.
Of course in 20 years we'll be using more compute than today (99% likely).
EDIT: Of course cryptocurrencies provide a floor on compute pricing.
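Chaining the probabilities stated above (the numbers are purely the commenter's own estimates, reused here for illustration), the overall chance that a bubble pop both dips total compute demand and outlasts today's datacenter hardware comes out to:

```python
# Combine the commenter's stated probabilities.
p_pop_causes_dip = (0.10, 0.15)  # "possible (10-15%)"
p_dip_outlasts_hw = 0.75         # "75% likely ... if the pop causes a dip"

joint = tuple(p * p_dip_outlasts_hw for p in p_pop_causes_dip)
print(joint)  # roughly a 7.5% to 11% overall chance
```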
Shortage of RAM and SSDs, and soon, CPUs. Motherboards aren’t selling because there’s no point buying a motherboard if you can’t buy the RAM or SSD it needs.
It’s brutal. I’ve just built a workstation with DDR4 and a two-generations-old CPU. I paid more for the DDR4 than it originally cost four years ago. The same amount of RAM for the latest motherboard would have been 10x ($10,000). So used DDR4 has gone through the roof, which hurts hobbyists who used to rely on “hand-me-downs”.
15 months ago I saw the writing on the wall on several fronts. I suggested my community commit to their buys/builds ASAP and be forward-looking, before things changed.
My high-end HEDT build would now cost $2300 more, mostly due to memory and SSD pricing. 96GB of memory going from $430 -> $1800 is wild. One community member literally wouldn’t be able to buy their Mac Mini configuration anymore, plus the self-upgrade SSD would be price-hiked.
Where I blanch most is my storage server running TrueNAS. Built it 3.5 years ago with future-proofing in mind: a strong SSD cache layer, plus two HDDs as spares. It wasn’t cheap then, but I think between disks, storage, ECC memory, etc. it’s +$7000 to rebuild it again now, or +$9000-$10000 on last-generation hardware.
Smaller manufacturers will fold, and larger ones will leave the consumer market (like Micron/Crucial did), before the market has a chance to bounce back. If and when it does recover, it will be a market of much fewer choices.
A somewhat comparable historical example is the destruction of the Swiss watch industry in the 70s with the advent of quartz and digital watches.
A Rolex Daytona today is known as a very fancy and even hard-to-get watch. In the 70s they were practically giving them away with other watch purchases because electronic watches were taking their lunch.
The bigger takeaway, I think, is that the destruction and folding eventually led to the Swatch Group. People forget Rolex, Omega, et al. were tool watches that were expensive but fairly attainable. Even into the 90s you could walk into a Rolex store and walk out with the watch you wanted. Nowadays you basically have to buy a watch to prove you're good enough to get the one you want.
I foresee a similar thing happening with computing hardware. There will be a small, high-end side industry for non-datacenter customers.
The "digital watch" user will be renting time on a thin client through a datacenter provider. The wealthy or high-status user will be able to purchase the expensive boutique home computing hardware they want.
Even if volume and hype decrease among the general population, there doesn't seem to be much of a cap on model requirements -- so at least one sector will be pushed into purchases one way or another.
The brief window between the covid gaming bubble pop/PoS ETH switch and the AI hardware blackhole will be fondly remembered as the last golden age of consumer PC hardware accessibility.
If China keeps releasing decent copies of SOTA models that only take 20% of the resources, then we may get some relief when those models become "good enough".
>copies of SOTA models that only take 20% of the resources
They might be 20% of the price (because they don't have to invest as much in training), but are probably not 20% of the resources (i.e. inference), considering they take more tokens to do the same task and have slower inference speeds.
I've been using deepseek and it's good enough for my personal use. It takes way more time/tokens/course-correcting to get things done, but I spend in a month what I spend in a day with opus 4.6
"Fueled by greed." It would be trivial to say no to the AI companies, because dollars are dollars and it doesn't matter who pays them. Prioritizing literally all of humanity over "five companies" is a choice every single supplier could have made, but didn't. This problem was 100% manufactured by the suppliers.
That confident "will" in the prognosis may ultimately stimulate a consensus "why?" response in the population, and a push to explore alternative outcomes...
I spent the last half a century making sure they have no leverage and I am not interested in being coerced.
It's called security.
... Do you want corporations to have that power too or something? What are you saying here?
Maybe we'll get a chinese hardware black market.
That has never before happened in the history of computing, and it violates long-held, fundamental assumptions.
It’s bad, but it’s not “literally own nothing”.
people will own an increasing number of dumb terminals connected to rented services.
does that reduce the number of computers? well, no..
so, imo: the trick isn't to reduce physical ownership of devices, the trick is to make it so that you need Big Iron in order to do anything.
One way that might be achieved is by forming social and cultural dependence on models so large that no one individual could possibly run them...
Why associate them with roles that have a degree of positive association and human connection?
Treating them as faeries, vampires, or demons seems more accurate.
Now we are just playing with fire.
It may sound like pseudo-Buddhist claptrap, but it's also true. Or, I suppose, Fight Club claptrap. It's still true.
The choice is "do you want to participate in society, its benefits and drawbacks". You can't have only one side of that.
Why did we listen to the Worldcoin guy again?
https://x.com/scaling01/status/2050616057191072161