People continue to think about #AI in terms of #2010s computing, which is part of the reason everyone gets it wrong whether they're #antiAI or #tech bros.
Look, we had 8GB of #ram as the standard for a decade. The standard was set in 2014, and in 2015 #AlphaGo beat a human at #Go.
Why? Because #hardware lags #software - in #economic terms: supply follows demand, but demand cannot create its own supply overnight.
It takes 3 years for a new chip to go through the #technological readiness levels and be released.
It takes 5 years for a new #chip architecture. E.g. AMD's #Zen architecture was conceived in 2012 and released in 2017.
It takes 10 years for a new type of technology, like a #GPU.
Now, AlphaGo needed a lot of RAM, so why did RAM stagnate for a decade after doubling every two years before that?
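Just to put numbers on that stagnation (a toy counterfactual, my own arithmetic, nothing more): here's what the 8GB standard would have become if the doubling had simply continued.

```python
# Toy counterfactual: if the 8 GB standard of 2014 had kept doubling
# every two years, where would a typical machine be by 2024?
ram_gb, year = 8, 2014
while year < 2024:
    year += 2
    ram_gb *= 2
    print(f"{year}: {ram_gb} GB")
# 2016: 16 GB ... 2024: 256 GB. Instead, 8 GB stayed the default for a decade.
```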
In 2007 the #iPhone was released. #Computers were all becoming smaller, #energy #efficiency was becoming paramount, and everything was moving to the #cloud.
In 2017, most people used their computer for a few applications and a web browser. But that same year, companies were starting to build #technology for AI, as it was becoming increasingly important.
Five years after that, we're in the #pandemic lockdowns: people are buying more powerful computers, we have #LLMs, and companies are beginning to jack up the cost of cloud services.
#Apple releases chips with large amounts of unified #memory, #ChatGPT starts to break the internet, and by 2025 GPU growth continues to outpace CPU growth and Apple's unified memory finally has a competitor.
The era of cloud computing and surfing the #web is dead.
The hype that multi-trillion-parameter #LLMs will get us to #AGI is a fantasy. There isn't enough power to do that, there aren't enough chips, and it's already too expensive.
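Some napkin math on why (my own assumed parameter counts and precisions, not anyone's actual model specs):

```python
# Rough sketch: memory needed just to HOLD the weights of a hypothetical
# 2-trillion-parameter model, at a few common precisions.
PARAMS = 2e12  # assumed multi-trillion-parameter model

for name, bytes_per_param in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    terabytes = PARAMS * bytes_per_param / 1e12
    print(f"{name}: ~{terabytes:.1f} TB for weights alone")

# fp16: ~4.0 TB, int8: ~2.0 TB, int4: ~1.0 TB - and that's before activations
# or the KV cache, spread across racks of GPUs that each cost tens of thousands.
```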
What _is_ coming is AI tech that performs well and runs locally without the cloud. AI tech is _not_ just chatbots and #aiart. It's going to change what you can do with your #computer.
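This isn't hypothetical - here's roughly what local AI looks like today. A minimal sketch, assuming the llama-cpp-python package and a quantized GGUF model you've already downloaded; the file name is a placeholder:

```python
# Minimal local-inference sketch: everything below runs on your own machine.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example-7b-q4.gguf",  # placeholder: any quantized GGUF model
    n_ctx=2048,                                # modest context window, laptop-friendly
)

out = llm(
    "Explain in one sentence why hardware lags software.",
    max_tokens=64,
)
print(out["choices"][0]["text"])  # generated entirely offline, no cloud API involved
```

On a machine with enough unified memory, that's already a usable assistant with no subscription and no data leaving your desk.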