Majority of AI Researchers Say Tech Industry Is Pouring Billions Into a Dead End
I liked generative AI more when it was just a funny novelty and not being advertised to everyone under the false pretenses of being smart and useful. Its architecture is incompatible with actual intelligence, and anyone who thinks otherwise is just fooling themselves. (It does make an alright autocomplete though).
The peak of AI for me was generating images of Muppet versions of the Breaking Bad cast; it's been downhill ever since.
Like all the previous scammy bubbles that were kinda interesting or fun as a novelty, until the money came pouring in and it became absolute chaos and maddening.
It peaked when it was good enough to generate short, somewhat coherent phrases. We'd make it generate ideas for silly things and laugh at how ridiculous the results were.
AGI models will enter the market in under 5 years according to experts and scientists.
trust me bro, we're almost there, we just need another data center and a few billion more, it's coming i promise, we are testing incredible things internally, can't wait to show you!
We are seeing massive exponential increases in output with all sorts of innovations; every few weeks another big step forward happens.
Around a year ago I bet a friend $100 that we won't have AGI by 2029, and I'd make the same bet today. LLMs are nothing more than fancy predictive text and are incapable of thinking or reasoning. We burn through immense amounts of compute and terabytes of data to train them, then stick them together in a convoluted mess, only to end up with something that's still dumber than the average human. In comparison, humans are "trained" with maybe ten thousand "tokens" and ten megajoules of energy a day for a decade or two, and take only a couple dozen watts for even the most complex thinking.
Humans are “trained” with maybe ten thousand “tokens” per day
Uhhh... you may wanna rerun those numbers.
It's waaaaaaaay more than that lol.
and take only a couple dozen watts for even the most complex thinking
Mate's literally got smoke coming out of his ears lol.
A single Wh is 860 calories...
I think you either have no idea wtf you are talking about, or you just made up a bunch of extremely wrong numbers to try and look smart.
…Wh, it also is done in weeks, so it's obviously going to be way less energy efficient due to the exponential laws of resistance. If we grew a functional human in like 2 months it'd prolly require way WAY more than 13,000 Wh during the process for similar reasons.

True, my estimate for tokens may have been a bit low. Assuming a 7 hour school day where someone talks at 5 tokens/sec you'd encounter about 120k tokens. You're off by 3 orders of magnitude on your energy consumption though; 1 watt-hour is 0.86 food Calories (kcal).
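For anyone who wants to check the back-of-envelope arithmetic being traded above, here's a quick sketch in Python. The speech rate, school-day length, and the "ten megajoules a day" figure are just the rough assumptions the commenters are using, not measured values.

```python
# Back-of-envelope sketch of the numbers argued over in this thread.
# All inputs are the commenters' rough assumptions, not measurements.

SECONDS_PER_HOUR = 3600

# Tokens heard in a 7-hour school day at an assumed ~5 tokens/second of speech
tokens_per_day = 7 * SECONDS_PER_HOUR * 5
print(f"tokens per school day: {tokens_per_day:,}")      # 126,000 (~120k)

# Energy unit conversion: 1 Wh = 3600 J, 1 food Calorie (kcal) = 4184 J
wh_in_kcal = 3600 / 4184
print(f"1 Wh = {wh_in_kcal:.2f} kcal")                    # ~0.86 kcal, not 860

# The "ten megajoules of energy a day" claim, expressed in kWh
kwh_per_day = 10e6 / 3.6e6
print(f"10 MJ/day = {kwh_per_day:.2f} kWh/day")           # ~2.8 kWh/day

# Spread over "a decade or two" of growing up
for years in (10, 20):
    total_kwh = kwh_per_day * 365 * years
    print(f"{years} years of that: ~{total_kwh:,.0f} kWh")
```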