Grumbles about generative AI's shortcomings are coalescing into a "trough of disillusionment" after a year and a half of hype about ChatGPT and other bots.

Why it matters: AI is still changing the world — but improving and integrating the technology is raising harder and more complex questions than first envisioned, and no chatbot has the magic answers.

Driving the news: The hurdles range from embarrassing errors, such as extra fingers or Black founding fathers in generated images, to significant concerns about intellectual property infringement, cost, environmental impact and other issues.
It's only a matter of time 'til the "AI" bubble really pops and all those tech companies that fired too much of their workforce have to start hiring back like crazy.
While there are some bubbles that need popping, especially in boardrooms, I work for a large tech company that has not fired anyone because of AI. Rather the opposite: we have been expanding our AI team over the last 5+ years and have delivered successful AI products. There is a lot more to AI than ChatGPT, which, while impressive as a proof of concept, is not actually useful to business.
I'm so skeptical that AI will deliver large-scale economic value.
The current boom is essentially fueled by free money. VCs pump billions into start-ups, while more established companies get billions in subsidies or get their customers to pay outrageous amounts on the strength of promises. Still, I have yet to see a single AI product that is worth the hassle. The results are either not that good or way too expensive, and if you couldn't rely on open models paid for by VC money, you wouldn't be able to get anything off the ground.
It's the same at my employer, which has wasted untold thousands on subscriptions to ChatGPT and Copilot, and all we've gotten out of it so far is a script that takes in transaction data and spits out "customer loyalty recommendations"... as if we don't already have a marketing department for that. XD
I think most people don't understand the one fundamental thing about AI: ChatGPT, DALL-E and whatnot are just products produced by machine learning, not AI themselves. Machine learning is already doing a lot of work for science, and it's utterly unthinkable not to utilize it in fields like chemistry, for example. We only read about media-producing LLMs because we just consume so damn much media. Maybe that's something we should think about.