The World of Warcraft subreddit recently realized that a website, zleague.gg (I am not linking to it), which runs a blog attached to some sort of gaming app that is its main business, has been scraping Reddit threads, feeding them through an AI, and summarizing them.
A website called zleague.gg has been scraping Reddit threads, feeding them into an AI and publishing auto-generated summaries without proper oversight. World of Warcraft players on Reddit noticed this and created a fake thread about a made-up feature called Glorbo to trick the AI. The AI then published an article summarizing the fake Glorbo thread, showing that it was easily fooled. This highlights issues with AI-generated content crowding out human writers and the need for Google to better regulate such sites to ensure quality. The Glorbo prank provides an amusing example of how gaming communities can push back against AI overreach.
“This highlights issues with AI-generated content crowding out human writers and the need for Google to better regulate such sites to ensure quality.”
Can agree with the first part of that statement, but Google regulating…feels like handing a fox the keys to the hen house.
Regulation of industry should be coming from our legislators, but they’re too busy polarizing their base, abasing themselves to donors and gilding their own cages to care.
Apologies for the rant, but this whole “AI” imbroglio is a textbook case of the cart put before the horse whilst heading downhill.
Yes, exactly. Google is causing this problem by making a way for this crap to be monetized, and driving the human eyeballs to it. The solution is not to further enable Google as a gatekeeper to information, but to simply replace them.
Regulation should come from the industry, not the legislators. Legislators don't know enough about it and will end up just getting the biggest players to write it anyway.
But if the industry does it, certification would be voluntary and it would be transparent who wrote the regulation. Much easier for smaller players to contribute and shape it.
And the best part is that if it sucks, they don't have to participate. And then they can try again
In a perfect world maybe it would work that way, but I’ve seen too much enshittification via vertical integration or ego-driven CEO bros to have anything but skepticism for industry regulation, self-regulation in particular.
Many of the issues seen out of the US are because their approach to legislation is industry-led. You really do not want lobbyists and special interests writing legislation. You also don't want politicians drafting it either. In other countries, you have a strong non-partisan civil service that can carry out consultations and work out the nuts and bolts. Elected politicians should be kept to making the higher-level decisions on what to look at and what general direction to move in.
The issue with Google de-emphasizing bad AI in search results is that they aren't impartial. I sort of want them to do it anyway, but I also don't trust them, because they're competing in AI content themselves.
Reddit user kaefer_kriegerin expresses their excitement, stating, ‘Honestly, this new feature makes me so happy! I just really want some major bot operated news websites to publish an article about this.’
This feels like the early days of Google, when it was basically a cat-and-mouse game between their page-ranking algorithm and website creators trying to game their page rank. It still happens, but it's less obvious and less easy.
The best part of this is that the AI could not come up with original content. All it can do is repeat what humans already output. I would say games journalism is safe from an AI takeover. Now we just need to get rid of the bot spam.