Garbage In/Garbage Out: Some signs of AI model collapse begin to reveal themselves
"Maybe if Google gets more AI in their search it'll be better" is batshit insane.
"I use AI a lot for searching, but lately it's giving me sources that don't exist" is also kinda nuts. Why would you deliberately use a resource that is going to lie, just not predicably?
Why would you deliberately use a resource that is going to lie, just not predicably?
I can give a real-life answer to that. I was working on an assignment where keyword searching journal databases was not really helping, because while the primary keywords have specific meanings in the context I intended, the words themselves have other common meanings and uses. So I had to weed through a ton of articles in order to get just a few useful ones. Asking ChatGPT to give me 10 peer-reviewed journal articles about X and Y topic would return maybe three or four real articles, but it took me a lot less time to pick out the real ones from a set of 10 than to locate three on-topic articles from, say, 100 search results.
The above was using the public version of ChatGPT. Partway through the semester, my school got ChatGPT Edu. Interestingly, the Edu version, given the same prompt, did not return fake articles, but it included somewhat related, not-quite-relevant ones. In either case (public or Edu), I still had to check all 10 results, but that was still less time-consuming than trying to search on my own.
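For what it's worth, that "check all 10 results" step lends itself to a quick script. Here's a rough sketch (not what I actually did; the candidate titles are made up, and the matching is deliberately crude) that asks Crossref's public search API whether anything close to a claimed title actually exists:

```python
"""Rough sketch: screen LLM-suggested citations against Crossref's public index.
A fuzzy title match is only a hint that the article exists, not proof it's on topic."""
import requests

CROSSREF = "https://api.crossref.org/works"

def looks_real(title: str, threshold: float = 0.5) -> bool:
    """Return True if Crossref has a record whose title shares enough words with the claim."""
    resp = requests.get(
        CROSSREF,
        params={"query.bibliographic": title, "rows": 3},
        timeout=10,
    )
    resp.raise_for_status()
    claimed = set(title.lower().split())
    for item in resp.json()["message"]["items"]:
        found = set(" ".join(item.get("title", [])).lower().split())
        # crude overlap check: what fraction of the claimed title's words show up
        if claimed and len(claimed & found) / len(claimed) >= threshold:
            return True
    return False

if __name__ == "__main__":
    candidates = [
        "Example Title One: A Study of X and Y",  # placeholder, not a real citation
        "Example Title Two: Y Reconsidered",      # placeholder, not a real citation
    ]
    for t in candidates:
        print("likely real" if looks_real(t) else "no close match", "-", t)
```

You'd still have to read whatever survives the filter, since a real article can be just as off-topic as a fake one.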
I think what Optional is getting at is that "running an initial wide search then checking the specific sources" is not an improvement on the current system.
But you knew you had to check them first!