firefox also isn't immune
I don't mind seeing an AI summary of search results as much as I mind sponsored links fucking up page rank. Sometimes it is even nice to see "hey your search doesn't make sense because you've conflated two terms". But I guess I'm in the minority.
Reminds me of early Wikipedia when there was a deep trustworthiness problem. Citing a Wikipedia link in a presentation cost you credibility, but it was still a hell of a lot better starting point than grabbing an encyclopedia and asking Jeeves until you found a thread to pull.
AI summaries put another layer of interpretation between the reader and the source material. When having accurate and properly-sourced information matters, it's just not trustworthy enough. At least Wikipedia tells you when material is potentially biased or improperly sourced. Search AI will confidently assert its summaries as though they are factual, regardless of how reliable or unreliable its own sources are.
So long as the citations are there I'm not usually taking the summary at its word. I find searching "hard to Google" terms easier with AI.
When having accurate and properly sourced material matters, I hope you're not trusting the descriptions of citations laid out by wikipedia editors who are also just another layer of interpretation. It's always worth a double check.
I've been an editor on Wikipedia for decades now. I've followed sources to clarify information, fix broken links, and remove inaccurate information. I know how it works.
It’s always worth a double check.
That's exactly my point. Wikipedia is transparent about where it gets its information. You can double-check citations, and if the citations don't exist or don't support a relevant claim, you can discard them (or edit them to flag that fact, or go above and beyond to provide a new source, if you're so willing). With AI summaries, you can't do any of that. You're given a summation without automatic citations (or sometimes, with bogus made-up ones), and you can't do anything to correct any misinformation you encounter. Maybe you can report it, but you can't do anything in real time to prevent others from finding that same inaccurate information, not the way you can immediately correct an inaccuracy on Wikipedia.
Same. But now this is a different topic.
For something like Perplexity under Brave, where you're given inline citations, yeah, go follow them and get to an authoritative source faster.
We didn't start with "I can't submit an updated review if I find mistakes", we started at "there's another unnecessary layer of indirection". Which, sure, but it's hardly different than getting a start from a Medium "best xxx of 2025" article or, yes, a Wikipedia page. It may not be to your taste, but I've had some occasions where it's convenient.
They make them up, and they don't source them properly.
Go ask perplexity.ai a programming or troubleshooting question and then follow a cited link. I assure you they are not all made up.
So long as the citations are there
AI fabricates citations.
Not every citation is fake or irrelevant. On Wikipedia it's "citation needed" or "page does not exist". Same problems.
All you have to do is click it or search again.
But hey, if you prefer the old-fashioned way of opening every returned search result from page 1 to page 6 until you just search again anyway, go ahead and do that. I'll deal with sifting through occasional bad advice in an eighth of the time.
This ^.
I think people forget the fabled "old" internet was actually a pile of trolls where you had to double-check everything you read.
Basic sanity checks really aren't that hard. But it's a forgotten habit, I guess.
"oh my god, AI makes shit up!"
I've never had a result that helpful. I've seen it make up sports results in advance though.
I suppose I'm mostly using it for programming, movie look up, vocab, and so on. Not sports/weather/news kinds of things.