Vivid example of how Google Search AI Overviews give conflicting,
worse-than-useless answers

Here's a vivid example of how badly Google Search AI Overviews can
screw up a simple question. I asked it twice, with almost identical
wording, whether Vasquez Rocks is in the TMZ -- the Hollywood Thirty
Mile Zone. In one case it definitively said yes. In the other, it just
as definitively said no! Note that after saying no in the main answer,
the supporting text below it contradicts that. It can't even keep its
answer straight within a single response!

And this was a harmless question. What happens when someone asks about
something that really matters and gets this treatment? AI Overviews
are worse than useless because you CANNOT TRUST THE ANSWERS! All the
Google disclaimers in the galaxy won't fix that.

Screenshots: https://mastodon.laurenweinstein.org/@lauren/113351685382748787

L

- - -
--Lauren--
Lauren Weinstein [email protected] (https://www.vortex.com/lauren)
Lauren's Blog: https://lauren.vortex.com
Mastodon: https://mastodon.laurenweinstein.org/@lauren
Founder: Network Neutrality Squad: https://www.nnsquad.org
        PRIVACY Forum: https://www.vortex.com/privacy-info
Co-Founder: People For Internet Responsibility
