realistically though, I guess hooking up “google this for me” to an LLM, while a cute trick, is basically a “generate falsehoods machine”, so it’s not *that* practical
I wonder if you could like… hook it up to a search engine and tell it “everything must be backed up by at least two sources”. would that actually have any effect
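like, the rough shape of what I mean (just a sketch, and `web_search` / `ask_llm` here are made-up placeholders for whatever search API and model you’d actually plug in):

```python
# very rough sketch of the "back everything up with at least two sources" idea.
# web_search() and ask_llm() are made-up stand-ins, not any real API.

def web_search(query):
    # pretend this hits a search engine and returns (title, url, snippet) tuples
    return [
        ("Example result one", "https://example.com/1", "some snippet..."),
        ("Example result two", "https://example.com/2", "another snippet..."),
    ]

def ask_llm(prompt):
    # pretend this sends the prompt to whatever model you're using
    return "..."

def answer_with_sources(question):
    results = web_search(question)
    sources = "\n".join(
        f"[{i}] {title} ({url}): {snippet}"
        for i, (title, url, snippet) in enumerate(results, start=1)
    )
    prompt = (
        "Answer the question using ONLY the numbered sources below.\n"
        "Every claim must cite at least two sources by number, e.g. [1][2].\n"
        "If the sources don't support an answer, say you don't know.\n\n"
        f"Sources:\n{sources}\n\nQuestion: {question}"
    )
    # the catch: nothing here checks that the cited sources actually say
    # what the answer claims, so the citations can still be decorative
    return ask_llm(prompt)
```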
@gardevoir like, asking for “sources” might get it to put in links or citations, but it’ll regularly just make them up
@noiob lmfao yeah actually you’re probably right
it *would* probably just generate citations to fake sources