End of hallucinations? How Vancouver AI firms achieve accuracy

biv.com

2 points

ClearwayLaw

4 hours ago


2 comments

ClearwayLaw 4 hours ago

Ventures increasingly train AI agents on retrieval-augmented generation (RAG) systems that containerize data in small data sets.
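The idea behind that approach can be sketched in a few lines: retrieve the best-matching passage from a small, curated corpus, then constrain the model's prompt to that passage so it can't answer from thin air. This is a minimal illustration only; the corpus, function names, and overlap-based scoring are my own assumptions, not anything from the article.

```python
# Minimal sketch of the retrieval step in a RAG pipeline over a small,
# curated corpus. Corpus contents and names are illustrative assumptions.
from collections import Counter

CORPUS = {
    "leases": "A commercial lease in BC must state the term, rent, and renewal options.",
    "filings": "Annual reports for BC companies are filed within two months of the anniversary date.",
}

def tokenize(text):
    # Lowercase and strip trailing punctuation for crude word matching.
    return [w.strip(".,?!").lower() for w in text.split()]

def retrieve(query, corpus):
    """Return the document with the highest word overlap with the query."""
    q = Counter(tokenize(query))
    def score(doc):
        return sum((q & Counter(tokenize(doc))).values())
    return max(corpus.values(), key=score)

def build_prompt(query, corpus):
    # Ground the model: it may only answer from the retrieved context.
    context = retrieve(query, corpus)
    return ("Answer using ONLY the context below; say 'not found' otherwise.\n"
            f"Context: {context}\nQuestion: {query}")

print(build_prompt("When are annual reports filed?", CORPUS))
```

Real systems swap the word-overlap scorer for embedding similarity, but the grounding pattern, retrieve then constrain, is the same.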

reify 3 hours ago

It's not that difficult, really.

Considering that AI never hallucinated in the first place.

It basically fucks up and squirts out shit.

It's like putting too much animal feed in a cow's mouth and waiting at the other end with a bucket.

"Hallucinate" is a made-up word for the stuff you eventually get in your bucket.

Excuse me for two minutes while I pop to my toilet to hallucinate a big turd.