The problem is that they are also prone to making up justifications for why they are correct.
There are various techniques to try to identify and correct hallucinations, but they all increase the cost, and none is a silver bullet.
But the rate at which hallucination occurs decreased with the last jump in pretrained models, and will likely decrease further with the next jump too.
This is incorrect, as was shown last year with the Skill-Mix research: