So all you do is create phrases based on things you’ve read in the past, recognize similar interactions between other people, and recreate them? 🤔
No, we also transfer genetic material to similar-looking (but not too similar-looking) people and then teach those new people the pattern matching.
My point: Reductionism just isn’t useful when discussing intelligence.
Man… I must be smart as heck to be able to come up with my own thoughts then…
Idk man, I’m pretty sure I can find all of those words in a dictionary.
As opposed to what, exactly?
Forming your own thoughts because you reasoned by yourself?
AI just goes “I’ve seen X before, someone answered Y, therefore I will answer Y.” In its current state it can’t decide “I’ll answer something nonsensical just for the lulz,” because it doesn’t know whether Y is right or wrong; it just knows that, across billions of lines of text, X was most often followed by Y, so X = Y. If X had always been answered with a nonsensical answer, it would repeat that answer even with access to information proving it wrong. Which is also why there’s a lot of bad info being shared by AI.
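The claim above can be sketched as a toy frequency table (a deliberate caricature with made-up data — real models predict tokens from learned statistics, not a literal lookup):

```python
from collections import Counter

# Hypothetical "training corpus": (prompt, answer) pairs as seen in text.
# The "model" below just replies with whatever answer it saw most often
# for a given prompt, with no notion of whether that answer is true.
corpus = [
    ("what color is the sky", "blue"),
    ("what color is the sky", "blue"),
    ("how many legs does a spider have", "six"),    # wrong, but frequent
    ("how many legs does a spider have", "six"),    # wrong again
    ("how many legs does a spider have", "eight"),  # right, but rare
]

counts = {}
for prompt, answer in corpus:
    counts.setdefault(prompt, Counter())[answer] += 1

def reply(prompt):
    # Most frequent answer wins, even if it is nonsense.
    return counts[prompt].most_common(1)[0][0]

print(reply("how many legs does a spider have"))  # -> six
```

If the wrong answer dominates the data, the wrong answer is what comes back out — which is the “bad info” failure mode described above.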