• HardlightCereal@lemmy.world
    1 year ago

    Language is a method for encoding human thought. Mastery of language is mastery of human thought. The problem is that predictive-text heuristics don’t have mastery of language, and they cannot predict the desired output.

    • cloudy1999@sh.itjust.works
      1 year ago

      I thought this was an insightful comment. Language is a kind of ‘view’ (in the model–view–controller sense) of intelligence. It signifies a thought or meme. But language is imprecise and flawed. It’s a poor representation, since it can be misinterpreted or distorted. I wonder if language-based AIs are inherently flawed, too.

      Edit: grammar, ironically
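
      The MVC analogy above can be sketched in code. This is a toy illustration under the comment's own assumption (all class and function names here are hypothetical): the "model" is the precise internal thought, the "view" is its lossy rendering into language, and a reader reconstructing the thought from the view alone loses information.

      ```python
      class Thought:
          """Model: the precise internal state (the idea itself)."""
          def __init__(self, value: float):
              self.value = value

      def render(thought: Thought) -> str:
          """View: a lossy, human-readable projection of the model."""
          return f"about {round(thought.value)}"

      def interpret(text: str) -> float:
          """A listener's attempt to reconstruct the model from the view."""
          return float(text.removeprefix("about "))

      original = Thought(2.7)
      spoken = render(original)        # the thought, encoded as language
      understood = interpret(spoken)   # the reconstruction: precision is gone
      ```

      The round trip `interpret(render(t))` does not recover `t.value`, which is the distortion the comment is pointing at.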

      • HardlightCereal@lemmy.world
        1 year ago

        Language-based AIs will always carry the biases of the language they speak. I am certain a properly trained bilingual AI would be smarter than a monolingual AI of the same skill level.

    • MajorHavoc@lemmy.world
      1 year ago

      “Mastery of language is mastery of human thought” is easy to prove false.

      The current batch of AIs is an excellent data point. These things are very good at language, and they still can’t even count.

      The average celebrity provides evidence that it is false. People who excel at science often suck at talking, and vice versa.

      We didn’t talk our way to the moon.

      Even if these LLMs do master language, that still isn’t evidence that they’re doing any actual thinking, yet.