• LainTrain@lemmy.dbzer0.com · 4 months ago

      What would the threshold be for them to “take off”? It’s all already out there, so aren’t they there already?

      • umbrella@lemmy.ml · 4 months ago

        It’s been a while, but last I tried, it wasn’t as good as the proprietary models.

          • umbrella@lemmy.ml · edited · 4 months ago

            I tried the Llama model for text, and another one meant for images; I can’t quite remember the name, but it was one of the main ones.

            Are they any good now? Running an LLM locally actually sounds mildly useful.

              • LainTrain@lemmy.dbzer0.com · 3 months ago

                Honestly, I think speed is something I don’t care too much about with models: even something like ChatGPT will be slower than Google for most things, and if a task is complex enough to be a good use case for an LLM, generation speed is unlikely to be the primary bottleneck.

                My gf’s private chat bot right now is Mistral 7B with a custom finetune, and it directs some queries to ChatGPT if I ask (I got free tokens way back, might as well burn through them).
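A minimal sketch of that kind of query routing (all names here are hypothetical, not the poster's actual setup): everything defaults to the local model, and the hosted API is used only on explicit request.

```python
# Toy router in the spirit described above: default to the local model,
# escalate to the hosted API only when explicitly asked.
# Backend names are made up for illustration.

def route_query(prompt: str, force_remote: bool = False) -> str:
    """Return which backend should handle this prompt."""
    if force_remote:
        return "chatgpt"          # spend the old free tokens
    return "mistral-7b-local"     # default: private, free, local

print(route_query("what's the weather like?"))              # mistral-7b-local
print(route_query("write me a sonnet", force_remote=True))  # chatgpt
```

In a real setup each branch would call the corresponding client (a local inference server vs. the OpenAI API); the routing logic itself stays this simple.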

                How much of an improvement is Mixtral over Mistral in practice?

                • just another dev@lemmy.my-box.dev · 3 months ago

                  SillyTavern, by any chance?

                  And I’d say the difference between Mistral and Mixtral is pretty big for general usage; it feels like a next generation.
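Background not stated in the thread, but part of why Mixtral can feel like a generational jump: it is a sparse mixture-of-experts model, where a gate scores 8 expert networks per token and only the top 2 actually run, so far more parameters are available while compute per token stays close to a dense 7B model. A toy scalar illustration of that top-2 gating (numbers made up):

```python
import math

def top2_mixture(gate_scores, expert_outputs):
    """Softmax-weight the two highest-scoring experts; ignore the rest."""
    top2 = sorted(range(len(gate_scores)), key=lambda i: -gate_scores[i])[:2]
    weights = [math.exp(gate_scores[i]) for i in top2]
    total = sum(weights)
    return sum((w / total) * expert_outputs[i] for w, i in zip(weights, top2))

# 8 "experts"; the gate strongly prefers experts 2 and 6 for this token,
# so the other six contribute nothing (and need not be computed at all).
scores  = [0.0, 0.1, 5.0, 0.2, 0.0, 0.1, 5.0, 0.3]
outputs = [9.9, 9.9, 2.0, 9.9, 9.9, 9.9, 4.0, 9.9]
print(top2_mixture(scores, outputs))  # 3.0: equal weights on experts 2 and 6
```

The real model does this per token per layer with full feed-forward networks as the experts, but the sparsity idea is the same: only the selected experts' outputs are computed and mixed.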