• DarkThoughts
    5 months ago

    Yes, but what would a local model do for you in this case? Chatbots in browsers are typically used as an alternative, more contextualized search engine, and for that you need proper access to a search index. Most people also won’t have enough computing power to run a more complex model or handle larger context windows.

    • @kakes
      5 months ago

      Pennomi wrote a whole list of potential ideas. And honestly, while I agree that local LLMs on typical hardware are underpowered for most tasks, they could still offer the option for those who can run them.

      People are getting all upset over this announcement without even knowing what their plan actually is, as if the word “AI” is making them foam at the mouth or something. I’m just saying we should reserve judgement until we have an idea of what’s happening.

        • @kakes
          5 months ago

          Yes, and then you asked for ideas, which were in the comment you replied to.

          • DarkThoughts
            5 months ago

            I honestly have no idea what you’re referring to now. I never asked for any ideas.