• tfm (OP) · 5 months ago

    Yeah, that’s bad if you care about AI.

    • inlandempire@jlai.lu · 5 months ago

      Nah I mean, I was hoping it would be fully self-hosted and offline, but I guess that would require you to run the models yourself.

      • tfm (OP) · 5 months ago

        Yeah, I think this is part of their current business model, unfortunately.

        I mean, it wouldn’t be hard for them to add support for OpenAI-compatible APIs. That would give access to pretty much any AI service, including self-hosted ones like Ollama (see the sketch below).

        But this would make their service unnecessary.
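
        To illustrate the point: Ollama exposes an OpenAI-compatible endpoint under /v1, so any standard OpenAI client can talk to a local model. A minimal sketch in Python, assuming Ollama is running on localhost:11434 with a llama3 model already pulled (both are assumptions about your local setup, not anything this app actually supports):

        ```python
        # Talk to a self-hosted Ollama server through its OpenAI-compatible API.
        # Assumes: Ollama running at localhost:11434 and `ollama pull llama3` done.
        from openai import OpenAI

        client = OpenAI(
            base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
            api_key="ollama",  # ignored by Ollama, but the client requires a value
        )

        resp = client.chat.completions.create(
            model="llama3",
            messages=[{"role": "user", "content": "Say hello in one sentence."}],
        )
        print(resp.choices[0].message.content)
        ```

        The same client works against any OpenAI-compatible backend just by changing base_url, which is exactly why offering that option would cut their hosted service out of the loop.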