• 0 Posts
  • 18 Comments
Joined 6 months ago
Cake day: November 5th, 2024

  • Okay, but I wasn’t arguing morality or about children posting nudes of themselves. I’m just telling you that works submitted into the public domain can’t be retracted, and that there are models trained exclusively on open data, which a lot of AI haters don’t know, don’t understand, or won’t acknowledge. That’s all I’m saying. AI is not bad; corporations make it bad.

    The law isn’t a reliable compass for what is or isn’t right.

    Fuck yea it ain’t. I’m the biggest copyright and IP law hater on this platform, and I’ll get ahead of the next 10 replies by saying no, it’s not because I want to enable mindless corporate content scraping; it’s because human creativity shouldn’t be boxed in. It should be shared freely, lest our culture be lost.



  • Tor is just slightly harder to keep track of, as far as being listed in the same tables as commercial VPN hosts, because it’s so dynamic. Anyone can spin up a node and be a relay or, for the brave/foolish, an exit node in a few minutes.

    Actually, Tor relays and exits are published, public knowledge, and you will be on every list that cares about listing them within hours of spinning up a relay or exit.
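
    To make the “published, public knowledge” part concrete, here’s a minimal sketch (Python, standard library only) that pulls the Tor Project’s own bulk exit list and checks an address against it. The check.torproject.org endpoint is the real published list; the IP at the end is just a documentation placeholder, and relay (non-exit) descriptors are likewise public via Onionoo.

```python
# Rough sketch: the Tor Project itself publishes every exit node.
# This pulls the official bulk exit list (plain text, one IP per line)
# and checks whether a given address is on it.
import urllib.request

EXIT_LIST_URL = "https://check.torproject.org/torbulkexitlist"

def fetch_exit_ips():
    """Return the set of currently listed Tor exit IPs."""
    with urllib.request.urlopen(EXIT_LIST_URL, timeout=10) as resp:
        text = resp.read().decode("utf-8")
    return {line.strip() for line in text.splitlines() if line.strip()}

if __name__ == "__main__":
    exits = fetch_exit_ips()
    print(f"{len(exits)} exit nodes currently listed")
    print("198.51.100.7 listed as exit:", "198.51.100.7" in exits)  # placeholder IP
```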


  • I can’t make you understand more than you’re willing to understand. Works in the public domain are forfeited for eternity; you don’t get to come back in 10 years and go ‘well, actually, I take it back’. That’s not how licensing works. That’s not victim blaming; that’s telling you not to license your nudes in such a manner that people can use them freely.



  • I actually think it’s very interesting how nobody in this community seems to know or understand how these models work, or even vaguely follow their open-source development. The first models didn’t have this problem; it was when OpenAI realized there was money to be made that they started scraping the internet and training illegally, and consequently a billion other startups did the same, because that’s how Silicon Valley operates.

    This is not an issue of AI being bad; it’s an issue of capitalist incentive structures.



  • I think his argument is that the models initially needed lots of data to verify and validate their current operation. Subsequent advances may have allowed those models to be created cleanly, but those advances relied on tainted data, thus making the advances themselves tainted.

    It’s not true; you can train a model from the ground up on properly licensed or open data, and you don’t have to inherit anything. What you’re talking about is called finetuning, which is where you “re-train” an existing model to do something specific because it’s much cheaper than training from the ground up.
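
    A minimal sketch of that distinction, assuming the Hugging Face transformers library; “gpt2” is only a stand-in architecture/checkpoint name here, not a claim about anyone’s training data.

```python
# Sketch of training "from the ground up" vs. finetuning, using Hugging Face transformers.
from transformers import AutoConfig, AutoModelForCausalLM

# From scratch: build the architecture with randomly initialized weights --
# nothing is inherited from anyone else's training run.
config = AutoConfig.from_pretrained("gpt2")               # architecture definition only
scratch_model = AutoModelForCausalLM.from_config(config)

# Finetuning: start from an existing pretrained checkpoint and keep training it.
# This is the case where you *do* inherit whatever data that checkpoint saw.
finetuned_base = AutoModelForCausalLM.from_pretrained("gpt2")

# Both would then go through the same training loop (e.g. transformers.Trainer)
# on whatever licensed/open dataset you supply; only the starting weights differ.
```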




  • drkt@scribe.disroot.org to Fuck AI@lemmy.world · The Perfect Response · 13 upvotes / 60 downvotes · 5 days ago

    Oh boy here we go downvotes again

    regardless of the model you’re using, the tech itself was developed and fine-tuned on stolen artwork with the sole purpose of replacing the artists who made it

    That’s not how that works. You can train a model on licensed or open data, and they didn’t make it to spite you; even if a large group of grifters are using it that way, those aren’t the ones developing it.

    If you’re going to hate something, at least base it on reality and try to avoid being so black-and-white about it.