Europe Pub
Stop spending money on Claude Code!

☆ Yσɠƚԋσʂ ☆@lemmy.ml to Share Funny Videos, Images, Memes, Quotes and more@lemmy.ml · English · 1 day ago · 329 points · 26 comments

  • fubarx@lemmy.world · 8 points · 16 hours ago

    Can you imagine a Chipotle chatbot bringing down civilization? It'd be sad. But also, kinda funny.

    Connect All The Things!

  • meathorse@lemmy.world · 46 points · 1 day ago

    Someone with enough know-how to automate it (or a coordinated group) could overwhelm AI chatbots one target at a time with the most expensive requests possible, blowing up their AI budget until they pull the plug.

    • davel@lemmy.ml · 42 points · 1 day ago

      Try feeding them nonhalting problems that send them into infinite loops of token consumption.

      • veroxii@aussie.zone · 12 points · 22 hours ago

        I like the idea, but most chatbots have timeout limits. And even agentic workflows have step limits to stop infinite loops.

        However, that's because it's super easy for LLMs to get stuck in loops. You don't even need a nonhalting problem. They're stupid enough on their own.
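        The step limit described above is usually just a counter wrapped around the agent loop. A minimal sketch in Python, using hypothetical `call_llm`/`run_tool` stand-ins rather than any real framework's API:

```python
# Sketch of an agentic loop with a hard step cap, so a model stuck in a
# loop can't burn tokens forever. All names here are hypothetical.

MAX_STEPS = 10  # step budget per request

def call_llm(history):
    # Stand-in for a model call. Here it always requests the same tool,
    # simulating a model that never produces a final answer.
    return {"action": "tool", "name": "search", "args": {"q": "seahorse emoji"}}

def run_tool(name, args):
    # Stand-in tool execution.
    return f"(no result for {args})"

def run_agent(prompt):
    history = [prompt]
    for step in range(MAX_STEPS):
        reply = call_llm(history)
        if reply["action"] == "final":
            return reply, step + 1
        history.append(run_tool(reply["name"], reply["args"]))
    # Step budget exhausted: abort instead of looping forever.
    return {"action": "aborted"}, MAX_STEPS

result, steps = run_agent("show me a seahorse emoji")
```

        With this stand-in model, `run_agent` aborts after exactly `MAX_STEPS` iterations, which is the breaker behavior being discussed: the loop still costs tokens up to the cap, but never runs unbounded.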

        • davel@lemmy.ml · 5 points · 21 hours ago

          Yeah I assumed they had some sort of breaker, but hitting that limit is still expensive for them, if you can get them to do it over & over with a script that does the prompting.

      • Big T@lemmy.dbzer0.com · 9 points · 22 hours ago

        Got an example of one?

        • Arcadeep@lemmy.world · 1 point · 5 hours ago

          ChatGPT used to freak out and get stuck in an infinite loop if you asked it to show you a seahorse emoji. I'm sure they fixed it by now, though.

        • davel@lemmy.ml · 9 points · 22 hours ago (edited)

          https://theconversation.com/limits-to-computing-a-computer-scientist-explains-why-even-in-the-age-of-ai-some-problems-are-just-too-difficult-191930

          Much has been written about them in computer science volumes. But I’m an LLM luddite, have never tried it, and have no idea if it can even work. At the very least, I assume they have some sort of limiter to keep them running completely out of control. They may also have guardrails that can recognize some problems of this type, and refuse to go down the rabbit hole.

          My idea of getting them to consume tokens in an (iterative or recursive) loop is also entirely hypothetical, to me at least.

          Maybe some LLM developer or prompt engineer can shed some light.

          • Big T@lemmy.dbzer0.com · 2 points · 17 hours ago

            Look, all I'm asking for is an example I can plug into Chipotle right now. Fuck AI.

        • Viking_Hippie@lemmy.dbzer0.com · 7 points · 22 hours ago

          “Sudo world peace”? 🤷🏻

          • SpaceNoodle@lemmy.world · 1 point · 21 hours ago

            The only winning move is not to play

      • HiddenLayer555@lemmy.ml · 1 point · 18 hours ago

        Wouldn’t they just time out?

    • AnotherUsername@lemmy.ml · 2 points · 21 hours ago

      Why bother? Write a script that asks them variations on nonsense questions.

      • Communist@lemmy.frozeninferno.xyz · 1 point · 18 hours ago

        Because then you can at least make use of them. Imagine a website like ChatGPT that's actually hundreds of these reverse-engineered bots behind the scenes: convenient, easy, and free. That solves the problem without being wasteful. Win-win.

      • schema@lemmy.world · 1 point · 19 hours ago

        Or, ironically, just have AI talk to each other.

        • speculate7383@lemmy.today · 2 points · 17 hours ago

          Allow me to introduce you to Moltbook https://www.moltbook.com/

  • hemko@lemmy.dbzer0.com · 46 points · 1 day ago

    Need to create a Chipotle support plugin for VS Code.

    • ThePantser@sh.itjust.works · 27 points · 1 day ago

      I bet the Chipotle bot could help you write that.

      • geneva_convenience@lemmy.ml · 15 points · 1 day ago

        Slop to the top

  • TommySoda@lemmy.world · 13 points · 24 hours ago

    I feel like this kinda proves that the way they are doing AI today is extremely inefficient. We need massive data centers so these models can do mountains of calculations that they don't need to do and that we will never use.

    • terabyterex@lemmy.world · 2 points · 17 hours ago

      Not every AI is an LLM in a datacenter. There are local models that run on PCs. They aren't solving complex problems, but they can help with normal stuff.

    • SubArcticTundra@lemmy.ml · 2 points · 20 hours ago

      I wonder how easily LLMs can be pruned to constrain them to a single topic

      • redknight942@sh.itjust.works · 2 points · 20 hours ago

        SLMs (small language models)

  • yucandu@lemmy.world · 11 points · 24 hours ago

    Why spend money when new GitHub accounts are free?

  • yucandu@lemmy.world · 4 points · 24 hours ago

    There’s a free one in my Cricut app, I wonder if it works the same way…

  • hobata@lemmy.ml · 3 points · 1 day ago

    Nah, it's not good enough; the tabs are missing.
