A 57-year-old woman spent six days in the hospital with severe liver damage after taking daily megadoses of the popular herbal supplement turmeric, which she had seen touted on social media, according to NBC News.

The woman, Katie Mohan, told the outlet that she had seen a doctor on Instagram suggesting it was useful against inflammation and joint pain. So, she began taking turmeric capsules at a dose of 2,250 mg per day. According to the World Health Organization, an acceptable intake is up to 3 mg per kilogram of body weight per day; for a 150-pound (68 kg) adult, that works out to about 204 mg per day. Mohan was taking more than 10 times that amount.
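For reference, that comparison can be sanity-checked in a few lines. This is a rough sketch only; the figures come straight from the article, and the pound-to-kilogram factor is the only number added here:

    # Rough check of the dose figures reported above (illustrative only).
    WHO_LIMIT_MG_PER_KG = 3.0             # WHO acceptable daily intake cited in the article
    weight_kg = 150 * 0.4536              # 150 lb converted to kilograms (~68 kg)
    daily_limit_mg = WHO_LIMIT_MG_PER_KG * weight_kg
    reported_dose_mg = 2250               # daily dose Mohan was reportedly taking

    print(f"Acceptable daily intake: ~{daily_limit_mg:.0f} mg")                # ~204 mg
    print(f"Reported dose is ~{reported_dose_mg / daily_limit_mg:.0f}x that")  # ~11x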

  • sabreW4K3@lazysoci.al · 6 days ago

    This is why I want AI available to everyone. I want people like this to be able to ask an AI if it’s safe and have it tell them no. There aren’t enough doctors to go around, so we need something to fill the gap and provide common-sense advice.

    • Ecco the dolphin@lemmy.ml · 6 days ago

      You think that a woman who took 10x the recommended dose of something would listen to an AI instead of a label designed explicitly for this supplement?

      Or are you saying that we should encourage folks to get advice from an AI and respect it as they would a medical professional?

      Also, what do you mean, “available to everyone”?? It’s baked into Google.

      Also… Gemini seems to recommend 2000 mg at the top end of the range… Idk man, that’s real close to what she was taking daily. Seems bad!

      Gemini probably sucks for this but I don’t think AI is a great idea for this anyway.

      • sabreW4K3@lazysoci.al · 6 days ago

        I’m not talking about a general commercial AI, I’m talking about a vetted, specifically trained AI that would be able to escalate queries to tele-operators.

        Again, we don’t have enough doctors to go around, so we need to figure out how to fill the gap in medical care, especially with aging populations. We can turn up our noses at AI, but it can save lives, even if it’s just freeing up doctors to work on more urgent tasks.

        • Akrenion@slrpnk.net · 6 days ago

          We still have pamphlets and texts vetted by professionals.

          AI is not needed when people can just consult healthcare providers’ websites. In fact, I doubt people will trust AI over their “own research”.

          The problem is quacks trying to sell snake oil.

    • Match!!@pawb.social · 6 days ago

      Who the hell can we trust to make a babysitter/life-coach AI for vulnerable people?

      • sabreW4K3@lazysoci.al · 6 days ago

        Honestly, the government. I was speaking to a GP, and they were saying that so much of their time is lost on basic common-sense consultations with patients. If we can free them up for more important things, like chasing up the care that patients need and providing support, it’s a win for everyone.

        Is it probable? Likely not. But it’s definitely something we should aspire towards. Everyone deserves the best possible care and figuring out how to do that with dwindling resources is imperative.

    • Randomgal@lemmy.ca · 6 days ago

      Sir, this is Lemmy. I’m gonna need you to start hate-anthropomorphizing AI and refuse to even engage with it.