• totesmygoat@piefed.ca
    29 days ago

    I’m sorry, but if you are following ChatGPT’s medical advice… it’s the kind of thing that will take care of itself.

    • DougPiranha42@lemmy.world
      29 days ago

      It’s not like a user asked vanilla ChatGPT random questions and took the responses as medical advice. This is a service OpenAI marketed specifically for medical use. If we want to keep up the pretense that consumer protection laws and regulations exist and matter, this is a big deal, and the blame is on the vendor, not on the user.

  • Kirp123@lemmy.world
    29 days ago

    They’ll get sued and will have to prevent it from dispensing medical advice. Honestly, I’m surprised they haven’t been sued already.

  • Buffalox@lemmy.world
    29 days ago

    This is actually criminal activity: it poses as qualified expert advice but is nothing more than a con, delivering totally unqualified advice.
    It can only be categorized as fraudulent in any lawful society where quackery is illegal.
    ChatGPT needs to be investigated, and legal proceedings need to be brought against them, with the goal of shutting down the “service” and imposing heavy fines for putting lives at risk.