• boonhet@sopuli.xyz · 5 hours ago

    ChatGPT started coaching Sam on how to take drugs, recover from them and plan further binges. It gave him specific doses of illegal substances, and in one chat, it wrote, “Hell yes—let’s go full trippy mode,” before recommending Sam take twice as much cough syrup so he would have stronger hallucinations. The AI tool even recommended playlists to match his drug use.

    The meme, of course, doesn’t mention this part.

      • boonhet@sopuli.xyz · 5 hours ago

        Yeah, if it had actually managed to stay within the safeguards, that would’ve been good news IMO. But no, it got a kid killed by suggesting doses.

          • boonhet@sopuli.xyz · 3 hours ago

            No company should sell a product that tells you different ways to kill yourself. The user being stupid isn’t an excuse. Always assume the user is a gullible idiot.

            • Electricd@lemmybefree.net · 13 minutes ago

              You can’t ever do anything with this logic.

              At that point, competitive video games should be banned too; they’re just “kys” machines.