A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

  • kingthrillgore@lemmy.ml · 9 hours ago (edited)
    Of course men will go to an AI for their problems, they can’t fathom going to a woman for honest advice.

    And as a result, they gaslight themselves with a worse version of ELIZA.

    • InfiniteGlitch@lemmy.dbzer0.com · 9 hours ago
      > Of course men will go to an AI for their problems, they can’t fathom going to a woman for honest advice.

      This seems a bit far-fetched, don’t you think? There could be many reasons why someone would rather use AI than go to another person for advice (and this isn’t just about women).

      Honestly, as someone who actually went to therapy (and yes, my therapist was a woman), it was quite tough to open up and be vulnerable.

      I think some people using AI might feel less vulnerable because it isn’t a person. However, they don’t realize that their data is being gathered.

      > And as a result, they gaslight themselves with a worse version of ELIZA.

      With this, I can’t figure out whether you’re serious, trolling, or just writing randomly.