Looking for human connections: Allison Pugh on why care work shouldn’t be outsourced to generative AI

Could a generative AI chatbot have a real relationship with a person? Allison Pugh, a sociologist, argues no: “there is no human relationship when one half of the encounter is a machine,” she writes. In her argument, Pugh describes how care-focused generative AI platforms, such as those used in educational, medical, and therapeutic settings, damage human relationships by making people feel both invisible and disconnected from each other.

Allison Pugh, “When AI Automates Relationships,” TIME, 14 August 2024.

 

  1. Pugh introduces the term “connective labor” in her argument. In your own words, define this term. Then name three jobs Pugh mentions that rely on this kind of labor.
  2. According to Pugh, how has generative AI worsened the working conditions of people in care-focused jobs? Describe one example Pugh gives that illustrates this issue.
  3. Identify one place in Pugh’s argument where she uses metacommentary to clarify and emphasize her claims. Evaluate the effectiveness of this metacommentary: how does it help you as a reader follow and understand her points?
  4. Watch this ad, which Google released during the 2024 Summer Olympics. What do you think Pugh’s reaction would be to the ad? Why? Use a short quote from her essay to explain Pugh’s perspective and support your answer.
  5. Pugh references a “depersonalization crisis,” an issue addressed in the 2023 Surgeon General’s report, “Our Epidemic of Loneliness and Isolation.” Read the letter from the Surgeon General, Dr. Vivek Murthy, on pages 4-5 of the report. How does loneliness impact both individuals and society, according to Murthy? Do you think generative AI can be used ethically to address loneliness and social disconnection? Why or why not?

3 thoughts on “Looking for human connections: Allison Pugh on why care work shouldn’t be outsourced to generative AI”

  1. mybread082b0866cf

    The article argues that while most public debates about artificial intelligence focus on job loss, bias, and surveillance, we are overlooking a deeper problem: the erosion of human relationships. The author calls this a “depersonalization crisis,” suggesting that when one side of a human interaction is replaced by a machine, the relationship itself fundamentally changes. Drawing on interviews with teachers, therapists, and medical professionals, she claims that “connective labor”—the act of truly seeing and bearing witness to another person—is both meaningful and socially necessary. In her view, AI not only threatens jobs but also undermines the human bonds that hold communities together. I understand this concern, but I think her argument reflects a limited understanding of both artificial intelligence and human cognition.

    First, the article treats AI as if it is attempting to replace human connection rather than serve as a tool that mediates or enhances it. Artificial intelligence does not possess intention, emotion, or consciousness; it processes patterns and generates outputs based on data. To argue that a relationship disappears when “one half of the encounter is a machine” assumes that technology and humanity are mutually exclusive. Historically, however, tools have always reshaped human labor without eliminating human meaning. The printing press did not destroy storytelling. The telephone did not destroy friendship. Similarly, AI-assisted therapy apps or educational tools may alter the format of interaction, but they do not eliminate the possibility of authentic human connection.

    Second, the article appears to resist analyzing how human thought itself operates. Much of what we call “understanding” or “being seen” relies on pattern recognition, empathy scripts, and learned behavioral responses—processes that AI models attempt to simulate statistically. Recognizing that human cognition has structured patterns does not diminish its value. Instead, it clarifies that AI is extending certain cognitive functions, not replacing consciousness. The comparison to earlier industrial technologies is useful: just as textile machines did not eliminate textile workers but transformed their roles, AI reshapes connective labor rather than erasing it entirely.

    Finally, the article may reflect professional anxiety as much as philosophical critique. When new technologies emerge, it is common for workers to fear displacement. Yet the deeper question is not whether AI can “replace” human connection, but how societies choose to integrate these tools. If institutions overload workers, that is a policy problem, not an inherent property of AI. Artificial intelligence remains a tool. Whether it fragments or strengthens social bonds depends on how humans design, regulate, and use it.

  2. Mina Li

    I agree with Allison Pugh’s view that caregiving work should not be outsourced to generative artificial intelligence. Pugh’s argument reveals the central value of emotion and care in human interaction. Connective labor, the work of building connections between people through understanding, empathy, and response, is an indispensable foundation in fields such as education, healthcare, and psychological counseling. The professions she mentions, such as nursing and counseling, depend overwhelmingly on this kind of labor: practitioners do not merely convey knowledge and skills but, more importantly, meet the needs of those they serve through emotional support and human care.

    While the intervention of generative artificial intelligence seems to enhance efficiency, in reality it weakens connection. Pugh argues that AI’s standardized responses and interactions can make service recipients feel objectified rather than seen, and that this dehumanizing process exacerbates feelings of loneliness. For example, when students face AI mentors, they may receive prompt answers but lose the opportunity to genuinely understand a teacher’s thinking, a loss that can compromise the development of their social abilities. Pugh’s attention to how these technologies actually function also helps readers see the flaws in the logic of replacing human labor with generative AI.

    When read alongside the “depersonalization crisis” and the 2023 Surgeon General’s report on loneliness, Pugh’s argument becomes even more compelling: when technology intervenes in intimate areas of life, people face the risk of emotional desertification. Preserving human connection in caregiving work therefore safeguards the dignity of practitioners and, more importantly, the emotional fabric of society. For these reasons, I fully agree with Pugh’s argument that “there is no human relationship when one half of the encounter is a machine.”
