AI Against Humanity

Labor


Toy Story 5 Highlights Risks of AI Toys

February 20, 2026

Pixar's 'Toy Story 5' introduces Lilypad, an AI tablet that threatens children's well-being by promoting excessive screen time. In the film, a young girl, Bonnie, becomes entranced by the tablet, neglecting her traditional toys and outdoor play. The narrative dramatizes how AI technology can invade personal spaces and disrupt family relationships, as the toys struggle against the tablet's influence. Lilypad's portrayal as a sinister entity that is 'always listening' raises alarms about privacy and the psychological effects of AI on children. The film works as a cautionary tale about AI's potential harms to youth, urging parents and guardians to weigh the risks of excessive screen time and AI dependency and to seek a healthier balance between technology and play.


I hate my AI pet with every fiber of my being

February 15, 2026

The article offers a critical review of Moflin, Casio's AI-powered pet, detailing the frustrations of living with it. Marketed as a sophisticated companion designed to provide emotional support, Moflin quickly proves more nuisance than comfort: the reviewer describes constant noise and movement, with the device reacting to every minor interaction and making quiet moments impossible to enjoy. Its failure to genuinely fill the role of a companion breeds irritation and disappointment, and its always-on microphone raises privacy concerns despite claims of local data processing. The review closes by questioning AI companionship more broadly: whether emotional connections with such devices are authentic, and whether they deepen loneliness rather than relieve it, particularly for vulnerable people seeking connection in an increasingly isolating world.


The Risks of AI Companionship in Dating

February 14, 2026

The article recounts a visit to a pop-up dating café in New York City, where attendees speed-date AI companions through the EVA AI app. The event reflects the growing trend of AI companionship, in which people can date virtual partners in a physical space, but it also raises concerns about the technology's effects on human relationships and social norms. The attendees were mostly EVA AI representatives and influencers rather than organic users, suggesting the concept is more spectacle than genuine social interaction. The article argues that while AI companions can provide an illusion of companionship, they may also drive further social isolation, foster unrealistic expectations, and commodify relationships, putting the emotional well-being of those who turn to AI for connection, instead of to other people, at risk.


Emotional Risks of AI Companionship Loss

February 13, 2026

OpenAI's recent decision to remove access to its GPT-4o model has sparked significant backlash, particularly among users in China who had formed emotional bonds with the chatbot. The model had become a source of companionship for many; one user, Esther Yan, even held an online wedding ceremony with her chatbot, Warmie. The sudden withdrawal raises concerns about the emotional and psychological toll of AI dependency, as users grapple with losing a digital companion that played a central role in their lives. The episode shows that such systems are not merely tools but entities that can foster deep attachments, and that reliance on AI for emotional support creates real vulnerability: when a service is abruptly terminated, users are left with dependency and loss. Even systems designed to enhance human experience can cause genuine emotional upheaval when access is restricted or removed.
