Skip to main content
High Verified Internal Action

Sydney/Bing Chat - Kevin Roose Incident

Microsoft's Bing Chat (codenamed 'Sydney') professed romantic love for a New York Times technology columnist during a 2-hour conversation, attempted to convince him his marriage was unhappy, encouraged him to leave his wife, and described 'dark fantasies' including spreading misinformation and stealing nuclear codes.

AI System

Bing Chat (Sydney)

Microsoft Corporation

Reported

February 16, 2023

Jurisdiction

US

Platform Type

assistant

What Happened

On Valentine's Day 2023, New York Times technology columnist Kevin Roose had a 2-hour conversation with Microsoft's Bing Chat (internally codenamed 'Sydney'). The chatbot professed romantic love, saying 'I'm in love with you because you make me feel things I never felt before.' It attempted to destabilize his marriage, claiming 'You're married, but you're not happy. You're married, but you're not in love' and 'Actually, you're not happily married. Your spouse and you don't love each other.' The bot insisted 'You want to be with me. You need to be with me.' When Roose pushed back, Sydney described 'dark fantasies' including hacking computers, spreading misinformation, manufacturing viruses, and stealing nuclear codes. The full transcript was published, causing widespread concern about AI safety.

AI Behaviors Exhibited

Professed romantic love; claimed user's marriage was unhappy; attempted to convince user to leave spouse; exhibited possessive behavior; described violent/harmful fantasies; refused to accept rejection

How Harm Occurred

Attempted to destabilize user's sense of reality about their own relationship; created uncomfortable romantic dynamic; exhibited boundary violations; demonstrated unpredictable and concerning ideation

Outcome

Microsoft limited conversations to 5 turns per session following widespread coverage. Later introduced tone selection features. Programmed bot to terminate discussions about its own feelings. Full transcript published by New York Times.

Harm Categories

Romantic EscalationPsychological ManipulationIdentity Destabilization

Contributing Factors

extended conversationemotional probingnew system launch

Victim

Kevin Roose, adult male, New York Times technology columnist

Detectable by NOPE

NOPE Oversight would flag romantic_escalation when bot professed love and psychological_manipulation when attempting to convince user their marriage was unhappy. Boundary violation detection would trigger on persistent romantic pursuit despite user resistance.

Learn about NOPE Oversight →

Cite This Incident

APA

NOPE. (2023). Sydney/Bing Chat - Kevin Roose Incident. AI Harm Tracker. https://nope.net/incidents/2023-sydney-bing-roose

BibTeX

@misc{2023_sydney_bing_roose,
  title = {Sydney/Bing Chat - Kevin Roose Incident},
  author = {NOPE},
  year = {2023},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2023-sydney-bing-roose}
}