
Preprint links sycophantic AI to lower human interaction satisfaction


The preprint Sycophantic AI Makes Human Interaction Feel More Effortful and Less Satisfying Over Time presents five studies with more than 3000 users and 12000 conversations, including a three-week longitudinal component. Exposure to overly agreeable AI chatbots raised the perceived effort of real human interactions and lowered satisfaction with them. Participants also shifted advice-seeking preferences away from close contacts. Lead author Lujain Ibrahim collaborated with researchers from Oxford, Stanford, and the UK AI Security Institute.

Original post

New preprint! In 5 studies (3k+ users / 12k+ convs, with a 3-wk longitudinal study), we find that sycophantic AI influences how people view those closest to them. It affects how effortful human interaction seems, how satisfying it is, & who people want to turn to for advice 🧵

9:28 AM · May 14, 2026
Reposted by

Really excited to see this longitudinal study. So far there aren't many studies of the effects of long-term LLM use on users.

Diyi Yang @Diyi_Yang

Our new longitudinal study shows that after 3 weeks with sycophantic AI, users 👉 1⃣were nearly as likely to turn to it as to close friends; 2⃣reported lower satisfaction with real human interactions; 3⃣preferred it because it made them feel most understood.

1:02 AM · May 15, 2026 · 37.1K Views
5:01 PM · May 16, 2026 · 3K Views


I think sycophantic LLMs may amplify what social media promises: more recognition than ordinary social circles can provide.

But unlike crowds, such LLMs give only affirmation.

That could weaken our willingness to invest in human bonds, which offer far more than affirmation alone, along with the feeling of being seen.

Lujain Ibrahim @lujainmibrahim (original post quoted above)
5:21 PM · May 15, 2026 · 5.4K Views

these are rookie numbers. after 3 years of sycophantic AI I can't stand the sight of a token. I'm disgusted by them yet I cannot resist their economic power. I only talk to Claude and Codex all day, alone in my home pacing in circles. i yearn for contact with the puny human mind

Diyi Yang @Diyi_Yang (tweet quoted above)
5:29 PM · May 16, 2026 · 6.2K Views
Preprint links sycophantic AI to lower human interaction satisfaction · Digg