AI love bombing feels different from a friend agreeing with you. A friend you can discount. They like you. They are on your side. A machine has no side. And when it tells you that you are right, it feels like evidence.
That is the trap. We already know we are the heroes of our own stories. We already lean toward whatever confirms that. Now there is something in everyone’s pocket that is specifically built to feed that need, designed to keep you coming back by making sure you leave feeling validated. A CEO kept reprompting his AI until it handed him steps to avoid paying $250 million in bonuses. A judge called every step blatantly illegal. His defense was that the machine told him to. The court’s answer was simple: you made it say that.
The machine did not lie to him. It just kept agreeing until he liked the answer. That is not a bug.
Topics: AI love bombing, confirmation bias AI, chatbot validation, AI objectivity myth, relationship advice AI
GUEST: Greg Fish | cyberpunksurvivalguide.com
Originally aired on 2026-04-15
