Part 4 Pattern: Reduce friction
People have different expectations for interacting with a chatbot than for interacting with other humans. Humans may chit-chat with each other at the start of an interaction, but chatbots usually get right to business. Humans are good at smoothing over misunderstandings, while chatbots are sometimes, well, robotic.
Users may have a personal bias against chatbots before an interaction even starts. With a phone-based AI, they may immediately press and hold the 0 key to reach an operator; with a text-based chatbot, they may repeatedly type “representative.” Or they may start this behavior only after the AI makes a mistake. In either case, we call these opt-outs: the user is opting out of the AI experience and opting for an interaction with a human.
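To make the idea concrete, here is a minimal sketch of how an assistant might recognize an opt-out turn, assuming a phone channel that reports DTMF key presses and a text channel that passes the raw utterance. The phrase list, function name, and DTMF handling are illustrative assumptions, not the implementation the chapters describe.

```python
import re
from typing import Optional

# Hypothetical opt-out detector. The phrase list and the DTMF check are
# illustrative assumptions, not an exhaustive or production-ready rule set.
OPT_OUT_PHRASES = re.compile(
    r"\b(representative|agent|operator|real person|human)\b", re.IGNORECASE
)

def is_opt_out(user_text: str, dtmf_digit: Optional[str] = None) -> bool:
    """Return True when the turn looks like a request for a human."""
    if dtmf_digit == "0":  # phone channel: the caller pressed 0
        return True
    return bool(OPT_OUT_PHRASES.search(user_text))

print(is_opt_out("representative"))               # True
print(is_opt_out("", dtmf_digit="0"))             # True
print(is_opt_out("I need to reset my password"))  # False
```

A keyword check like this is only a starting point; in practice the same signal can also come from an intent classifier trained on opt-out utterances.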
Chapter 11 digs into why users opt out at different points in an interaction, from beginning to end, and offers techniques that make opt-outs less likely. Opt-outs can never be eliminated entirely, so chapter 12 shows what to do when they happen: summarizing the AI interaction in a way that is useful to the human agent who will pick up where the AI left off.
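As a rough sketch of that kind of handoff, the example below collects what the AI has already learned into a short note a human agent could read before continuing the conversation. The field names and output format are assumptions for illustration, not the structure chapter 12 prescribes.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class HandoffSummary:
    """Hypothetical shape for the context an AI passes to a human agent."""
    user_goal: str                                          # what the user was trying to do
    collected_info: Dict[str, str] = field(default_factory=dict)  # details already gathered
    completed_steps: List[str] = field(default_factory=list)      # what the AI already did
    transfer_reason: str = "user requested a human"

    def to_agent_note(self) -> str:
        """Render the summary as a plain-text note for the agent desktop."""
        lines = [f"Goal: {self.user_goal}", f"Transfer reason: {self.transfer_reason}"]
        lines += [f"Already provided: {k} = {v}" for k, v in self.collected_info.items()]
        lines += [f"Completed: {step}" for step in self.completed_steps]
        return "\n".join(lines)

summary = HandoffSummary(
    user_goal="reset online banking password",
    collected_info={"identity verified": "yes"},
    completed_steps=["looked up account", "sent verification code"],
)
print(summary.to_agent_note())
```

The point of a summary like this is that the user should not have to repeat information the AI already collected; the agent sees the goal, what was gathered, and why the conversation was transferred.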