11 Reducing opt-outs

 

This chapter covers

  • Identifying why users want to reach a human agent
  • Preventing users from immediately wanting to opt out
  • Keeping users engaged with your conversational AI
  • Using generative AI to create friendlier dialogue messages
  • Deciding when to involve a human agent (and when not to)

The term “opt out” refers to a user attempting to exit a virtual agent experience, usually with the intention of reaching a human agent. You might also see this described as escalating or zeroing out (pressing zero on a phone’s dial pad to reach an operator). Opt-outs can be costly: chatbots are an investment, and they must demonstrate a return in business value to remain viable. Containment loss from too many opt-outs can sink a business case.

Users opt out for a variety of reasons, and different reasons often call for different strategies to resolve. Regardless of the type of bot you are managing, whether it’s a voice, text, FAQ, process-oriented, or routing agent, identifying where users opt out within the conversation gives you clues about why they did so. Learning why users opt out helps you design an experience that minimizes opt-outs, which should improve your containment.
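
For example, if your platform logs each conversation turn along with the dialogue node it reached and whether the user asked for an agent, a short script can tally where opt-outs occur. The following sketch is a minimal illustration in Python, assuming a hypothetical log format (a list of sessions, each a list of turn records with node and opted_out fields); adapt it to whatever your logging or analytics pipeline actually produces.

from collections import Counter

def count_opt_outs_by_node(sessions):
    """Tally the dialogue node each user was on when they opted out.

    Assumes a hypothetical log format: each session is a list of turn
    dicts with a 'node' field and an optional 'opted_out' flag.
    """
    opt_out_locations = Counter()
    for turns in sessions:
        for turn in turns:
            if turn.get("opted_out"):
                opt_out_locations[turn["node"]] += 1
                break  # count only the first opt-out in a session
    return opt_out_locations

# Toy example: one user bails out while being asked for an account number
sessions = [
    [{"node": "greeting"}, {"node": "collect_account", "opted_out": True}],
    [{"node": "greeting"}, {"node": "collect_account"}, {"node": "confirm"}],
]
for node, count in count_opt_outs_by_node(sessions).most_common():
    print(f"{node}: {count} opt-out(s)")

Dialogue nodes that accumulate a disproportionate share of opt-outs are the first places to investigate with the techniques covered in the rest of this chapter.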

11.1 What drives opt-out behavior?

11.1.1 Immediate opt-out drivers

11.1.2 Motivations for later opt-outs

11.1.3 Gathering data on opt-out behavior

11.2 Reducing immediate opt-outs

11.2.1 Start with a great experience: Greetings and introductions

11.2.2 Convey capabilities and set expectations

11.2.3 Incentivize self-service

11.2.4 Allow the user to opt in

11.3 Reducing other opt-outs

11.3.1 Try hard to understand

11.3.2 Try hard to be understood

11.3.3 Be flexible and accommodating

11.3.4 Convey progress

11.3.5 Anticipate additional user needs

11.3.6 Don’t be rude

11.4 Opt-out retention

11.4.1 Start right away by collecting opt-out data

11.4.2 Implementing an opt-out retention flow

11.5 Improving dialogue with generative AI