Performance measurement is an essential element of any visual IVR strategy. Brand CX leaders need to measure visual IVR efforts to understand whether people are trying the experiences, are satisfied with the results, and continue to use digital self-service experiences over time. Consider these six sets of measures to evaluate visual IVR micro-apps.
Accessibility Testing
Before you launch a visual IVR experience, take the time to conduct accessibility testing to ensure that the experience is usable by the broadest possible range of customers. Accessibility focuses on how well an app can be used by people with disabilities, such as vision or hearing impairments and other physical or cognitive conditions. Accessibility testing is fast, inexpensive, and demonstrates your commitment to delivering a great customer experience to ALL customers.
Accessibility reviews everything from fonts and font sizes to the ease of entering data into forms and operating action buttons. The results often surprise app makers. Often, small changes in the experience can reap enormous benefits for accessibility.
Adoption Rate/Percentage of Callers Who Choose Visual IVR
Since visual IVR is, first and foremost, a call diversion strategy, understanding the percentage of people who choose this self-service approach is an important consideration. You identify your diversion or deflection rate by tracking the number of people who select visual IVR for a given service and dividing that figure by the total number of people who contacted you about the same issue.
The total number of contacts for a specific issue is easy to determine: take the number of calls the contact center records as related to that issue and add the number of people diverted to visual IVR.
The number of calls diverted should increase as more people become accustomed to your visual IVR solutions. In our experience, a typical diversion rate for the first year of a visual IVR deployment is between 30% and 40%. Our clients report that the rate usually climbs steadily throughout the following two years, peaking at 80% to 90% by year three.
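The calculation described above can be sketched in a few lines of code. The function name and the monthly figures below are hypothetical, purely for illustration:

```python
def diversion_rate(visual_ivr_sessions: int, agent_calls_same_issue: int) -> float:
    """Share of contacts for an issue that chose visual IVR self-service.

    Total demand = agent-handled calls for the issue + diverted sessions.
    """
    total_contacts = agent_calls_same_issue + visual_ivr_sessions
    return visual_ivr_sessions / total_contacts

# Hypothetical month: 3,500 visual IVR sessions, 6,500 agent-handled calls
rate = diversion_rate(3_500, 6_500)
print(f"Diversion rate: {rate:.0%}")  # → Diversion rate: 35%
```

Counting agent-handled calls and diverted sessions separately, then summing them for the denominator, avoids double-counting customers who appear in both channels for different issues.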
Completion Rate
Our goal with visual IVR is to drive a high completion rate: the percentage of people who start a digital experience and complete the process. This figure shows how readily your customers can get the answers and information they need to resolve their issues.
A 100% completion rate would be an outstanding result, but it is probably unrealistic. Some people may struggle with any digital self-service experience. Further, simple experiences like checking a balance will likely show higher completion rates than more complex ones.
Whatever your rate, you should conduct ongoing testing to identify experience optimizations that increase your completion rate. A/B testing methodologies work well here.
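Completion rate and an A/B comparison can be sketched as follows. The variant description and all session counts are hypothetical, invented for illustration:

```python
def completion_rate(started: int, completed: int) -> float:
    """Percentage of users who started the flow and finished it."""
    return completed / started

# Hypothetical A/B test: the variant simplifies one form screen
control = completion_rate(started=2_000, completed=1_240)
variant = completion_rate(started=2_000, completed=1_420)
print(f"Control: {control:.0%}, Variant: {variant:.0%}")
# → Control: 62%, Variant: 71%
```

In practice, you would also run a significance test on the two proportions before declaring a winner, and segment the results by experience type, since simple flows naturally complete at higher rates.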
Experience Satisfaction Measurement
It can be valuable to ask customers about their experience just after they use a visual IVR app. Ask users to rate the experience and provide comments if they wish. Helpful measures include overall satisfaction and ratings on ease of use, technical performance, and the extent to which their issue was resolved.
These satisfaction insights can help you gauge your success in creating a positive customer interaction. As you formulate A/B tests to improve completion rates, ensure that satisfaction measurement is part of your testing method.
Overall Customer Satisfaction Scores
CSAT and NPS scores measure overall satisfaction with a company or brand. While visual IVR experiences are only one element of the overall customer experience, our data show that positive visual IVR experiences correlate with CSAT and NPS increases.
Our goal with any customer experience should be to enhance overall satisfaction with the company or service. By comparing overall CSAT scores from visual IVR users to those for people who have live agent interactions, we can understand whether the digital self-service apps contribute to a better brand experience. At a minimum, scores should be at parity.
User Flow Analysis
Understanding user flow within a visual IVR app helps us identify what’s working and what isn’t in an experience. When a significant number of customers seem to be getting stuck on a step or screen, that is an excellent indication that we should take steps to adjust that part of the experience for better results.
App development platforms like FICX sometimes offer native measurement and analytics that pinpoint bottlenecks in the customer experience that warrant attention. In examining data for our client base, we see that many bottlenecks are caused by:
- Confusing terminology that is unfamiliar to customers. For example, the term “first notice of loss” is commonly used in the insurance industry, but many customers may not understand that it is the step in which they initially file an insurance claim. Adjusting the terminology could eliminate the issue.
- Requests for information that customers cannot readily access. For example, some of our clients used to use multiple methods to verify identity in an app, like requiring both a password and a PIN. They soon discovered that virtually no customers remembered their PIN because it is rarely used in any other type of interaction. By replacing the PIN request with a one-time code texted to the customer's phone, they delivered the required security without driving so much app abandonment.
- UX that fails to clarify how customers should interact with the app. We’ve all experienced poor user experience design. For one of our clients, putting action buttons “below the fold” led to high abandonment rates. Dividing the requested information across two screens resolved the issue.
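The bottleneck analysis described above amounts to comparing session counts step by step and flagging the largest drop-off. A minimal sketch, using hypothetical step names and counts (not actual FICX analytics output):

```python
# Hypothetical per-step session counts exported from an analytics tool
funnel = [
    ("Welcome",         5_000),
    ("Verify identity", 4_600),
    ("Enter details",   2_300),  # large drop-off: likely bottleneck
    ("Confirm",         2_100),
]

def biggest_dropoff(steps):
    """Return the step losing the largest share of users vs. the prior step."""
    worst_step, worst_loss = None, 0.0
    for (_, prev_n), (name, n) in zip(steps, steps[1:]):
        loss = (prev_n - n) / prev_n
        if loss > worst_loss:
            worst_step, worst_loss = name, loss
    return worst_step, worst_loss

step, loss = biggest_dropoff(funnel)
print(f"Largest drop-off at '{step}': {loss:.0%} of users lost")
# → Largest drop-off at 'Enter details': 50% of users lost
```

Once the worst step is identified, the causes listed above (confusing terminology, inaccessible information, unclear UX) are the usual suspects to investigate first.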
Learn More about Visual IVR
If you want more information on visual IVR and how best to deploy it as part of your customer sales and support efforts, visit our homepage, request a FICX demo, or explore some of the other content we offer on this topic in the links below.