By Brian Handrigan on Friday, 29 May 2015
Category: Contact Centre

Be Confident Your Contact Center Technology Delivers The Brand You Promise

Today, most B2C interactions involve some form of contact center technology. With the exception of in-store purchases, contact center technology is responsible for providing the vast majority of brand impressions that customers have through experiences with toll-free numbers, self-service IVR, and CTI screen pop, and that's just the voice channel. There are also self-service websites, apps, social media scrapers, blended queuing processes, and more.

SERVICE DELIVERED VS. SERVICE INTENDED

Voice of the Customer (VOC) programs ask customers for feedback on the experience as they remember it, which is extremely important when determining whether the experience delivered was pleasing, useful, efficient, or memorable. But it's also critical to monitor the experience delivered by the organization's technology and compare it to what was intended. Customers don't know whether the technology they're interacting with is actually doing what it's intended to do; they just know whether it's available when they want to use it and lets them get what they need quickly and efficiently.

CUSTOMER VS. TECHNOLOGY PERSPECTIVE

Decision making and tuning frequently rely on the data that's easiest to collect regularly: the data gathered inside the contact center about congestion, CPU consumption, call arrival rates, and so on. Accurate, unemotional, precise data about the experience actually delivered has to come from the outside in, the way customers really access and interact with the technology, and it has to be collected in a controlled fashion on a regular basis.

IS THERE A BETTER, MORE STRATEGIC WAY TO DO THIS?

There are ways to gain a true assessment of the customer service experience as delivered. It starts with documented expectations for the experience as it’s supposed to be delivered. Defined expectations establish functionality, performance, and availability specs as benchmarks for testing.
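To make that tangible, here's a minimal sketch, in Python, of what documented expectations might look like once they're turned into benchmarks. The field names and numbers are illustrative assumptions, not a standard; the point is that every entry becomes something a test can pass or fail.

```python
# Minimal sketch of turning documented expectations into testable benchmarks.
# The field names and numbers are illustrative assumptions, not a standard.

EXPERIENCE_SPEC = {
    "availability": {"hours": "24x7", "max_failed_test_calls_per_day": 0},
    "performance": {"answer_within_seconds": 2.0,
                    "first_prompt_within_seconds": 3.0},
    "functionality": {  # every advertised path the customer can take
        "balance_inquiry": ["welcome", "account number", "your balance is"],
        "agent_transfer": ["welcome", "please hold", "connecting you"],
    },
}

if __name__ == "__main__":
    # Each entry becomes a pass/fail benchmark for the tests described below.
    for area, spec in EXPERIENCE_SPEC.items():
        print(f"{area}: {spec}")
```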

It’s in every company’s best interest to know that the technology it put in place is, first of all, capable of making all those connections it has to make, and then actually delivering the customer service experience that it intended to deliver. To do that, you need reliable data gathered from automated, outside-in, scripted test interactions that can be used to assess the functionality that’s been put in place, as well as the technology’s ability to deliver both at peak load and then continuously once in production.

Think of automated testing as using an army of secret shoppers to access and exercise contact center technology exactly as it’s intended to be used: one at a time, then hundreds, thousands, tens of thousands of virtual customer secret shoppers. Think also about the technology lifecycle: development – cutover – production – evolution.

Start with automated feature/function testing of your self-service applications, voice and web, to ensure that what was specified actually got developed. Precisely verify every twist and turn to ensure you are delivering what you intended. Do that before you go live and before you do unit testing or load testing. Using an automated, scripted process to test your self-service applications gives you reliable discrepancy documentation: recordings and transcripts that clearly capture functionality issues as they are identified.
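Here's a minimal sketch of what one scripted test case might look like. The ivr_respond() stub and the prompt text are assumptions standing in for a real outside-in test harness that places calls through the PSTN; the point is that every expected prompt is encoded in the script, and every mismatch lands in the transcript as documented evidence.

```python
# Minimal sketch of a scripted feature/function test for a self-service IVR.
# ivr_respond() is a stand-in for a real harness that places a call and
# captures each prompt; its name and canned responses are assumptions.

from dataclasses import dataclass

@dataclass
class Step:
    dtmf: str            # digits the virtual caller presses
    expect_phrase: str   # phrase the spec says the next prompt must contain

def ivr_respond(dtmf: str) -> str:
    """Stand-in for the live IVR: returns the prompt heard after sending dtmf."""
    canned = {
        "": "welcome to example bank, press 1 for balances",
        "1": "please enter your account number",
        "123456#": "your balance is 200 dollars",
    }
    return canned.get(dtmf, "sorry, i didn't get that")

def run_case(name: str, steps: list[Step]) -> None:
    transcript = []                      # doubles as discrepancy documentation
    for step in steps:
        prompt = ivr_respond(step.dtmf)
        ok = step.expect_phrase in prompt
        transcript.append((step.dtmf, prompt, "OK" if ok else "DISCREPANCY"))
    for dtmf, prompt, status in transcript:
        print(f"{name}: sent={dtmf!r:10} heard={prompt!r} -> {status}")

if __name__ == "__main__":
    run_case("balance_inquiry", [
        Step("", "welcome"),
        Step("1", "account number"),
        Step("123456#", "your balance is"),
    ])
```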

Next, conduct load testing prior to cutover. Use automated, scripted, virtual-customer test traffic from the outside in, through the PSTN and the web, to ensure your contact center technology really can perform at the full capacity you've designed and at realistic call arrival and disconnect rates. Plan for failure in the process; it's never one and done. Leave enough time to start small and to identify and address issues along the way.
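As a rough illustration, here's how a ramp plan might be laid out, starting small and stepping up toward the designed peak. The numbers are illustrative assumptions, not recommendations; the arrival rate at each step follows from the target concurrency and the average call length (Little's law).

```python
# Minimal sketch of planning an outside-in load test ramp: start small and
# step up toward the designed peak, with a realistic call arrival rate at
# each step. All numbers are illustrative assumptions.

def ramp_plan(peak_concurrent_calls: int, steps: int, hold_minutes: int,
              avg_call_seconds: int):
    """Yield (target concurrent calls, arrival rate in calls/sec, hold time)."""
    for i in range(1, steps + 1):
        target = round(peak_concurrent_calls * i / steps)
        # Little's law: concurrency = arrival rate * average call duration,
        # so the arrival rate needed to sustain the target is:
        arrival_rate = target / avg_call_seconds
        yield target, arrival_rate, hold_minutes

if __name__ == "__main__":
    for target, rate, hold in ramp_plan(peak_concurrent_calls=10_000,
                                        steps=5, hold_minutes=15,
                                        avg_call_seconds=180):
        print(f"hold {target:>6} concurrent calls for {hold} min "
              f"(~{rate:.1f} calls/sec arrival rate)")
```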

Once in production, continuously access and exercise the contact center technology just as real customers do, through the PSTN and through the web, to ensure it's available, assess its functionality, and measure its performance, so you can be confident the technology is delivering the intended experience 24x7.
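In rough outline, continuous monitoring can be as simple as placing one synthetic interaction on a fixed interval and checking availability and response time against the spec. place_synthetic_call() and the thresholds below are assumptions for the sketch, not a vendor API.

```python
# Minimal sketch of continuous, outside-in monitoring: on a fixed interval,
# place one synthetic interaction, check availability and response time, and
# record the result. place_synthetic_call() is a hypothetical stand-in for a
# real test-call mechanism (PSTN or web); the thresholds are assumptions.

import random
import time
from datetime import datetime, timezone

RESPONSE_BUDGET_SECONDS = 3.0   # assumed performance spec from the design docs

def place_synthetic_call() -> tuple[bool, float]:
    """Stand-in: returns (answered?, seconds until the first prompt)."""
    return random.random() > 0.02, random.uniform(0.5, 4.0)

def monitor_once() -> dict:
    answered, first_prompt_s = place_synthetic_call()
    status = "OK"
    if not answered:
        status = "UNAVAILABLE"
    elif first_prompt_s > RESPONSE_BUDGET_SECONDS:
        status = "SLOW"
    return {"ts": datetime.now(timezone.utc).isoformat(),
            "answered": answered,
            "first_prompt_s": round(first_prompt_s, 2),
            "status": status}

if __name__ == "__main__":
    for _ in range(3):                 # in production this would run 24x7
        print(monitor_once())
        time.sleep(1)                  # e.g. every few minutes in real life
```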

If you have self-service applications that undergo periodic tweaks or enhancements, automated regression tests using those initial test cases will ensure all functionality is still on task after changes to the software or underlying infrastructure.
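A minimal sketch of that regression pass, assuming the baseline transcripts from the initial test cases were saved to disk; the file names and layout are hypothetical.

```python
# Minimal sketch of regression testing after a tweak: re-run the original
# scripted cases and diff the new transcripts against the baselines captured
# when the application first passed. File layout is an assumption.

import json
from pathlib import Path

def regression_check(baseline_dir: Path, current_dir: Path) -> list[str]:
    """Return the names of cases whose transcript no longer matches baseline."""
    failures = []
    for baseline_file in sorted(baseline_dir.glob("*.json")):
        current_file = current_dir / baseline_file.name
        baseline = json.loads(baseline_file.read_text())
        current = (json.loads(current_file.read_text())
                   if current_file.exists() else None)
        if current != baseline:
            failures.append(baseline_file.stem)
    return failures

if __name__ == "__main__":
    broken = regression_check(Path("baselines"), Path("latest_run"))
    print("all cases still on task" if not broken
          else f"regressions in: {', '.join(broken)}")
```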

WHAT’S THE NET RESULT OF TESTING THIS WAY?

VOC feedback tells you what customers think about your efforts and how they feel about dealing with your brand and technology. Automated testing and monitoring allow you to determine if the interfaces you’ve provided are, in fact, providing the easy-to-use, low-effort experience you intend.

It’s pretty straightforward—quality and efficiency of experience drive loyalty, and loyalty drives spend. You HAVE to have both perspectives as you tweak the technology, and that’s the message. Listen to your customers as you decide what you want your technology to do, and then make sure your technology is doing what you intend it to do.

Originally Published in Speech Technology Magazine.
