It’s a rainy Monday morning. You have the day off, but sorting out some important elements of your financial life will take a chunk of it. The school run was a Battle Royale and, frankly, you’re just not in the best mood…

So you call your bank’s contact centre, ready to make your enquiry. What you really need this morning is a friendly, empathetic customer service agent who will do everything in their power to make your day a little better.

But your bad day continues. You make your IVR selections and a long wait ensues. The polite recorded voice tells you, “We are busy dealing with other customers right now”… Eventually you connect with a real person, but the bubbly, happy individual you wanted to speak to hasn’t turned up to work today. Instead, an agent who’s fighting to hear her own voice over the call centre hubbub offers up a series of functional questions and answers without pleasantries. At one point you’re put on hold, and the grainy music reminds you of those LPs in the loft you keep meaning to sell at a car boot sale.

Your enquiries are dealt with readily enough and, after a cursory goodbye, the call ends. However, throughout the experience you felt the service was indifferent and that the agent did not engage with you as a customer.

What you don’t know is that, after your call ended, the agent had every facility available to invite you to give feedback on her company’s service.

Except she chose not to – and the same is true of agents in many organisations whose customer feedback survey is controlled by contact centre agents. Let’s face it – if a customer is in a hurry, is calling to complain or simply doesn’t seem to be in a very good mood, the agent is unlikely to transfer him or her to a voice survey, email a web survey invite or provide any other form of feedback mechanism.

The business will diligently follow the reporting packs and dashboards that show results from the customers who are invited, and only those customers. (These results will contain typical KPIs such as satisfaction, advocacy and NPS, plus detailed verbatim from customers who have the time to leave additional feedback.) Senior management will see these numbers, get excited about the strong results and report success widely, reacting quickly to any negative feedback.

BUT – if only shiny, happy people (apologies to R.E.M.) have been invited to take part in surveys, do the results really capture and represent the true Voice of the Customer (VoC)?

At Ipsos ViewsCast, we work with companies to understand the validity of customer feedback gathered through agent-invite methods. Of course, we see some strong success stories for organisations that use this method: acceptance rates of anywhere between 20% and 50%, with customers going on to complete surveys in 90% of cases. However, this does not stop doubts creeping in over time as to whether results have been biased by agent ‘cherry picking’ and are therefore producing false-positive KPI results.
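The effect of cherry picking is easy to demonstrate with a toy simulation. To be clear, this is illustrative only – the scores, thresholds and rates below are invented for the sketch, not Ipsos data. If agents invite only customers who seemed happy on the call, the sampled satisfaction score sits well above the true average, while a mood-blind IVR opt-in sample tracks the truth closely:

```python
import random

random.seed(42)  # reproducible toy data

# Illustrative only: 10,000 calls with a 'true' satisfaction score
# from 1 (very unhappy) to 5 (very happy), uniformly distributed.
calls = [random.randint(1, 5) for _ in range(10_000)]

# Agent-invite model: agents tend to invite only customers who seemed
# happy on the call (here, score >= 4) -- the cherry picking described
# in the text.
agent_invited = [s for s in calls if s >= 4]

# IVR opt-in model: a subset chosen independently of the caller's mood.
ivr_opted_in = [s for s in calls if random.random() < 0.10]

true_mean = sum(calls) / len(calls)
agent_mean = sum(agent_invited) / len(agent_invited)
ivr_mean = sum(ivr_opted_in) / len(ivr_opted_in)

print(f"true mean:         {true_mean:.2f}")
print(f"agent-invite mean: {agent_mean:.2f}")  # inflated well above truth
print(f"IVR opt-in mean:   {ivr_mean:.2f}")    # close to the true mean
```

The agent-invited sample reports a mean far above the population average, and no amount of dashboard polish downstream can correct for a sample selected on mood.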

There are some immediate solutions that can, at a stroke, remove agent bias and empower customers to feed back according to their own preferences.

Firstly, during the IVR stage, customers can be invited to participate in surveys before they speak to an agent, using an additional IVR selection. This ‘opt-in’ approach can be hidden from the agent, such that they can play no part in inviting the customer to continue on to a survey. Alternatively, we can treat customer participation as a default “yes”, unless they want to ‘opt out’, also using an up-front IVR selection.
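Both consent models boil down to a single flag set at the IVR stage, before an agent ever joins. As a sketch – the function name, keypress value and call-record field here are hypothetical assumptions, not a real telephony API:

```python
from typing import Optional

# Hypothetical sketch of the two IVR consent models described above;
# names and the keypress value are illustrative assumptions.

SURVEY_KEY = "1"  # assumed keypress for the survey choice


def survey_eligible(ivr_keypress: Optional[str], model: str) -> bool:
    """Decide, before an agent joins, whether this caller should be
    routed to the post-call survey.

    model = "opt_in":  survey only if the caller actively pressed
                       the survey key at the IVR stage.
    model = "opt_out": survey by default, unless the caller pressed
                       the key to decline.
    """
    if model == "opt_in":
        return ivr_keypress == SURVEY_KEY
    if model == "opt_out":
        return ivr_keypress != SURVEY_KEY
    raise ValueError("unknown consent model: %s" % model)


# The decision is stored on the call record, never surfaced to the agent:
call_record = {"survey": survey_eligible(None, "opt_out")}
```

Because the flag is computed before the agent connects and is hidden from their screen, the agent genuinely plays no part in deciding who gets invited.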

Such methodologies have resulted in typical acceptance rates of up to 10%, with completion rates of between 60% and 80% depending on the opt-in/opt-out method. These may appear lower than an agent-invite method, but they avoid bias and therefore represent customer feedback more accurately. Businesses adopting these solutions include well-known banks and insurance companies.

An even more adventurous approach is to call the customer straight back with an automated IVR survey after their experience with an agent has finished. This can be done to mobile phones or landlines, using the number the customer called in from or data from the company’s CRM system. Within seconds of the call ending, the customer’s handset is called back by an ‘Outbound Dialler’ offering a new survey – often saving companies and customers thousands of pounds each year in phone bills.

This solution sees an acceptance rate of between 10% and 20% and a completion rate of up to 90%. One Ipsos ViewsCast client using this solution (an international insurance company) has seen the following improvements in its VoC programme results:

  • +17.7 points – increase in NPS*
  • +11.2 points – increase in Promoters
  • -4.9 points – decrease in Passives
  • -6.5 points – decrease in Detractors

*Measured via an opt-in telephone keypad survey delivered by the Outbound Dialler method over a nine-month period in 2013. Total customers measured: 73,604.
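These figures are internally consistent: since NPS = % Promoters − % Detractors, the change in NPS should equal the change in Promoters minus the change in Detractors, and +11.2 − (−6.5) = +17.7 does exactly that. A quick check of the arithmetic:

```python
# NPS = % Promoters - % Detractors, so the change in NPS equals the
# change in Promoters minus the change in Detractors.
promoter_change = +11.2   # points
passive_change = -4.9
detractor_change = -6.5

nps_change = promoter_change - detractor_change
print(round(nps_change, 1))  # 17.7 -- matches the reported NPS increase

# The three shifts should sum to roughly zero, since every respondent
# is classed as a Promoter, Passive or Detractor; the small residue
# reflects rounding in the published figures.
print(round(promoter_change + passive_change + detractor_change, 1))
```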

Customer Opt-In/Opt-Out solutions (sometimes called ‘Stealth’ methodologies) and Outbound Dialler solutions are a growing part of the VoC mix. Adding deep-dive customer verbatim analytics, along with full enterprise reporting portals and customer case management solutions, offers a complete customer feedback management capability.

Customer experience managers within companies can then not only see real-time results and improvements online, but also sleep well at night knowing that the results they see are methodologically sound and capture the voices of ALL their customers who want to feed back – whether positive or negative.

Ben Phillips, Head of ViewsCast UK, Ipsos MORI

Ben is a customer services professional with over 10 years’ experience working in retail and customer research businesses. Following 6 years at Retail Eyes managing key mystery shopping and survey programmes and heading up a team of over 20 dedicated account managers, Ben now leads on IVR, SMS and Web survey programmes at Ipsos MORI in the UK.
