Service Desk Performance – What the numbers say versus what the business really thinks



28-Jun-2019 09:33:00

In my last piece, I talked about Problem Management and how it is often confused with Major Incident Management. In fact, while the two are linked, they are very different beasts. However strong your Problem Management, you must be ready for business-critical incidents.

When an organisation decides to outsource its Service Desk function, the decision is usually driven by the desire for improved efficiency, greater expertise and raised levels of service. After all, why struggle with trying to deliver a service yourself when you can draft in experts to handle it with a much greater degree of proficiency?

But once you have outsourced the service, you are then faced with a challenge: how do you gauge whether the provider you have hired is actually delivering on what they have agreed to do?

The starting point for this should always be the Service Level Agreement (SLA). A key part of the contract, this document sets out precisely the services the provider will deliver in addition to the minimum standards that provider is obliged to meet.

Performance metrics are an integral part of this. Some of these metrics are relatively easy to track. For instance, providing you have an effective and accurate ticketing system in place, it should be relatively easy to keep track of matters such as the average time it takes from instigation of a ticket until resolution of the issue. Likewise, you should also have clear visibility on basic service metrics such as the average time it takes for the Desk team to pick up a call when a user attempts to make contact.
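As a rough illustration of the kind of calculation a ticketing system does for you, here is a minimal sketch; the record format and timestamps are hypothetical, and a real platform would export far richer data:

```python
from datetime import datetime

# Hypothetical ticket records: (opened, resolved) timestamp pairs.
tickets = [
    (datetime(2019, 6, 3, 9, 0),  datetime(2019, 6, 3, 11, 30)),
    (datetime(2019, 6, 3, 10, 15), datetime(2019, 6, 4, 9, 45)),
    (datetime(2019, 6, 4, 8, 5),  datetime(2019, 6, 4, 8, 50)),
]

# Average time from instigation of a ticket to resolution, in hours.
durations = [(resolved - opened).total_seconds() / 3600
             for opened, resolved in tickets]
avg_resolution_hours = sum(durations) / len(durations)
print(f"Average resolution time: {avg_resolution_hours:.1f} hours")
```

The same shape of calculation applies to call pick-up times, first-time fix rates and similar basics: the hard part is not the arithmetic but making sure the underlying data is complete and accurate.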

That said, other elements of the service can be a lot harder to pin down – and these are often some of the most important aspects of that service. How do you begin to put a numerical value on customer satisfaction, for instance?

If something is difficult to measure, this doesn’t mean that it should be bypassed – otherwise you may be faced with a situation where the provider is meeting all of its obligations in theory but something fundamental is missing from the service in practice. These are the so-called ‘watermelon’ SLAs: green on the outside but red at the core. Here are some pointers for ensuring that you are keeping track of the right aspects of performance.

What to measure – and how to measure it

Define your vital KPIs. As a starting point, to prevent you from ‘drowning in data’ it is always important to focus on the metrics that really matter: in other words, those that directly address what you want the support provider to deliver for your company. With this in mind, take a look at our guide, Does your existing support provider deliver on these KPIs?  

Customer satisfaction is definitely something worth measuring as a distinct metric. In other words, don’t overlook it just because it takes a little more than a basic calculation to gauge (see below for customer satisfaction measurement tips).

Resist setting your ‘green light’ expectations too low. Let’s say the provider suggests that an appropriate service level for initial response is for 90% of calls to be answered within 10 seconds. At first glance, this may appear reasonable. However, if in reality your people want and expect immediate service, then perhaps the appropriate level would be 95% of calls answered within 8 seconds.

At the same time, factor in the operational impact of SLA metrics. It’s a balancing act here. For instance, taking the example of answer times once again, if your green light demanded 98% of calls to be responded to within three seconds, this may have a big impact on the number of people you need to staff the service – and on the associated costs.
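To make the trade-off concrete, the sketch below checks a set of candidate green-light definitions against some call data. The answer times and the targets themselves are purely illustrative:

```python
# Illustrative call answer times, in seconds (real figures would
# come from the telephony platform's reporting).
answer_times = [2, 4, 5, 3, 12, 6, 2, 9, 7, 3, 15, 4, 5, 8, 2, 6, 3, 10, 4, 5]

def compliance(times, threshold_seconds):
    """Percentage of calls answered within the threshold."""
    within = sum(1 for t in times if t <= threshold_seconds)
    return 100.0 * within / len(times)

# Candidate 'green light' definitions, as (target %, threshold seconds).
for pct_target, threshold in [(90, 10), (95, 8), (98, 3)]:
    actual = compliance(answer_times, threshold)
    status = "green" if actual >= pct_target else "red"
    print(f"{pct_target}% within {threshold}s: actual {actual:.0f}% -> {status}")
```

On this sample, the same team comfortably hits 90% within 10 seconds yet falls well short of 98% within 3 seconds – a gap that, in practice, only extra staffing (and cost) would close.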

Measuring customer service. It’s vital, but be careful how

If the majority of service measures are green but users are still demonstrating a degree of dissatisfaction, this is a sign that you need to achieve a better correlation between the numbers and the mood and perception of users.

There is no single magic formula for measuring customer satisfaction levels. However, one common way of gauging it is to ask users to rate how their ticket was handled at the point the incident is resolved. Here, you could ask them to address key areas such as the quality of communication, ease of using the service, speed of response – and of course, whether the matter was ultimately resolved to their satisfaction.

Broadly, users are likely to be ‘Happy’, ‘Not Happy’ or ‘Neutral’. Bear in mind that ‘Neutral’ and ‘Happy’ are two very different things: a high level of neutral responses may be a sign that particular aspects of the service still need to be fixed – and that the level of service you are getting is less than you bargained for. One way to counter fence-sitting is to give users an even number of response options, so there is no neutral middle ground to default to.
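A simple way to keep an eye on the neutral share is to break down the survey responses and flag when it climbs too high. In this sketch the responses and the 30% trigger are arbitrary examples, not a standard:

```python
from collections import Counter

# Illustrative survey responses collected at ticket resolution.
responses = ["Happy", "Neutral", "Happy", "Not Happy", "Neutral",
             "Neutral", "Happy", "Neutral", "Happy", "Neutral"]

counts = Counter(responses)
total = len(responses)
shares = {label: 100.0 * n / total for label, n in counts.items()}

# A high neutral share can mask problems behind 'green' numbers;
# the 30% trigger here is an arbitrary example threshold.
NEUTRAL_WARNING_PCT = 30.0
if shares.get("Neutral", 0.0) > NEUTRAL_WARNING_PCT:
    print(f"Warning: {shares['Neutral']:.0f}% neutral - investigate further")
```

Half-neutral results like these are exactly the ‘watermelon’ pattern: the headline satisfaction score looks fine, but a large bloc of users is withholding judgement.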

Choose your Service Provider carefully

In general, there is a movement towards XLAs (Experience Level Agreements) rather than traditional SLAs, although there is some debate as to what these actually are. Broadly, the approach favours weighting customer satisfaction metrics much more heavily than in the past.

However, ultimately empathy and flexibility are key drivers of great service – so it follows that this is what you should be looking for from any new service provider. Check for evidence of this – as it will, in all likelihood, help you avoid the situation whereby performance measures are ‘green’ but the level of service overall is not what you had in mind.



Download our FREE
End User Support e-book

If you share any level of responsibility for delivering high-quality IT to your organisation, our FREE e-book ‘Happy Users, Easy Life’ is for you.



Pete Canavan

Talk to us today about Business Advantage IT

If you’d be interested in discovering how Plan-Net could help give your organisation Business Advantage IT, get in touch.

Did you find this article useful?
Sign up to receive more from Plan-Net