As Head of Delivery at AMF, AlfaPeople's support and maintenance business within Microsoft Dynamics CRM and AX, I am often involved in integrating our services into our customers' existing, internally organised IT support. Through a series of blog posts, I will attempt to challenge the classic way of thinking about and building IT support. This lets me air some of my ideas, but it also functions as a shout-out to you out there, because I wish to hear your thoughts. The first subject is benchmarks.
I am a great fan of aligning expectations: it ensures that what we do is the same as what people expect us to do. I am also a great believer in facts and benchmarks, which eliminate "I think" and "I feel" situations when it comes to fulfilling expectations. However, this has been overdone in IT support.
When an internal support organisation talks to the business, its customers, the first things it is met with are aggressive demands concerning reaction and solution times, often backed by serious reprisals if they are not upheld. IT these days is so complex that, for these benchmarks to make sense, they must be supplemented by exceptions, subdivisions, reciprocal requirements on the business, an extension of the requirements to external suppliers, and so on. All of this is complicated to navigate and nearly impossible to communicate. The result is often that the relationship between business and support wears thin, simply due to a misalignment of expectations.
IT support's reaction is not merely a wealth of additional statistics and metrics that either support or counter the official benchmarks, but also the introduction of procedures designed solely to hit those benchmarks, rather than to create added value in the solution. Perhaps we should take a step back and look at the original metrics once again.
Many support organisations use ITIL's definitions within Service Operation as a guideline. These state explicitly that the main purpose is to find a balance between stability and flexibility, and between cost and quality: a balance that serves the company's best interests, not necessarily the best interests of the individual "customer" in the business. That does not sit well with a heavy focus on reaction and solution times for each individual task either.
Another thing to bear in mind is that internal IT support can these days be compared to an IT project with an unknown workload, many stakeholders, strong interdependencies, constantly shifting priorities, and, at the same time, a fixed price. It may therefore be slightly unfair to measure success purely on compliance with deadlines.
From my perspective, the fairest benchmark for a support organisation is the quality of its decisions, together with the efficiency with which those decisions are made and executed. To offer a more nuanced measure, I would probably draw on LEAN and its thinking about value and waste: for instance, categorising time spent as Red, Yellow, or Green depending on how much value it creates. The idea is that this can then be monitored regularly and used to drive continuous internal improvement, something both the business and the CIO can understand and respect. It would also change attitudes towards metrics into something positive.
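To make the Red/Yellow/Green idea concrete, here is a minimal sketch of how logged support time could be rolled up by value colour. The activity names, colour assignments, and thresholds are entirely hypothetical examples, not a prescribed categorisation; each organisation would define its own.

```python
from collections import Counter

# Hypothetical mapping of support activities to LEAN value colours.
# Green = value-adding, Yellow = necessary but not value-adding, Red = waste.
# These assignments are illustrative only.
ACTIVITY_COLOUR = {
    "solving incident": "Green",
    "improving documentation": "Green",
    "status reporting": "Yellow",
    "ticket triage": "Yellow",
    "rework after wrong fix": "Red",
    "waiting on third party": "Red",
}

def colour_summary(time_entries):
    """Summarise logged hours per colour as a percentage of total time.

    time_entries: list of (activity, hours) tuples.
    Returns a dict mapping colour -> percentage, rounded to one decimal.
    """
    totals = Counter()
    for activity, hours in time_entries:
        totals[ACTIVITY_COLOUR[activity]] += hours
    grand_total = sum(totals.values())
    return {colour: round(100 * hours / grand_total, 1)
            for colour, hours in totals.items()}

# Example week of logged time (hypothetical numbers).
entries = [
    ("solving incident", 20),
    ("ticket triage", 5),
    ("rework after wrong fix", 3),
    ("status reporting", 2),
]
print(colour_summary(entries))  # e.g. {'Green': 66.7, 'Yellow': 23.3, 'Red': 10.0}
```

Tracked month over month, a falling Red share and a rising Green share is a trend both the business and the CIO can read at a glance, without arguing about individual tickets.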
What do you think? Which benchmarks do you believe make sense within a complex internal IT support system?