How not to run shared services
DEC 21, 2008
In this season of giving, the most useful present we've received so far comes from the UK's Department for Transport (DfT), a government ministry that's inadvertently provided the rest of the world with a blueprint for getting a shared services project utterly wrong.
DfT was accused of "stupendous incompetence" in a report published last week by the House of Commons Public Accounts Committee, a government oversight body charged with examining value-for-money in public expenditure. Declaring the DfT initiative one of the worst cases it had ever seen, the Committee criticized pretty much every aspect of a project that had set out to improve HR, payroll and finance administration. Most damning of all, it argued that instead of the anticipated benefit of £57m ($85m), the project will eventually cost taxpayers £81m - in some cases, in return for poorer performance than they're getting now. Some investment.
Reading through the Committee's report, it appears that DfT started out badly and went downhill from there. Pretty much every component of the project - from planning and commissioning to project management and performance measurement - ran into problems.
To begin with, according to the report, the initial project timetable was overly aggressive, with two agencies expected to go live within just one year - apparently because the Department believed that there was "no advantage in planning for a longer detailed design and later start".
To hit its deadlines, the Department followed its consultants' advice to build on an existing IT system rather than going out to tender. This decision, said the Committee, triggered further problems, contributing to "poor specification of its requirements, the piecemeal placement of work and poor management of its suppliers." It's not hard to see why. If you've ever been through a tendering process, you'll appreciate that it can be tortuously long, but it has the invaluable upside of letting you discuss individual departments' requirements, build consensus around what you're trying to achieve, and get multiple suppliers' perspectives on where things could go wrong.
Next - and perhaps because it failed to consult widely enough - the report says the Department couldn't crack "the inherent tensions in securing the agreement of seven separate agencies to a single set of processes". In addition, individual agencies couldn't provide enough staff who understood their processes to work at the center. Given that process standardization is a key goal of any shared services initiative, this was something of a problem.
The aggressive timetable continued to dog the project through to the end. Running over schedule, DfT made the calamitous pre-launch decision to cut system testing from a recommended minimum of two months to two weeks. As a result, the system was unstable when it went live. Worse, the Department had abandoned its testing environment to save money, so once the first phases were up and running, all further testing and upgrades had to be carried out on the live system. That, inevitably, led to system crashes.
The list of problems goes on. DfT didn't set up a performance framework for 2007-08 until September 2007, and still only collects data on 14 of its 18 key performance indicators. What data is available isn't convincing: at the time of the Committee hearing it had achieved just four of the 18 KPIs, the cost per payslip processed in the center is double the industry average, and it processes roughly half the transactions per full-time equivalent (FTE) that other organisations achieve. Interestingly, DfT pointed out that it lacks the economies of scale the private sector enjoys and that government reporting requirements are more complex - which suggests it should weight its performance targets accordingly to make the metrics it uses more meaningful (see Webster Buchanan's Multi-country Payroll Scorecard).
It also failed to win over users, thanks in part to poor service levels and training. And as the Committee pointed out, DfT didn't think KPIs around customer data were as important as others, "despite the fact that the accuracy of these details is important to those using the system and in building their trust".
All of which provides numerous lessons for anyone else embarking on a similar project, chief among them: set a realistic timetable; don't skip the tendering process just to save time; secure genuine agreement on standardized processes from every participating agency before you build; staff the shared service center with people who understand those processes; never cut system testing to hit a go-live date, and keep a testing environment after launch; put a performance framework in place from day one; and invest in training and service quality to win users' trust.
© Webster Buchanan Research 2013