WRITTEN ON October 6th, 2008 BY William Heath AND STORED IN Data nitwittery, Foundation of Trust, Transformational Government, What do we want?

There’s a terrific article by SA Mathieson in Government Computing in response to the earlier piece by Matthew Taylor about government use of data.

He points to limitations and unexpected side-effects of government driven by data. He argues that it ignores the human costs of measures such as abolishing the common travel area with Ireland, or of endemic workplace surveillance, and he adds the dangers of new forms of discrimination, e.g. against those unable to provide biometrics or whose data is inaccurate.

These are all fair, balanced points from someone who has been accurately covering these stories for years now. Whaddyer’all think? Say it ain’t so, Joe (wink). Full text below.

The trouble with data by SA Mathieson

On a comparison of timetables, many of Britain’s train services are slower now than two decades ago. Part of the reason is that train companies are judged on punctuality, with fines if they miss targets, so they pad the timetables.

In September’s GC, Matthew Taylor, chief executive of the RSA, cogently made the case for increased government use of data. He sees it as a way for the government to govern more intelligently and to help those disadvantaged in society, such as in choice of school.

Gathering and analysing data is the dominant management technique of the age, standard practice for the management consultancies that have influenced, and got much business out of, this government. Yet even in the private sector, managing by data has its drawbacks. In the public sector, these problems are significantly magnified.

The first, as with train companies, is that when data is used as the yardstick, those being judged start ‘teaching to the test’. Sats tests, taken at 11 and 14 mainly to gather data on the performance of schools, have a financial and administrative cost. But the opportunity cost, of training and examining all children for tests with little point to them when they could have been learning, is surely greater.

This is the main problem with managing government by data: you have to gather a lot of data on the governed.

The human costs of this are often ignored by government. A recent Home Office regulatory impact assessment on imposing border controls between the UK, Ireland and the Crown dependencies – essentially gathering personal data – includes a possible impact on tourism of up to £12m. But it ignores the time spent by travellers queuing for border controls that previously did not exist.

That can be valued. The document mentions 15.6m journeys affected annually, so if it takes five minutes for queuing and interviewing, the measure will impose a time burden of 1.3m hours. Valuing people’s time as employers do, using the 2007 average hourly wage of £13.37, the cost is £17.4m.
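
As a rough illustration of how those figures are reached (the 15.6m journeys, the five-minute delay and the £13.37 hourly wage are the article’s own assumptions), here is a back-of-the-envelope sketch in Python:

    journeys_per_year = 15.6e6    # journeys affected annually, per the Home Office document
    delay_minutes = 5             # assumed time for queuing and interviewing
    hourly_wage_2007 = 13.37      # 2007 average hourly wage, in pounds

    hours_lost = journeys_per_year * delay_minutes / 60
    cost_pounds = hours_lost * hourly_wage_2007

    print(f"Time burden: {hours_lost / 1e6:.1f}m hours")  # ~1.3m hours
    print(f"Cost: £{cost_pounds / 1e6:.1f}m")             # ~£17.4m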

There are other kinds of impact. Some psychologists have found evidence that workplace surveillance – non-voluntary personal data gathering – leads to workers experiencing greater stress, decreased satisfaction with work and poorer relationships with other staff.

If countries were employers, then Britain might be a call centre: physically comfortable, but where many actions are logged and analysed, building up a record on which people are judged.

Whether this surveillance contributes to what the Conservatives see as a broken society is a matter of opinion – some feel protected by surveillance, some feel threatened – but a case can be made. At least surveillance, and other kinds of data gathering, hardly suggest a government that trusts the public and the professionals working for it to do the right thing of their own accord.

Then there is the government’s miserable record in protecting the personal data it gathers. It is not enough to point out, truthfully, that companies lose information too. Customers can abandon a company that fails to protect their data – which helps companies to take information security seriously. Government normally has a monopoly, and tends to demand data rather than request it.

One of Taylor’s justifications for government data gathering is to give everyone decent chances in life, by opening what was known to the well-off through their connections to all. It is a noble aim, but government use of personal data is becoming the basis for a new kind of discrimination.

In the case of biometrics, seen by the government as so important in tying a person to his or her data, a Home Office expert advisory group warned in June that the 4m people over 75 tend to have problems providing good quality fingerprints. Furthermore, fingerprint systems can make wrong matches – biometrics are not infallible.

More generally, data discrimination can hit two groups. There are those who do something wrong then suffer disproportionately. With the greatly increased use of criminal record checks in employment, a trivial offence long ago can narrow someone’s chances in life years after they have supposedly paid their debt to society.

Then there are those who suffer because their data is wrong, whether through error or fraud, and a greater reliance on data makes its fraudulent use much more attractive. The time taken sorting out the mess tends to be spent by the innocent data subject, not the organisation which fails to keep records properly.

It would be daft to say that government should stop using data. But the current government has tended to treat its gathering and analysis as a panacea. It is not.

4 Responses to ““The trouble with data” – SA Mathieson”

Shane McCracken wrote on October 6th, 2008 3:01 pm :

The border controls between Northern Ireland and the Republic are worse than the queuing time you mention. The intangible and unmeasurable damage done by putting up such divides far outweighs the costs you’ve mentioned.

Ideal Gov administrator wrote on October 6th, 2008 3:08 pm :

Fred Perkins writes:

S A Mathieson is absolutely right – but he’s not the first to make the point. There is a myriad of examples.

My own favourite relates to roads. There are targets galore impacting those with influence on traffic flows, but they all ignore the real cost of delays to road users. Because, of course, no Department “owns” wasted driver time.

So, we have police closing vital routes for up to a day while they log details of a fatal accident; we have LAs “co-ordinating” roadworks, which seem to have the effect of leaving streets dug up for months, with no work being done, and nobody responsible; we have £millions spent on ‘information’ displays on motorways that are not kept current; we have cycle lanes put in that provide negative benefits to both cyclists and motorists... All of these grand and much-touted initiatives are driven statistically, but with no ‘closing of the loop’. We all end up worse off.

The law of unintended consequences seems not merely to be ignored: often the consequences almost seem to be INTENDED, though not that anyone would want to publicise the negative consequences of the latest data-driven initiative.

We have a policy equivalent of stealth taxes – government by stealth statistical manipulation.

Philip Virgo wrote on October 7th, 2008 2:11 am :

We need to remember that data degrades rapidly if not regularly accessed, used and corrected by those who have a vested interest in its accuracy.

A further problem with government data is that so much is collected from people who have no interest in its accuracy in the first place.

Most is therefore commercially worthless.

I leave it to others to consider whether this increases or decreases the risk of abuse.

It certainly raises the question of liability for inaccuracy when data is claimed to have a spurious “authority”.

Ian Brown wrote on October 7th, 2008 3:14 am :

My experience of Matthew Taylor at RSA meetings is unfortunately that he is part of the Blair/Brown data-mania problem, rather than the solution. It is clearer in hindsight that Blair was not solely responsible for the New Labour surveillance obsession. That makes me think twice when my RSA annual subscription demand arrives in the post. Good work Steve Mathieson.