Robots in Public Service: Automating Local Authorities

This week I hosted a round table event for executives from local authorities, alongside my client Freeths Solicitors and the HR & finance recruitment consultants Seymour John. The topic was automation in the public sector, and to kick things off I gave a short talk.

Here’s what I said.

I’d like to kick off the debate with a little provocation. I’ve worked with a few councils and public sector organisations over the last five years, and the attitude to digital transformation in all of them has been largely similar. Outside of the few cheerleaders for technology, it is usually seen as a necessary evil. It’s accepted that the ideal solution would be maintaining humans in their roles. Digital services are largely a low-cost — and lower quality — alternative.

I have some sympathy with this argument, for a number of reasons. For a start, the drive to digital has been an explicit response to the extraordinary cuts you’ve faced. It has naturally felt like compromise.

Worse, the compromise has often been very real: the results of digital transformation projects haven’t always lived up to expectations or been a suitable replacement for what went before. They have been implemented at a direct cost to service, but also at a less-visible cost to flexibility. There’s very little ‘give’ in a digital system when the computer says ‘no’.

A fresh perspective

What I want to propose this morning is that we have to rethink that attitude and look afresh at the next generation of technology heading into the public sector. Because I think we’re approaching a point where technology, and particularly various types of robotics, could actually help us to deliver services better than humans ever could.

To back this up, let me give you three examples, one from the contact centre, one from the back office, and one from a care context.

The customer support robot

The first example is Amelia, the artificial intelligence that I was loosely involved with helping Enfield Council to procure last year. I’m not involved with the roll-out of this technology, so with the caveat that I can talk only about the promise rather than the practicalities, let me show you this video, then I’ll talk a little about what I have seen of Amelia.

Amelia is a learning system. What that means is that it can digest information from natural language and then create answers from what it learns. It is not like a chatbot where you have to define every step in a particular workflow. It understands the meaning of questions and finds the answers from its base of knowledge.

In a demonstration that I hosted, we used the iPhone manual as the source of that knowledge. Amelia starts knowing nothing, just an avatar on a screen with a blank text box. But two minutes after copying and pasting a page from the iPhone manual into that box, it confirms that information has been digested. From this point on it — or now ‘she’, the avatar — can answer natural language questions about any of the information.

Q: “Why isn’t my phone charging properly?”

A: “Common causes of battery issues are…”
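
The pattern at work here, ingest free text, then answer questions by matching them against what was ingested, can be sketched in a few lines. This is a deliberately toy illustration of the retrieval idea, not Amelia’s actual implementation; the class name and the word-overlap scoring are my own invention.

```python
import re

class ToyAssistant:
    """A toy question-answering system: ingest text, answer by retrieval."""

    def __init__(self):
        self.sentences = []

    def ingest(self, text):
        # Split pasted documentation into candidate answer sentences.
        self.sentences += [s.strip() for s in re.split(r"[.?!]\s+", text) if s.strip()]

    def answer(self, question):
        # Score each stored sentence by word overlap with the question
        # and return the best match. Real systems use far richer models.
        q_words = set(re.findall(r"\w+", question.lower()))
        return max(self.sentences,
                   key=lambda s: len(q_words & set(re.findall(r"\w+", s.lower()))))

bot = ToyAssistant()
bot.ingest("Common causes of a phone not charging properly are extreme "
           "temperatures and worn cables. To restart the phone, hold the side button.")
reply = bot.answer("Why isn't my phone charging properly?")
```

The point the sketch makes is the same one Amelia makes: nobody scripted the question-to-answer path in advance; the answer emerges from the ingested knowledge.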

She speaks, and she writes. And she answers 80% of queries first time. But what’s most confounding to expectations, and to anyone who has interacted with prior generations of voice systems, is that people prefer dealing with Amelia to dealing with other people.

Why? She’s quick and efficient. And sometimes, you’re dealing with things where a human on the other end isn’t an advantage. Rent arrears, for example.

The data processor

The second example is a much less anthropomorphic robot.

I’ve seen multiple situations in councils where human beings have been engaged to make connections between disparate systems and siloes in an organisation. I called my local council once to report that the cycle path I used each day was getting heavily overgrown, was an inch deep in slippery fallen leaves, and was increasingly interrupted by fly-tipping. This was apparently a ‘complex query’ so I was directed from the website to the call centre. There I had to spend over half an hour on the phone while the poor recipient of my call had to work her way through three different systems — and two different mapping interfaces — in order to report my three issues.

When I was working with Enfield we had a hypothesis that callouts to environmental health might be an early indicator of an adult social care problem. But we couldn’t test this easily because the data for the two services was in completely separate systems. And if we had wanted to act on it, without major investment it would have meant Janet from environmental health having a chat with Bob from adult social care, if she thought there was someone who might need checking on.

A robot doesn’t have this issue. Because a robot can natively speak data, it can communicate with two disparate systems and look for correlations, even without any formal integration work. In the most extreme cases it can even pretend to be a human with a screen, keyboard and mouse, if there’s no other way to access an ageing system. A robot could operate across three systems simultaneously, in real time. It could test hypotheses across disparate systems, and perhaps even start to come up with its own.
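
To make this concrete, here is a minimal sketch of that hypothesis test, written against two invented in-memory exports standing in for the separate environmental health and adult social care systems. Every name, field, and threshold here is hypothetical; a real deployment would be pulling from live systems, screens, or APIs.

```python
from collections import Counter

# Pretend exports from two separate, unintegrated line-of-business systems.
env_health_callouts = [
    {"address": "12 Elm Road", "issue": "pest infestation"},
    {"address": "7 Oak Close", "issue": "noise complaint"},
    {"address": "12 Elm Road", "issue": "waste accumulation"},
]
social_care_cases = [
    {"address": "12 Elm Road", "status": "no current case"},
    {"address": "3 Birch Way", "status": "open case"},
]

def flag_candidates(callouts, cases, threshold=2):
    # Count environmental-health callouts per address, then flag addresses
    # with repeated callouts but no open social-care case — the kind of
    # correlation Janet and Bob would otherwise have to spot by chance.
    counts = Counter(c["address"] for c in callouts)
    open_cases = {c["address"] for c in cases if c["status"] == "open case"}
    return [addr for addr, n in counts.items()
            if n >= threshold and addr not in open_cases]

flagged = flag_candidates(env_health_callouts, social_care_cases)  # → ['12 Elm Road']
```

The interesting part isn’t the dozen lines of logic; it’s that nothing in them required the two source systems to know about each other.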

The care robot

The third potential application for robotics is perhaps the most controversial: inside an adult social care setting. The idea of this, imported from Japan with its impending demographic crisis, horrifies a lot of people. Surely robots can’t care? They can’t show empathy?

My argument is that perhaps they don’t have to. I’ve been observing my children’s interactions with robots for the last few years, from home-built things that waddle around and make rude noises, to Amazon’s Alexa and most recently, Cozmo. What’s clear is that each of these devices has a very rich appeal based in part on its rounded personality. But that personality is not a product of the technology, however clever or otherwise it might be. It is projected onto these devices by my children.

The number of marriage proposals received by Alexa, around half a million when I last spoke to someone from Amazon about it, suggests that adults also anthropomorphise even faceless machines. Combine our imaginations with the capability to track our health with ruthless efficiency, and you potentially have an incredible tool for preventative medicine and light touch remote care.

Not better than humans, but better in single aspects

None of these things is a direct replacement for a human being. But we can also no longer afford to look at them as a poor compromise. Even if there were to be some radical political reversal in the near future — something that looks increasingly likely with every government own-goal — we shouldn’t abandon the opportunity that robots present. The opportunity to do things better than we alone ever could.

This post forms part of my Future Human series. For more posts on this subject, visit the Future Human page.

Tom Cheesewright