Humans, robots and rules: what the Covid-19 pandemic tells us about decision-making and rules

As we all set ourselves up for weeks, or even months, of self-isolation, never has there been a better time to think about rules, and the reasons we do the things that we do.

While the UK government has imposed new rules, telling us that we need to stay at home, these rules are only ever an approximation of the ideal rule, which in this case is that everyone needs to stay at home. The issue here is that while universal isolation is all well and good in theory, we still need health workers, and we still need to keep the electricity flowing and the water running.

Rules vs reality

All rules, and indeed all laws, only really come into effect when they are tested in the real world, and when someone is forced to decide whether or not a rule has been broken.

Take the classic example, ‘Thou shalt not kill’. While it may seem simple enough to apply this rule in the general sense, there are of course many exceptions, including times of war and the argument of self-defence. But of course, when we make a rule, we don’t list all of the exceptions when we share it – that would be far too complicated and we’d never get anything done.

This same problem can be seen in the UK government’s rules around staying at home. While the message has been distilled into ‘Stay at Home, Protect the NHS, Save Lives’, there is an unspoken assumption around what is deemed necessary and important in a time of global crisis.

But the government can’t make these decisions on a case-by-case basis – that would be impossible. As such, the decision is left to individuals and communities to make judgements based on the ‘spirit of the rules’ and the context around each individual case.

Robot rules, human judgements

What this goes to show is that rules can often be quite ‘perverse’. This is because they are only ever an approximation of an ideal rule (i.e. everyone stay at home), and they require a human individual to make a judgement based on an understanding of the rule, the spirit of the rule, and of the context in which the test case has been put.

But of course, things don’t always go to plan.

Take shopping, for example. As a consequence of the Covid-19 pandemic, many supermarkets have imposed limits on the number of items a customer can purchase in a single transaction. The aim is to limit panic buying and keep enough items on the shelves so that everyone has enough to eat.


However, this has led to some customers flouting the rules in various ways, including those who have taken their shopping to the car and then returned to do a second shop.

And yet the rules themselves often don’t work as intended. I was recently in my local branch of Sainsbury’s, filling my basket responsibly, as I do every week, when I was told I’d have to put some yoghurts back on the shelf. ‘You can only have three of these’, the shop assistant told me. I raised an eyebrow. ‘But multipacks are ok.’

Of course, I’m not complaining – the poor supermarket worker was only doing her job. But what this reveals is just how strange and perverse rules can really be.

According to the logic of the rules, I could have purchased three large multipacks of yoghurts, and that would have been fine. I could also have purchased three very large tubs of yoghurt, and that too would have been ok. However, to purchase four small yoghurts – together, less volume than a single large pot – is not allowed, even if they are specifically reduced and near the end of their use-by date.

Somehow, this doesn’t feel quite right. This is because the ‘rule’ (only buy three of something), isn’t easily applied to a situation where numbers alone don’t tell the whole story.

This is just the same as what happens when we use technology to help us make decisions: things are rarely ever black and white. In an article I wrote for The Conversation back in June, I discussed the issue of video technology in football, and how the technology forces (human) referees to adhere to rules in a robot-like fashion. What this exposes is not a problem with the technology per se, but rather a problem with the rules, and the fact that we have to judge exactly where and when a human player is offside.


In the case of my shopping example, the supermarket worker was clearly just following the rules (just as a robot would obey a piece of code). However, in applying the rule to the letter, she also exposed just how flawed many rules can be. This is why we need human intervention. Not because machines are inherently wrong, but because you can never take a general rule (i.e. a piece of machine code) and apply it to the infinite number of individual cases.

What’s needed then, in these times of crisis, is a little more humanity – a little ‘common sense’. However, as we’ve seen, some people lack even basic common sense, which suggests that maybe we do need some robot-like decision making after all. It’s a difficult balancing act. If we can’t be trusted collectively to act as responsible humans, then the government will be left with no choice but to treat us more and more like machines.
