Man and machine: understanding our roles

Computers are getting faster and more accessible, and algorithms are getting smarter. But history and our unique capabilities suggest that the human's place in the workplace will not be rendered obsolete. Mark Young explores.

We are fast approaching the point where supercomputers will be able to process information faster than the human brain. The current title-holder of the world's fastest computer – Cray Inc's Titan – has been verified as performing 17.8 quadrillion calculations per second, or 17.8 petaflops, to use the technical vernacular. The human brain, by comparison, is variously estimated to be able to process anything up to 40 petaflops.

So humans are still winning. But only just. In this age of exponential increases in computing power, we should now be preparing to hand the baton over to the machine. Indeed, China has announced that it will unveil a 100 petaflop supercomputer in 2015.

Not all of us have access to a supercomputer, of course, but for PLCs, large public bodies and major research outfits, access is increasing and entry costs are diminishing. At the recent 13th Cloud Circle Forum on May 1, Una Du Noyer, head of technology strategy at IBM UK & Ireland, revealed that the UK's fastest supercomputer – the company's Blue Joule – is available to rent 'as-a-service' over the cloud.

Of course, not all of us even need a supercomputer. For most of us – even many of those dealing with big data – a comparatively not-so-super cluster will suffice. On Amazon Web Services you can get 88 nodes, around 60GB of RAM, 3 terabytes of storage and the MapReduce functionality to split the data and pull the results together, all for the princely sum of $3.20 (£2.08) per hour.
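
For readers who haven't used it, the 'split the data and pull the results together' part is the essence of MapReduce. The sketch below is a minimal, illustrative word count in plain Python – it mimics the idea on one machine rather than using Amazon's actual service or APIs – with worker processes standing in for the cluster's nodes.

```python
from collections import defaultdict
from multiprocessing import Pool

# Toy documents standing in for data that would be spread across cluster nodes.
documents = [
    "big data needs big clusters",
    "clusters split big jobs",
    "jobs split data across nodes",
]

def map_words(doc):
    """Map step: emit (word, 1) pairs for one chunk of the data."""
    return [(word, 1) for word in doc.split()]

def reduce_counts(mapped_chunks):
    """Reduce step: pull the partial results together into totals."""
    totals = defaultdict(int)
    for chunk in mapped_chunks:
        for word, count in chunk:
            totals[word] += count
    return dict(totals)

if __name__ == "__main__":
    with Pool() as pool:                         # worker processes stand in for nodes
        mapped = pool.map(map_words, documents)  # each chunk is processed independently
    print(reduce_counts(mapped))                 # e.g. {'big': 3, 'data': 2, ...}
```

On a real cluster the same two steps simply run across many machines; the economics described above come from renting that fleet only for as long as the job takes.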

So there are access points to high performance computing, or at least to infinitely scalable processing power at low prices. But is that a replacement for the human brain? The vast majority would surely agree that it is not: rather, it is more accurate to say that high performance computing allows us as humans to achieve more.

But not everyone sees it that way. The fact that big data analytics can be used to create more advanced models for making predictions and spotting trends – on just about anything – is leading some to question the need for human involvement. And then there's cloud computing, which, as the latest tool in IT outsourcing, has long been touted as bringing about the end of the in-house IT department. Essentially, this is where companies have been offered a choice: redeploy your IT staff into value-add roles, or simply save on their wages. That is fundamentally the question in any man vs machine battle – cloud computing and big data analytics don't change the principle.

So how does the relationship between man and machine stand today and what does history tell us about the way it might develop?

Man and machine – division of labour

In a recent interview with Big Data Insight Group, William Beckler, director of innovation at lastminute.com, said there are two schools of thought circulating in digital developer communities. The first is that big data analytics can yield insights sturdy enough to be acted upon immediately. The second – and the one that Beckler subscribes to – is that you must experiment with the insights derived from your data analysis before you can be sure that they will translate to real-life environments.

The first school of thought assumes the computations will be correct, or at least within an acceptable margin of error. The second isn't so trusting. It worries that the inputs to our equations could be wrong or incomplete, and that any piece of analysis will always be prone, to a greater or lesser extent, to our own biases, however hard we attempt to eradicate or account for them. This means that humans are a necessary part of the process, interpreting the insights and experimenting with them, simply because humans established the systems in the first place. We got us into this mess and only we can get us out. The other reason human decision making is vital is that the marketplaces we hope to apply our insights to are dynamic – the butterfly effect applies – and they are constantly evolving, at an ever increasing pace. Our hypotheses are therefore transient at best, needing to be continually reassessed to ensure they remain valid.
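
To make the second school's 'experiment first' stance concrete, one common route is to trial an insight against a control group and check whether the observed uplift can be distinguished from noise. The sketch below is a minimal two-proportion z-test in plain Python; the visitor and conversion figures are invented purely for illustration, and a real experiment would also need sample-size planning and periodic re-checks as the market moves.

```python
from math import sqrt, erf

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Return the z statistic and two-sided p-value for the difference
    between two conversion rates (control A vs variant B)."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    standard_error = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / standard_error
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, via the normal CDF
    return z, p_value

# Invented figures: a control group versus a variant acting on a new insight.
z, p = two_proportion_z_test(conversions_a=200, visitors_a=5000,
                             conversions_b=245, visitors_b=5000)
print(f"z = {z:.2f}, p = {p:.3f}")  # a small p suggests the uplift is not just noise
```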

But perhaps machine learning changes that? Enabled – or certainly enhanced – by what we now call big data and advanced analytics, machine learning allows computer systems and models to improve their performance by monitoring usage. The spam filters built into email clients are one of the most prominent examples of this, and personalised web and mobile applications such as social media and news aggregators use similar systems. But the usage is spreading. Essentially, while previously only linear process tasks could be automated, processes which require the operator to adapt to their surrounding environment and conditions can now be handled by a machine too.
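
As a concrete illustration of the spam-filtering example, here is a minimal naive Bayes classifier built with scikit-learn's CountVectorizer and MultinomialNB. It is a toy with invented training data – not how any particular email client actually works – but it shows the 'learning by monitoring usage' loop: when users flag new messages, the model is simply refit with the extra examples.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Invented training messages, purely for demonstration.
emails = [
    "win a free prize now", "cheap loans click here",          # spam
    "meeting moved to thursday", "quarterly report attached",  # ham
]
labels = ["spam", "spam", "ham", "ham"]

# Bag-of-words features feeding a naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(emails, labels)

print(model.predict(["free prize inside"]))       # likely 'spam'
print(model.predict(["report for the meeting"]))  # likely 'ham'

# 'Learning from usage': when a user marks a new message, refit with the extra example.
emails.append("limited time offer click now")
labels.append("spam")
model.fit(emails, labels)
```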

Furthermore, in an agile development scenario, in which one of the key features is continuous improvement and constant iteration, machine learning could be hugely beneficial.

As computers develop, along with their ability to handle increasingly complex algorithms, some individuals will certainly be concerned for their jobs as a result of machine learning. Outside of the industrial context in which machine learning has been most prominent thus far, the job roles facing threats to human involvement on the horizon include the chauffeur, thanks to Google's and others' driverless car technology; the translator, through real-time computer-based services; and even the author, if Icon Group International's long-tail, algorithm-based model takes off.

Understanding strengths

In his well-received book The Signal and the Noise: The Art and Science of Prediction, Nate Silver succinctly describes why humans still have some advantages over computers.

“What is it, exactly, that humans can do better than computers that can crunch numbers at seventy-seven teraflops? They can see. … A visual inspection of a graphic showing the interaction between two variables is often a quicker and more reliable way to detect outliers in your data than a statistical test. It’s also one of those areas where computers lag well behind the human brain. Distort a series of letters just slightly – as with the CAPTCHA technology that is often used in spam or password protection – and very “smart” computers get very confused. They are too literal-minded, unable to recognize the pattern once it’s subjected to even the slightest degree of manipulation. Humans by contrast, out of purely evolutionary necessity, have very powerful visual cortexes. They rapidly parse through any distortions in the data in order to identify abstract qualities like pattern and organisation.”

New tools in notepad-style apps such as Evernote, which are able to recognise the words in photographs of handwriting – designed to add value to the apps' search functionality – would suggest that computers are getting better at recognising patterns, and that CAPTCHA technology might, in the not-too-distant future, be outwitted by the machine. But Silver's point is certainly salient. A computer can work tirelessly in performing its calculations, at a rate of quadrillions per second. Although the human brain is capable of more in terms of scope, the computer's power, much more so than the human's, can be easily programmed and channelled into one task. But humans are capable of reading between the lines – spotting patterns amongst the noise.
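
Silver's 'they can see' argument can be pictured with a small sketch. The Python snippet below uses randomly generated data with one outlier deliberately planted (the figures are illustrative only) and contrasts the two routes he mentions: a mechanical residual/z-score rule, and a scatter plot that a human can take in at a glance.

```python
import numpy as np
import matplotlib.pyplot as plt

# Two related variables with one outlier planted in the data.
rng = np.random.default_rng(0)
x = rng.normal(0, 1, 200)
y = 2 * x + rng.normal(0, 1, 200)
x[0], y[0] = 1.0, 15.0  # planted outlier: obvious to the eye on a scatter plot

# The 'statistical test' route: flag points whose residual from the fitted
# trend line is more than three standard deviations out.
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)
flagged = np.abs(residuals) > 3 * residuals.std()
print("points flagged by the z-score rule:", np.where(flagged)[0])

# The 'visual inspection' route: a human glances at this chart and spots it instantly.
plt.scatter(x, y, c=np.where(flagged, "red", "steelblue"))
plt.xlabel("variable 1")
plt.ylabel("variable 2")
plt.title("Outlier detection: eyeball vs z-score")
plt.savefig("outliers.png")
```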

Yet Silver is certainly a fan of computer modelling systems too. He has made a career out of political and sports forecasting systems. His assertion throughout much of his book is that the best results will only be achieved when man and machine work together, rather than as separate entities, and he has plenty of examples to back up this view. One example is baseball, where statistical analysis of rookie players works alongside the gut instincts of scouts; another is political polling, where the likely outcome of regional elections can be predicted more accurately when face-to-face interviews with the candidates are added into the models that weigh up sentiment and previous results. A third that Silver describes, lo and behold, is the weather – humans are able to take into account anomalies – signals within the noise and vice versa – that computers cannot. In a dynamic system like the weather, where small differences in time and space have huge effects on future weather occurrences, this is critical. You can be sure that the UK Met Office and the Natural Environment Research Council (NERC) have taken note.
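
One simple way to picture 'man and machine working together rather than as separate entities' is a blended forecast, in which a model's output and a human's adjustment are weighted and combined. The sketch below is an illustrative toy – the weighting scheme and the figures are invented, and it is not a description of how Silver's models or the Met Office actually combine the two – but it captures the shape of the idea.

```python
def blended_forecast(model_prob, human_prob, model_weight=0.7):
    """Blend a statistical model's forecast with a human expert's adjustment.

    model_weight reflects how much historical skill we credit the model with;
    in practice it would be calibrated against past forecasts rather than guessed.
    """
    return model_weight * model_prob + (1 - model_weight) * human_prob

# e.g. a rain forecast: the model says 60%, while the forecaster, seeing a
# local anomaly the model misses, says 80%.
print(f"blended probability of rain: {blended_forecast(0.60, 0.80):.0%}")  # 66%
```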

So essentially, as supported by the views of both William Beckler and Nate Silver, the task at hand for all of us now, in this age of access to high performance computing, is to reassess our understanding of human and machine strengths, concentrating on how and where one supports the other and allocating tasks accordingly.

Whether it's straightforward automation-based replacement of process jobs, which allows us as humans to seek ways of adding value to those jobs or to move into new industries altogether, or whether it's us working hand in hand with the machine – allowing the computer to perform complicated analysis while we focus on understanding the results and the opportunities for applying them – we should certainly focus on the ways that man and machine add value to one another.

Automation’s effects on the employment balance so far

And if historical models are anything to go by, there might not be too much to worry about either. Automation hasn't led to increased overall unemployment so far – in fact, employment has increased – and a higher percentage of people now perform skilled and managerial roles.

It is difficult to analyse the true effect of automation on employment, owing to the fact that there are so many other variables involved. For example, much of Britain's manufacturing industry – automation's HQ ever since it came to prominence in the 1950s – has either moved abroad or shifted its focus from low value/high yield to high value/low yield.

However, what we do know is:

• The percentage of all UK jobs that manufacturing accounts for has decreased fivefold over that period, from 40 per cent in 1952 to just 8 per cent in 2012.

• However, its share of Gross Domestic Product (GDP) – our country's collective output – has decreased only threefold, from 33 per cent in the early 1950s to 11 per cent today. In real money terms, its contribution was around £28bn in 1955, adjusted for inflation, compared with £40bn today.

So while talk of Britain's manufacturing industry mostly involves notions of its decline, it is in fact more valuable to our economy today than it was 60 years ago.

What we can additionally infer is that the manufacturing industry is now doing more, in economic terms, with less human input. Automation, we could reasonably assume, has been partly responsible for that, although, as mentioned, the fact that what we actually produce has shifted dramatically also needs to be taken into account, and has probably had a bigger influence.

We can further contextualise this data in terms of the overall employment figures in the UK, the breakdown of job roles and output per person:

• In the roughly six decades since automation began to take hold, the percentage of working-age people who are an active part of the UK workforce, across all industries, has risen from 60 per cent to 77 per cent. One major factor we should account for here is that the proportion of women in employment has more than doubled since the early 1950s (71 per cent in 2012, compared with 35 per cent in 1952). For men, the percentage is down slightly (88 per cent in 1952 to 83 per cent in 2012). However, for the economy overall, that is still a positive.

• GDP per person has increased four-fold, adjusted for inflation, since the early 1950s.

• Since 1981 – the earliest date that the Office for National Statistics' figures go back to for this particular measure – the percentage of people in management and professional/technical roles has increased from 25 per cent to 41 per cent.

So what does this all mean? Well, we can see that the economy has improved over the last 60 years, perhaps in spite of, but more likely as a result of, automation. More people are in work today, contributing more per person to the economy, and doing so in higher level jobs.

Essentially, machines are able to do some of the jobs that humans used to do far faster and with more precision. And when machines can carry out the process-based work, pro-automation proponents argue – and our statistics appear to bear out – man can and will be put to work on tasks that create, innovate and add value.

And even so, man is rarely rendered redundant in all processes across any operation. The modern car production line is certainly something to behold, with man and robot working in sync to build a car from a shell until it is pretty much ready for the showroom, in an hour or so. Here, the robots are able to engage autonomously with a project, performing the tasks set for them, while man continues to perform the elements he is best placed to. In fact, the majority of the machines' job is to provide the tools and materials to the human production workers at the right place and the right time.

History has shown us that automation has thus far helped the UK to grow as a knowledge-based economy in which we all add more value. Since the human still has USPs over the machine, big data analytics and cloud computing shouldn't change that. What we must hope, in this age of continued austerity following the downturn, is that businesses do not simply adopt an opportunist ideology by looking to save on the biggest expense to any business – its employees. The development of, and massively increased access to, high performance computing is a huge tool in our ability to add value to our businesses. But it will be of little use to us if we decide to halt that progress by diminishing the human elements.