Digital Taylorism and Dumb Humans

I’m your new manager.

Years ago, I heard Jaron Lanier give a lecture that included a brief summary of the Turing Test. Lanier suggested that there are two ways that machines might pass Turing’s test of artificial intelligence. On the one hand, machines could get smarter. On the other hand, humans could get dumber.

I wonder if humans-getting-dumber is where we’re headed with digital Taylorism.

Frederick Taylor, who died just over 100 years ago, was the father of scientific management or what we would now call industrial engineering. Working in various machine shops in Philadelphia in the late 19th century, Taylor studied the problems of both human and machine productivity. In Peter Drucker’s words, Taylor “was the first man in recorded history who deemed work deserving of systematic observation and study.” His followers included both Henry Ford and Vladimir Lenin.

The promise of the original Taylorism was increased productivity and lower unit costs. The gains resulted from fundamental changes in human work habits. Taylor-trained managers, for instance, broke complex tasks into much simpler sub-tasks that could more easily be taught, measured, and monitored. As a result, productivity rose dramatically but work was also dehumanized.

According to numerous commentators, we are seeing a resurgence of Taylorism in today's digital workplace. With digital tools and the Internet of Things, we can monitor individual workers more closely and precisely than ever. In some cases, we no longer need humans to manage humans: machines can apply scientific management to workers better than humans can.

Digital Taylorism has spawned an array of devices that measure work in ever-more-granular detail. Sociometric badges, for example, are “…wearable sensing devices designed to collect data on face-to-face communication and interaction in real time.” They could deliver “…a dramatic improvement in our understanding of human behavior at unprecedented levels of granularity.”

More recently, Amazon patented a wristband that can monitor a warehouse worker’s every movement. The wristband can track where a worker’s hands are in relation to warehouse bins to monitor productivity. It can also use haptic feedback (basically buzzes and vibrations) to alert workers when they make a mistake.
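To make the idea concrete, here is a minimal sketch, in Python, of the kind of monitoring logic such a system might implement. Everything in it is an illustrative assumption on my part (the class names, the distance threshold, the `buzz()` feedback call); it is not Amazon's actual design, just a picture of what machine-as-manager looks like in code.

```python
from dataclasses import dataclass
import math

@dataclass
class Point:
    """A tracked position in 3D space (hypothetical units: meters)."""
    x: float
    y: float
    z: float

def distance(a: Point, b: Point) -> float:
    """Straight-line distance between two tracked positions."""
    return math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)

class WristbandMonitor:
    """Illustrative sketch of digital-Taylorist monitoring logic.

    Assumptions (not from any real device spec): the wristband reports the
    worker's hand position, the system knows the assigned bin's position,
    and haptic feedback fires when the hand strays too far from that bin.
    """

    def __init__(self, tolerance_m: float = 0.3):
        self.tolerance_m = tolerance_m   # how far the hand may stray before feedback
        self.error_count = 0             # granular, per-worker "mistake" metric

    def check(self, hand: Point, target_bin: Point) -> None:
        """Compare the hand position against the assigned bin on every pick."""
        if distance(hand, target_bin) > self.tolerance_m:
            self.error_count += 1
            self.buzz()                  # haptic nudge: "wrong bin"

    def buzz(self) -> None:
        # Placeholder for the haptic actuator; here we just log.
        print("BZZT: hand outside target bin zone")

# Usage: one monitored pick
monitor = WristbandMonitor()
monitor.check(hand=Point(1.2, 0.4, 1.0), target_bin=Point(2.5, 0.4, 1.0))
print("errors logged:", monitor.error_count)
```

Note how little judgment is left to a human supervisor in this sketch: the threshold, the error tally, and the buzz are all decided by the system, which is precisely the point of the patent.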

Could digital Taylorism fulfill Lanier’s suggestion that machines will match human intelligence not because they get smarter but because humans get dumber? Could it, in other words, actually make us dumber?

It’s hard to say, but there is some evidence that we did indeed get dumber the last time we fundamentally altered our work habits. Roughly 10,000 years ago, human brains began shrinking. Before then, the average human brain was roughly 1,500 cubic centimeters. Since then, our brains have shrunk to about 1,350 cubic centimeters. As one observer points out, the amount of brain matter we’ve lost is roughly the size of a tennis ball.

What happened? A leading hypothesis suggests that our brains began shrinking when we transitioned from hunter-gatherer societies to agricultural societies. Hunter-gatherers live by their wits and need big brains. Farmers don’t. As our work changed, so did our brains.

Could digital Taylorism lead to a new wave of brain shrinkage? It’s possible. In a previous article, I asked what we should do when robots replace us. Perhaps the better question is: what should we do when robots manage us?
