Having worked and lived in both Silicon Valley, California, and the Silicon Valley of the East, Shenzhen, I thought I had a decent idea of the current state of affairs when it came to technology.
But when I was back home in Malaysia, I caught wind of a new buzzword that was doing the rounds. It sounded highfalutin enough to be intriguing, familiar enough to be memorable, and yet vague enough that it could be used indiscriminately, making it an especially sticky term.
It’s called Industry 4.0 or IR 4.0 for short.
It was so much in vogue that the Malaysian government even released a document in 2018 detailing the national policy for this new era we were supposedly entering.
Surprised, and slightly displeased, that I hadn’t heard of this newfangled concept before, I immediately turned to the great oracle – Google – to satiate my curiosity.
The basic narrative is that, on three occasions in the past, clusters of technological breakthroughs occurred roughly contemporaneously, upending the nature of our work and, by extension, our lives.
The argument goes that the Third Industrial Revolution has come to an end and we’re now embarking on the Fourth Industrial Revolution. We’re told that its flag bearers are a hodgepodge of new technologies, including 3D printing, artificial intelligence (AI), big data analytics, cloud computing, augmented reality, robotics and the internet of things (IoT).
But do these new technologies really deliver on their promise of bringing about the Fourth Industrial Revolution? Or is this whole narrative just a marketing gimmick? To find out, we need to evaluate the technologies above and compare them with those that spurred the three industrial revolutions in the past.
- 3D printing: As incredibly useful as it is, 3D printing is still largely relegated to prototyping and small-batch manufacturing – and for good reason. It hasn’t replaced mainstream manufacturing tools such as injection molding, die casting and CNC machining for large-scale manufacturing, as it is not as cost-effective, takes longer, and its products are often not as structurally sound.
- Artificial Intelligence (AI): In simple terms, AI is the imitation of human intelligence by a software programme. This has been a hugely successful endeavour, with many industries reaping its immense benefits. However, all current AI applications possess only narrow AI – AI that can learn on its own and perform well at one or several tasks that it’s specifically designed for. The field’s holy grail is the development of general AI – a software programme that can learn on its own and perform well at a multitude of tasks that it’s not specifically trained for. This would undoubtedly be revolutionary but unfortunately, it’s still decades away.
- Big data analytics: This has also been a rewarding field, with many industries being able to better collect and mine the vast amount of data at their disposal to (1) provide products and services that are better suited to the needs of their customers, (2) be able to target their customers better, and (3) increase process efficiency. There’s frequent cross-pollination between big data analytics and AI.
- Cloud computing: This is just a fancy term for storing our data in, and drawing computing power from, on-demand data centres instead of our local hard drives and processors.
- Augmented reality: Along with virtual reality, it’s making its mark in the gaming industry, but little else, at least for now. It’s still a work in progress which I only expect to become mainstream in the next decade or two.
- Robotics: Has really started to hit its stride in the past decade, with it now being integral to a host of important industries, including surgery, manufacturing, food service and space exploration.
- Internet of Things (IoT): Essentially embedding electronics into “dumb” objects to imbue them with machine smarts. The field is in its adolescence but has seen some stellar successes, especially in the smart home, wearables, and agriculture industries.
As impressive as these technologies are, I would argue that they don’t so much constitute a revolution as they do an evolution, albeit an important one. For something to be called a revolution, breakthrough technologies need to be invented and they should drastically change the way we live and work.
3D printing and augmented reality are novel, nascent fields that are still finding their footing and have yet to make a sizable impact, though I have no doubt they will in the future.
Cloud computing, artificial intelligence and big data analytics are more sophisticated software and internet-enabled tools that, as beneficial as they are, are not entirely novel. Robotics and the internet of things fall into this category as well, but with a hardware bent to them.
These technologies exhibit a change in degree, not a change in kind – and a change in kind is essential if they are to bring about a revolution.
For contrast, let’s look at the three industrial revolutions that came before.
The First Industrial Revolution saw the invention of the steam engine, which enabled the first mode of long-distance travel without animal power, and the invention of the telegraph, which enabled near-instantaneous communication for the first time in human history.
This was the most consequential time in human history since our shift from a hunter-gatherer society to an agrarian society thousands of years ago.
The Second Industrial Revolution saw the hugely important electrification of many parts of the world, the invention of the telephone, which was the first real mode of instant communication for the masses, and the invention of planes and cars, which entirely transformed the way we travel.
The Third Industrial Revolution was no slouch either, seeing the invention of computers and the internet – the two things that have most defined human life in the past 50 years. On the back of that, we’ve seen the invention and mass adoption of the now-ubiquitous smartphone.
This spurred a sea change in the world’s economies, creating entirely new industries to support our burgeoning technology sector.
So you can see that, as rapidly as today’s technology seems to progress, it has yet to be as impactful as the technologies that propelled the three industrial revolutions of the past.
I would, however, argue that social media and the sharing economy (which will be bolstered in the future by autonomous vehicles) border on being revolutionary but for some reason, proponents of IR 4.0 often leave them out of the conversation.
In terms of our work and life, beyond our being able to do things more efficiently and conveniently, and to stay more virtually connected, there hasn’t been a societal-level shift of much significance since the Third Industrial Revolution. A person who time traveled here from 50 years in the past would find almost everything about today’s world intimately familiar, except for the pocketable, addictive screen we constantly tote.
So, all in all, I wouldn’t go as far as to call IR 4.0 a gimmick, but those who propound it are certainly premature in their pronouncement. This is why no one I’ve encountered in the tech capitals of the US and China has spoken about the Fourth Industrial Revolution.
Because we’re not in one yet.