While the first industrial revolution was characterised by mechanisation, the second brought electrification and mass production, and the third gave rise to the world-changing power of computers and telecommunications. You’d be forgiven for not realising that a fourth was even taking place.
The fourth industrial revolution, or Industry 4.0 as it’s better known, refers to the digital transformation of the manufacturing sector. It encapsulates everything from artificial intelligence to the Internet of Things and promises to transform the industry as we know it, making factories smarter, production lines more efficient, and businesses generally more productive.
“Industry 4.0 promises to transform manufacturing as we know it, making factories smarter, production lines more efficient, and businesses generally more productive”
Unsurprisingly, appetite for this kind of digital transformation is high, with 83 per cent of companies planning to invest in smart factory technologies. But as history shows, every industrial revolution leaves some people behind amid the technological upheaval at its core – and this one looks to be no different.
A 2018 study by Deloitte* found that by 2028 there could be 2.4 million unfilled job openings in the US manufacturing industry, with Japan and Germany expected to fare even worse. A McKinsey* report from the same year also found that two out of three companies that trial new digital manufacturing solutions fail to adopt them on a larger scale.
How is this possible when Industry 4.0 promises so many benefits? Intel’s Accelerate Industrial study has identified various issues that are preventing successful adoption for many companies, including a lack of interoperability and concerns around security and data. The biggest issue, however, was a lack of technical skills, which prevents companies from benefiting from their investment.
Just a few years ago, the top five skills required of an employee would have included basic programming or software engineering, manufacturing, communication, innovation and traditional IT skills. Now there has been a noticeable shift towards more specific digital skills: critical skills for an Industry 4.0 employee include a much deeper understanding of modern programming and software engineering techniques, digital dexterity, data science, connectivity and cybersecurity, with specific manufacturing skills relegated to the bottom of the list.
This reflects a significant change in how these businesses will operate, with many of the more physically demanding tasks becoming automated and the new technologies creating a whole raft of new roles and responsibilities.
There are measures that companies can take to solve the problem themselves. Dr. Faith McCreary and Dr. Irene Petrick, the pair responsible for the Accelerate Industrial study, make a number of suggestions. These include creating programmes that support learning among the existing workforce, offering lessons and hands-on opportunities with digital tools and skills, and striking a balance between hiring external experts and retraining existing staff when implementing new smart technology projects. This creates opportunities for people to learn on the job and transition with the business.
But how else can this looming skills gap be closed? One suggestion involves rethinking the way salary decisions and promotions are made. Instead of basing them purely on experience and past performance, employers in the manufacturing industry should also reward the acquisition of new skills, encouraging staff to embrace the coming changes and prepare themselves for new ways of working.
But what about people who aren’t even part of the workforce yet? Universities and institutes of higher education have a big part to play in adjusting their courses to ensure that the employees of the future emerge equipped for the working landscape as it will be, not as it looked five years ago. This sounds easy, but technology is a fast-moving beast, so curricula will need constant re-evaluation to ensure that they remain relevant.
Apprenticeships offer another route that helps to avoid this issue, and have become a viable alternative to traditional higher education for many people. On-the-job training of this kind ensures that new workers are prepared for the environment they will find themselves in upon successful completion of the course, rather than wasting time learning obsolete skills.
Of course, that’s not the only thing businesses can do to help prospective employees. Earlier this year, Intel partnered with Udacity* to launch the new Intel® Edge AI for IoT Developers Nanodegree Program. Designed to teach the developer community how to use deep learning and computer vision, the course requires students to complete three real-world projects, allowing graduates to show they’re equipped to deploy AI technologies on edge devices. For most people, the course should take around three months to complete.
Perhaps one of the most encouraging factors in all of this is how the very technologies that are at the forefront of Industry 4.0 can also help us to train employees. Connected devices and wearable sensors allow for real-time feedback and sophisticated data collection that can be used to assess progress. Additionally, augmented reality technology can provide more immersive guidance without requiring constant human supervision.
The most crucial step that businesses can take, though, is to recognise the issue and address it now rather than further down the line. Those that do will find themselves in a much stronger position to reap the benefits of Industry 4.0 when it hits its peak.
*Other names and brands may be claimed as the property of others