Industry 4.0 is an extension of a strategic initiative launched by the German government to advance digital manufacturing, and it is regarded as the successor to the three earlier industrial revolutions. Those first three revolutions were driven by specific technologies that mechanized production: the steam engine (1765), electricity, gas and oil (1870) and electronics, telecommunications and computers (1969). In contrast, the fourth industrial revolution encompasses an aggregate of technologies and a shift towards smart automation and interconnectedness.
The Industry 4.0 vision refers to a smart manufacturing paradigm whereby technological advances are leveraged to enable a seamlessly integrated operation. Here, digital technologies underpin daily operations, and the boundary between virtual and physical systems increasingly blurs. Industry 4.0 concepts are being applied across industries to improve efficiency, ease compliance and accelerate research and development. In this article, we look at the technologies and innovations that will define the future of science.
Automation and robotics
Automation and robotics are key Lab 4.0 technologies that have multifaceted benefits. The implementation of automated processes can accelerate operations and also improve the efficiency of resource use. The use of robotics also provides the opportunity to improve the standardization of processes, thereby reducing error and providing more reliable, reproducible results. Together, these benefits allow analytical laboratories, for example, to provide a more efficient and cost-effective service to their customers.
The concept of automation is not new; however, automated instruments are becoming increasingly available and intuitive, and examples of automation and robotics can now be found across many branches of science.
Artificial intelligence and machine learning: A human–machine collaboration
One common thread across different scientific disciplines is the need for complex information processing, as ever-greater volumes of data are generated in basic research, clinical studies and bioprocessing and bioengineering environments. This need has arisen partly from various technological advances: the ability to generate -omics data, the demands of complex bioprocessing operations and regulatory requirements, and the capacity to acquire information continuously with sensors and biosensors.
With access to so much information, there has been a shift towards data playing an “active” role in scientific discovery, whereby the mining of data can surface hidden hypotheses. Here, many are initially looking to machine learning (ML), a subset of artificial intelligence (AI) in which algorithms are trained to identify data patterns and perform many basic tasks with precision and speed.
One of the sophisticated processes to have evolved within the ML field is deep learning, i.e., artificial neural networks with increasingly multilayered architectures and improved learning capabilities. Peter Flach, professor of artificial intelligence at the University of Bristol, describes classification as a particular type of machine learning. “It’s the most dominant one but also, in a way, the least ‘AI’ because it’s very close to statistics. If you are building an ML classifier – a deep network, for example – it’s often useful to not just get classifications of yes/no, spam/non-spam, but also to get probabilities out of that. In practice, many ML models are overconfident. And so, what that means is that often with those kinds of ML model, some kind of post-processing is needed to change the probabilities to a more reasonable level, as we see in weather forecasting, for example. That process is called calibration.”
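The idea of calibrating an overconfident classifier can be made concrete with temperature scaling, one common post-processing technique (a minimal sketch for illustration, not necessarily the method Flach has in mind; the probabilities and temperature value are made up):

```python
def soften(probs, temperature=2.0):
    """Soften an overconfident probability vector by temperature scaling.

    Raises each probability to the power 1/T and renormalizes; T > 1
    pulls extreme probabilities back towards a uniform distribution,
    while T = 1 leaves them unchanged.
    """
    scaled = [p ** (1.0 / temperature) for p in probs]
    total = sum(scaled)
    return [s / total for s in scaled]

# A hypothetical, overconfident three-class output: spam / not spam / unsure.
raw = [0.98, 0.01, 0.01]
calibrated = soften(raw, temperature=2.0)
```

In practice, the temperature would be fitted on a held-out validation set so that the reported probabilities match observed frequencies, much like a well-calibrated weather forecast.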
In general, AI comprises techniques that allow computers to mimic human behavior and reproduce or improve on human decision-making to solve complex tasks, independently or with minimal human intervention. The successful creation of such systems, however, evidently requires a significant human element and manual cleanup of data sets; in an MIT interview, Moderna’s chief data and artificial intelligence officer Dave Johnson referred to AI implementation as a human–machine collaboration. Flach touches on a similar theme: “To actually have something tangible and real, we do need the data-driven aspect. But also, we need a way to feed in the knowledge that humans have – the domain knowledge and maybe the models they derive. This is for two reasons. Essentially, I don’t think many things can be done in a purely data-driven way. But secondly, it’s very expensive. For example, deep learning for image analysis requires a lot of computational power to recognize simply when a photo is upside down, and you burn through a tiny bit of rainforest to actually do that. Is that really the best way of doing things?”
Flach continues, “I’m always imagining what Alan Turing would say if he were to come back to Earth today. Okay, you have these enormous supercomputers in the world. And you just use them to squeeze an extra half a percent out of a benchmark data set of handwritten digit recognition. And he’d say, you’re not asking the right questions. With that computer power, you should ask the hardest questions that you can ask, and then try and solve those. And what we do instead is fix Facebook and Google’s algorithms, which is not really a huge benefit for mankind compared to, let’s say, drug discovery. So, I think we should ask the hard questions, but there’s a long way to go.”
The development of mRNA vaccines for the COVID-19 pandemic is one example where investment in big data infrastructure (as well as robotic automation and other digital systems) has had a major impact. ML applications can also be found throughout biopharmaceutical-focused laboratories – so-called biopharma 4.0 – where ML algorithms are being applied to drug target identification and drug repurposing, and to identifying factors affecting biotherapeutic safety and efficacy. In diagnostics, ML looks set to share the workload of overrun pathology laboratories via medical imaging analysis, and holds the potential to accelerate the diagnosis of rare diseases.
Digital twins
Digital twins are an excellent example of the cyber–physical overlap at the heart of the Lab 4.0 vision. A digital twin is a digital replica of a physical entity and is considered a key enabler of Industry 4.0. Having a digital replica of a process allows a system to be analyzed and optimized before expensive physical infrastructure is built. While digital twins are particularly relevant to manufacturing – for example, when developing robotics – their potential use in biopharmaceutical development is starting to be explored. A workflow was recently reported whereby digital twin technology was used to model in silico the process characterization of a monoclonal antibody polishing step. The study was reported to be in line with Quality-by-Design principles, while being far less demanding in terms of the experimental validation required.
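The core idea – a simulation kept in step with a physical asset – can be sketched in a few lines (a hypothetical first-order model of an incubator's temperature, with made-up parameters; not the reported antibody workflow):

```python
class IncubatorTwin:
    """Minimal digital twin of an incubator's temperature (illustrative model).

    The model is a first-order lag towards the heater setpoint; each real
    sensor reading nudges the model state so that simulation and equipment
    stay in step.
    """

    def __init__(self, setpoint, temp=20.0, rate=0.1):
        self.setpoint = setpoint  # target temperature, deg C
        self.temp = temp          # current modeled temperature, deg C
        self.rate = rate          # fraction of the gap closed per time step

    def step(self):
        # Advance the simulation by one time step.
        self.temp += self.rate * (self.setpoint - self.temp)
        return self.temp

    def sync(self, measured_temp, weight=0.5):
        # Blend in a real sensor reading to correct model drift.
        self.temp = (1 - weight) * self.temp + weight * measured_temp

# Run the twin ahead of the physical unit to test a 37 deg C setpoint.
twin = IncubatorTwin(setpoint=37.0)
for _ in range(50):
    twin.step()
```

Running "what if" experiments against the twin, rather than the instrument itself, is what lets a process be characterized and optimized before committing to physical runs.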
Xun Xu, professor in the Department of Mechanical and Mechatronics Engineering at the University of Auckland, New Zealand, has seen first-hand how smart manufacturing and design can create higher-quality, tailored products and operating systems. In his review of intelligent manufacturing, Xu highlights how advanced information and manufacturing technologies can be used to optimize production and product transactions, with digital twins being one approach among several. Digital twin technologies have, for example, been used to tackle product personalization challenges: Xu was involved in the development of highly personalized air pollution facial masks. Real-time data from sensors embedded in the mask are integrated into the digital twin model so that compliance information, as well as pollution data, can be provided to the user. The digital twin is also connected to the cloud, so the pollution data can also be fed to the air quality surveillance system.
The internet of things for greater connectivity
Laboratories of the future will be far more connected than those of the past, when data had to be downloaded and transferred to computers manually, and instrument maintenance happened only when problems arose or a specialist visit was arranged. The internet of things (IoT), i.e., the interconnection of devices via the internet, has the potential to optimize many aspects of day-to-day operations and revolutionize what can be achieved in the laboratory. Easier data management, greater traceability and automated metadata recording all help laboratories operate, and remain compliant, in an efficient manner. The use of laboratory information management systems (LIMS) and cloud computing also minimizes the risk of damage from unexpected events (e.g., instrument breakdowns) that might previously have resulted in data loss.
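Automated metadata recording is simple to sketch: each instrument reading is stamped and appended to an audit log the moment it is taken, rather than transcribed by hand later. (The field names, instrument identifier and file layout below are illustrative, not a LIMS standard.)

```python
import json
from datetime import datetime, timezone

def record_reading(instrument_id, measurement, value, unit,
                   log_path="instrument_log.jsonl"):
    """Append one instrument reading, with metadata, as a JSON line.

    Stamping the time and instrument identity automatically provides
    the traceability that manual transcription tends to lose.
    """
    entry = {
        "instrument_id": instrument_id,
        "measurement": measurement,
        "value": value,
        "unit": unit,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")
    return entry

# A hypothetical reading from a connected balance.
reading = record_reading("balance-02", "sample_mass", 1.204, "g")
```

An append-only, machine-readable log like this is also what makes downstream integration with a LIMS or cloud backup largely a matter of shipping the file, rather than re-keying results.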
Laboratories have the potential to become far more integrated, and it doesn’t stop there. Flach is involved in SPHERE, a sensor platform for health applications in a residential environment. “It’s basically IoT in the home for medical applications. Data collection in health is incredibly crude. The idea is that we will definitely need technology in healthcare and around aging – we can’t afford to have everybody in hospital. So, we want to collect data that is useful in a health context, while keeping people in their home environment for longer. There are many questions around this, like do we want that? But we can’t really answer the question until we know what the technology can do. It’s very gratifying to have real world data, because sometimes ML can be highly applied or too theoretical. But when things are in balance, as they are in SPHERE, then it can be very productive,” says Flach.
Considerations for implementing Lab 4.0 technologies
The successful implementation of Lab 4.0 technologies requires a thoughtful, measured approach and the careful consideration of the laboratory’s specific requirements, says Xu: “To say how Lab 4.0 technologies could be applied, I would need to know about people in the laboratory and the role they play, as well as the role of the equipment in the lab – and about the need for them to work together. I would also want to know how people need to work with the data coming from different machines and systems, how people interact with these different devices, what sort of data they have. How do they want to process the data? How do they make decisions about data? How can they turn data into information, and then into knowledge that they can keep and use?”
Flach likens the introduction of these advanced technologies to sophisticated instruments in the laboratory that require training for proper use: “It’s not a matter of just switching it on and letting it do its job, you need to do a lot of work before you use the instrument, and then you need to do a lot of work afterwards to check whether you used it in the right way, whether the results make sense, and so on. And it’s the same with machine learning; it’s an instrument.”
Ultimately, advanced technologies need to support the actual needs of operators in the laboratory. Such sentiments are already reflected in the term “Industry 5.0”, which promotes human-centricity, sustainability and resilience. Are we ready for that? Possibly not, say Xu and colleagues, who offer a word of caution against the “proliferation of buzz words” in a recent publication: “There is, and should be, just one journey for a business.” For now, it seems as though Industry 4.0 has given laboratories plenty of options to consider.