Are two heads better than one? So it seems, especially when one “head” belongs to a machine with artificial intelligence.
Bringing human cognition and AI together is the hallmark of the Fifth Industrial Revolution, an era, coming soon, in which people and robots will work collaboratively to the benefit of both. Industry 5.0 will push computing beyond the edge to a world in which humans thrive as never before—because of, not in spite of, our robot companions.
As in the current industrial revolution, Industry 4.0, everything in Industry 5.0—people, objects, computing devices—will be connected in a vast digital web, one in which humans can seem almost superfluous.
The difference is that, today, people worry and fret: Will I have to give up driving? Will I lose my job to a robot? Is my personal privacy gone forever? In Industry 5.0, these questions won’t exist—because we will have solved the human-AI conundrum.
Enabled by technologies we can only imagine, machines will naturally perform the tasks they do best, and we will be fine with that. With nano-quick processing and seemingly infinite memory, robots, drones, autonomous vehicles and other machines will free us humans from the tediums of daily life and work, enabling our minds to soar to new heights.
The first four revolutions: a recap
Micron IT Director Tim Long describes the first four industrial revolutions as:
- Mechanization. (1780) The first industrial revolution, occurring over about 100 years from the mid-18th to mid-19th centuries, began with the use of water and steam power to mechanize manufacturing processes.
- Electrification. (1870) In the late 19th and early 20th centuries, electric power came to factories, enabling the assembly line and mass production.
- Automation. (1970) Digital technologies including robotics came to the manufacturing process starting around 1970, automating many tasks that humans had previously performed and, with the internet, enabling globalization.
- Connection. (2011) Everything, from cars to computers to robots to toasters, is becoming virtually linked in the Connected Age, communicating with and even controlling one another with minimal human intervention. Factories are on their way to running themselves as “cyber-physical systems” take charge of not only manufacturing but also procurement, maintenance, and repairs. The internet of things, robotics, and artificial intelligence are the technologies enabling all this autonomy, which, like the human brain, is driven by data, analytics, and memory.
As we know, digital technology has sped up time. Everything happens faster now, which helps to explain why the fourth revolution—the Connected Age—followed so closely on the heels of the third, the Age of Automation. So is it any wonder that we are already looking ahead to Industry 5.0, the Collaborative Age?
Industry 5.0: the human-machine convergence
The Fifth Industrial Revolution will see the convergence of humans and machines—literally and figuratively. Smartphones and applications will give way to technologies that live on our bodies, with virtual assistants murmuring directions in our ears, suggesting restaurants for dinner, making purchases on our behalf, and much more. But the most paradigm-shattering changes will occur in the workplace.
Industry 5.0 is expected to see the transformation of the fourth revolution’s “cyber-physical” manufacturing plants—those using digital technologies to operate factories with minimal human involvement—into “human-cyber-physical” systems.
In this new world, sensors collect data and computers with AI process and analyze that data—at faster and faster speeds, using larger and larger databases to contextualize and categorize it. Machines and robots use the information to make decisions based on programmed algorithms and their own databanks containing “memories” of their past actions and outcomes.
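The sense-analyze-decide loop described above can be sketched in a few lines of Python. This is a purely illustrative toy, not any real factory system; the `Cobot` class, its threshold, and its caution rule are all hypothetical stand-ins for the "programmed algorithms" and "memories of past actions and outcomes" mentioned above.

```python
from collections import deque

class Cobot:
    """Toy decision loop: sense, contextualize against memory, act, remember."""

    def __init__(self, threshold=75.0, memory_size=1000):
        self.threshold = threshold
        # Bounded "databank" of past actions and their outcomes.
        self.memory = deque(maxlen=memory_size)

    def decide(self, sensor_reading):
        # Contextualize the new reading against remembered outcomes:
        # after several failed runs, lower the bar for halting.
        past_failures = [m for m in self.memory
                         if m["action"] == "run" and not m["ok"]]
        margin = 5.0 if len(past_failures) > 3 else 0.0
        return "halt" if sensor_reading > self.threshold - margin else "run"

    def record(self, sensor_reading, action, ok):
        # Store the outcome so future decisions have context.
        self.memory.append({"reading": sensor_reading, "action": action, "ok": ok})

bot = Cobot()
action = bot.decide(sensor_reading=72.0)
bot.record(72.0, action, ok=True)
```

The point of the sketch is the feedback shape, not the thresholds: decisions are a function of fresh sensor data plus accumulated memory, and every outcome feeds back into that memory.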
Far from being marginalized, as some predict, humans will hold center stage in this new revolution. Machines will serve us, not the other way around.
In this new paradigm, people will work alongside collaborative robots, or “cobots,” teaching them to do their jobs and correcting them when they err. While machines will perform the most menial, repetitive, and dangerous tasks, people will use their intricate, flexible brains to make high-level decisions: designing products and processes, for instance, perhaps using a “digital twin,” a virtual copy of the factory where the product gets made or the environment where it will be used.
Along the way, the factory’s ability to communicate directly with customers will enable it to customize and personalize every product according to individual needs and desires.
Some companies are already forging ahead to the new era. The sportswear company Adidas, for example, is producing running shoes and trainers in small “Smartfactory” plants in Germany and the U.S. These factories use robotics, additive manufacturing (also known as 3D printing), and data analytics to produce shoes for anyone and at any time. When a customer requests an adapted version of an Adidas design, the nearest Smartfactory can produce a pair within a day and deliver it to the consumer shortly afterward—a truly revolutionary development, considering that the company’s main plants in Asia only manufacture shoes in lots of 20,000 per size.
Smartfactories don’t run themselves, but rely on a fairly small human workforce with computer tablets to program, instruct, guide, and troubleshoot. The speed at which the factories’ robots can process, analyze, and respond to data coming from a plethora of sources—sensors, online orders, other robots, computing devices and wearables used by people—depends on how fast their processors are and how much memory they have. What is true for human intelligence is also true for artificial intelligence.
Consider medicine, for example. Today, people with type-1 diabetes may use a device that draws blood and measures its glucose levels, then communicates the results to another device, which delivers the appropriate amount of insulin to the patient’s blood.
The problem with this method, as with much of medicine today, is its one-size-fits-all approach. We now understand that people differ in their biological makeup and lifestyle choices, and benefit from medicines and doses tailored to their unique needs. In Industry 5.0, devices will use AI to monitor the body’s variables, measure out insulin doses in the exact amount the patient needs and at the exact time they need it, and track the body’s responses to improve their own performance. Should medical procedures be warranted, the intelligent device will provide the data needed to perform those procedures—delivering specifications directly to the factory producing an artificial pancreas, for instance.
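To make the tailored-dosing idea concrete, here is a deliberately simplified proportional-correction sketch. It is purely illustrative and not a medical algorithm: real closed-loop insulin systems use clinically validated, regulated software, and the `target` and `sensitivity` values below are hypothetical stand-ins for the patient-specific factors the article says Industry 5.0 devices would learn.

```python
def insulin_dose(glucose_mg_dl, target=110.0, sensitivity=50.0):
    """Toy correction dose: one unit per `sensitivity` mg/dL above target.

    NOT medical advice or a real dosing algorithm -- `target` and
    `sensitivity` are illustrative, patient-specific placeholders.
    """
    excess = max(0.0, glucose_mg_dl - target)  # never dose below target
    return round(excess / sensitivity, 1)

# A reading of 180 mg/dL is 70 above target: (180 - 110) / 50 = 1.4 units.
print(insulin_dose(180.0))
```

Personalization, in this framing, means adjusting `sensitivity` per patient over time based on tracked responses, rather than applying one fixed formula to everyone.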
Memory makes it work
Responding intelligently to situations is not always easy, as we humans well know. To do so, we must quickly process the information at hand while using our memories to provide context. Our responses can mean the difference between life and death: turning the steering wheel to avoid an accident, pulling an emergency brake to stop malfunctioning equipment, or diagnosing a patient’s injury or illness to determine the most effective treatment.
Artificial intelligence also relies on memory and processing speed to generate the right response at the right time. Experts envision a time in the near future when self-driving cars will sort through streams of data coming from multiple sources to make snap decisions with a failure rate of almost zero; when manufacturing plants scale production up or down, order supplies, ship out finished products, and repair and replace equipment autonomously; and when medical devices diagnose illnesses and administer treatments on their own.
The Fifth Industrial Revolution, like the Fourth, will rely on data, devices, and artificial intelligence to grease the wheels of business and commerce. None of these components works without memory—as is true of the human mind. Memory, in fact, puts the “intelligence” in AI, providing it with data to run its algorithms and context for its actions and reactions.
As the technology industry strives to produce memory that works as fast as the human mind or even faster—the “holy grail” of AI—we come ever closer to realizing the dream of a world in which machines work for the betterment of society and humanity. We are tantalizingly close.
More, better, faster processing
“If we want machines to do the things that humans have traditionally done, we need to make machines exceptionally efficient at the things that humans do,” Micron Senior Fellow Mark Helm says.
Everything we do happens as a result of sensory input: going to lunch, laughing at a joke, saying “I love you,” buying a car. To perform each of these actions, we take in information coming from our senses of sight, smell, taste, hearing, and touch as well as our memories, emotions, beliefs, thoughts, and intuition, and process it all at once. Unlike the central processing units (CPUs) in our computing devices, our brains don’t have a discrete number of “cores” where data goes in, gets analyzed and sorted, and gets sent out for an action or result. Our brains break up incoming information and assign each part to its corresponding area of specialty: one area for visual data, another for sounds, another for emotion, and so on.
So instead of using CPUs to process data, AI systems use graphics processing units (GPUs), a different kind of processor that works more like our brains do. While a CPU might have two, four, or even a few dozen processing cores on a chip, a GPU has thousands, making it capable of processing thousands of data inputs at once.
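The serial-versus-parallel distinction can be illustrated with a toy image-brightening task. NumPy here only mimics the idea on a CPU (its vectorized operation is applied across the whole array in one step); on an actual GPU, each array element would map to its own core. Both function names are hypothetical.

```python
import numpy as np

def brighten_serial(pixels, amount):
    """CPU-style: visit each pixel one at a time, like a single core."""
    out = []
    for p in pixels:
        out.append(min(255, p + amount))  # clamp to the 8-bit maximum
    return out

def brighten_parallel(pixels, amount):
    """GPU-style: one operation applied to every pixel at once.

    (NumPy still runs on the CPU; this only models the data-parallel
    shape of the computation a GPU performs across thousands of cores.)
    """
    return np.minimum(255, np.asarray(pixels) + amount)

pixels = [10, 200, 250]
print(brighten_serial(pixels, 10))            # element-by-element
print(brighten_parallel(pixels, 10).tolist()) # all elements in one step
```

The two functions compute the same result; what changes is the shape of the work, one item at a time versus one operation over all items, which is exactly why image-like, stream-heavy AI workloads suit GPUs.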
Micron’s NAND and GDDR6 DRAM memory enable GPUs to work at the speed of thought. Our NAND memory can hold vast quantities of data in storage, and our DRAM products, including our top-of-the-line GDDR6, deliver data to all those GPU processing cores with minimal delay (“low latency”) and in very large amounts at once (“high bandwidth”). The result is artificial intelligence equipped with a vast long-term memory that can think on its feet and react in near real time, as humans do.
The GPU was invented for processing streaming, image-rich data for video and video gaming, and is often used for cryptocurrency mining as well. Now GPUs have become the go-to technology for AI.
“AI is using GPUs in a lot of circumstances,” Helm said. “GPUs are very good for very specific tasks—better than CPUs, which are great for general purposes, doing a lot of different things.
“Whether it be running a video game to your screen or performing an AI workload or mining for cryptocurrency, their common underlying requirements drive you toward very-low-latency, very-high-bandwidth memory coupled with graphics processor architectures.”
“We want to be out there providing memory and storage products that our customers will utilize to enable new capabilities,” Helm said. “We want to be in a leadership role.”
At Micron, we can envision robots, drones, self-driving cars, and other forms of AI that rival humans for intelligence, learning, and response times. We strive constantly to develop memory that works more and more quickly and efficiently, and are continually introducing products that bring our world ever closer to Industry 5.0.