Frontier Technologies in Industrial Operations 2025
2.2 Embodied AI – igniting a new era in robotics
AI is not only transforming software but also
automating physical workflows. Embodied AI
integrates AI into physical systems such as
robots, allowing them to perceive and interact
with their environment through dynamic and
complex movements. The agents see the world
via sensors (for example, cameras, radar, lidar
and microphones) and execute actions through actuators such as advanced grippers. Applied to
industrial operations, these agents enhance the
capabilities of existing robotic systems, enabling
more sophisticated automation. By doing so,
they expand the automation scope, overcoming
traditional challenges such as those associated with
handling unstructured environments or manipulating
unstable objects.
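The perceive-act cycle described above can be sketched as a simple control loop: sensor readings flow into a policy, which produces actions for the actuators. All names and the toy gripper policy below are illustrative assumptions, not any specific robotics framework's API.

```python
import random

# Minimal sketch of an embodied-AI control loop: read sensor observations,
# map them to an action via a policy, and execute the action on an actuator.
# Names and logic here are illustrative assumptions only.

def read_sensors():
    """Stand-in for camera/radar/lidar/microphone input."""
    return {"object_offset": random.uniform(-1.0, 1.0)}

def policy(observation):
    """Toy policy: move the gripper toward the perceived object."""
    return {"gripper_velocity": -0.5 * observation["object_offset"]}

def actuate(action, state):
    """Apply the action to the (simulated) gripper position."""
    state["gripper_position"] += action["gripper_velocity"]
    return state

def run_episode(steps=10):
    state = {"gripper_position": 0.0}
    for _ in range(steps):
        obs = read_sensors()
        act = policy(obs)
        state = actuate(act, state)
    return state

print(run_episode())
```

In a real system the policy would be a learned model and the state would live in hardware, but the sense-decide-act structure is the same.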
FIGURE 2: Three types of robotic systems (Source: Boston Consulting Group). The figure charts robot capabilities enabled by embodied AI against improvements in robotic hardware and software:
– Rule-based robotics (coding): performance and reliability.
– Training-based robotics (training): task versatility and flexibility; situation adaptability; manipulation dexterity.
– Context-based robotics (zero-shot learning): general understanding and task execution; human-like dexterity and low-level control; universal robotic embodiment.
Notably, three types of robotic systems have
emerged: rule-based, training-based and context-
based (Figure 2). This evolution has been driven by
improvements in both robot hardware and software.
The hardware is becoming more capable, reliable
and flexible. At the same time, the software is
advancing, with improvements in foundation models
and technologies (such as reinforcement learning).8
A five-fingered robotic hand with 24 degrees of freedom
can perform complex tasks with an unprecedented
level of dexterity.9 This is made possible by the
various data sources that can be harnessed to train
AI-enabled robots:
– Real robot data: This data is collected
from the robot motion controllers and can
also be generated by human-guided robot
teleoperation. Although real robot data is the
most accurate, it is limited because it can only
be gathered from deployed robot fleets.
– Synthetic robot data: This data is created in
simulated physics-based environments and is
available in infinite supply. While any scenario
can be simulated, a simulation-to-reality gap is expected to remain due to the diversity of the
real world. This means real robot data will still be
necessary for validation. For example, Foxconn
trains robots in its virtual factory, using digital
twins to generate synthetic data for model
training and to teach robotic arms how to see,
grasp and move objects.10
– Internet-scale human data: Online data,
including human videos, is highly diverse, and
equips AI with a foundation for understanding
the world. It provides valuable information on
how humans interact with objects and how
objects behave. Imitation learning allows the
latest models to learn these skills by mimicking
human actions.
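The imitation learning mentioned for the last data source can be sketched as behaviour cloning: fitting a policy to observation-action pairs recorded from demonstrations. The linear policy and the synthetic "expert" demonstrations below are illustrative assumptions standing in for real human data.

```python
import numpy as np

# Behaviour-cloning sketch: learn a policy from (observation, action) pairs.
# The "demonstrations" are synthetic stand-ins for real human-demonstration
# data; a hypothetical expert acts linearly on the observation.

rng = np.random.default_rng(0)

# Expert mapping (unknown to the learner): action = W_true @ observation.
W_true = np.array([[0.8, -0.2],
                   [0.1, 0.5]])
observations = rng.normal(size=(200, 2))   # e.g. object-pose features
actions = observations @ W_true.T          # expert's demonstrated actions

# Clone the expert by least squares on the demonstration data.
W_learned, *_ = np.linalg.lstsq(observations, actions, rcond=None)
W_learned = W_learned.T

# The cloned policy now maps new observations to expert-like actions.
new_obs = np.array([1.0, -1.0])
print(W_learned @ new_obs)
```

Production systems replace the linear model with a deep network trained on video or teleoperation data, but the principle of mimicking demonstrated actions is the same.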
The discussion of the three robot types enabled
by embodied AI centres on these software
advancements, which harness the data sources
described above:
Rule-based robotics: Beginning in the 1960s,
industrial robots operated under rule-based systems,
following “if… then” instructions that were manually
coded by experienced robotic engineers. Complex