3 Exciting developments in AI x Robotics

Welcome to the 7 new deep divers who joined us since last Wednesday. If you haven’t already, subscribe and join our community to receive weekly AI insights, updates, and interviews with industry experts straight to your feed.


DeepDive

Your weekly immersion in AI. 

Advancements in AI systems are driving a shift in robotics research and development – and enabling a new wave of robots that can do more with less human supervision.

Here are three of the most exciting developments at the intersection of AI and robotics. 

1. AI is powering autonomous robots that can learn as they do 

Embedding AI models into robotics tech is allowing robots to become more autonomous, and to adapt to their environment by learning as they experience the world around them. 

AI-powered observational learning means that robots can now learn how to perform tasks simply by observing human demonstrations, so they don’t need comprehensive manual programming for each task we want them to undertake. This is a major step, because it allows robots to acquire new skills far more efficiently. 
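To make the idea concrete, here’s a minimal sketch of learning from demonstration in the style of behavioural cloning: the robot records (observation, action) pairs from a human demo, then imitates by copying the action of the most similar recorded observation. The states and action labels below are invented toy values, not a real robot API.

```python
import math

# Hypothetical demonstration data: 2-D gripper position -> action label
demonstrations = [
    ((0.0, 0.0), "reach"),
    ((0.5, 0.1), "reach"),
    ((0.9, 0.2), "grasp"),
    ((1.0, 0.8), "lift"),
]

def imitate(observation):
    """Pick the action whose demo observation is nearest (1-NN policy)."""
    def dist(demo):
        (x, y), _ = demo
        return math.hypot(x - observation[0], y - observation[1])
    _, action = min(demonstrations, key=dist)
    return action

print(imitate((0.95, 0.25)))  # near the "grasp" demo -> "grasp"
```

Real systems replace the nearest-neighbour lookup with a trained neural network policy, but the principle is the same: no task-specific programming, just demonstrations.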

Other advanced algorithms include reinforcement learning – which means robots can learn through trial-and-error interactions with their environment. This enhances their ability to make decisions, as they can navigate complex settings without needing a predefined map and a pre-programmed route to follow. 
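A tiny tabular Q-learning example shows what “trial and error without a predefined map” means in practice: a robot in a five-cell corridor learns to head toward a goal it was never given directions to. The environment, rewards, and learning rates here are toy assumptions for illustration.

```python
import random

N_STATES, GOAL = 5, 4          # corridor cells 0..4, reward at cell 4
ACTIONS = (-1, +1)             # step left or step right
alpha, gamma, eps = 0.5, 0.9, 0.1
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

random.seed(0)
for _ in range(500):                       # learning episodes
    s = 0
    while s != GOAL:
        # epsilon-greedy: mostly exploit current knowledge, sometimes explore
        a = (random.choice(ACTIONS) if random.random() < eps
             else max(ACTIONS, key=lambda act: Q[(s, act)]))
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == GOAL else 0.0
        # Q-learning update from the observed transition only
        best_next = max(Q[(s2, b)] for b in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

# After training, the greedy policy steps right (+1) in every cell.
policy = [max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(GOAL)]
print(policy)
```

Nothing in the code encodes the corridor’s layout or the route to the goal – the policy emerges purely from rewarded experience, which is exactly the property that lets RL-trained robots handle settings no one mapped out in advance.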

And then there’s neuromorphic computing: also known as neuromorphic engineering, it’s an advanced technology that mimics the neural structure of the human brain, and gives robots the capacity to process sensory data from their environment and react to that data in real-time. 
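The basic unit many neuromorphic chips implement in hardware is the leaky integrate-and-fire (LIF) neuron: membrane potential integrates incoming current, leaks over time, and emits a spike when it crosses a threshold. Here’s a minimal software sketch – the parameters are illustrative, not drawn from any particular chip.

```python
def lif_run(inputs, leak=0.9, threshold=1.0):
    """Simulate one leaky integrate-and-fire neuron over input currents."""
    v, spikes = 0.0, []
    for current in inputs:
        v = leak * v + current    # leaky integration of input
        if v >= threshold:        # threshold crossing -> spike
            spikes.append(1)
            v = 0.0               # reset membrane potential after spiking
        else:
            spikes.append(0)
    return spikes

# A steady input produces a regular spike train
print(lif_run([0.4] * 10))  # -> [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

Because such neurons only produce output when something crosses a threshold, event-driven sensory processing of this kind can react in real time while consuming far less power than running a conventional model on every frame.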

2. Human-robot interactions are becoming more sophisticated 

As AI enables new robotic capabilities, developers are focused on improving the ways in which humans and robots can collaborate with one another. 

In particular, advancements in natural language processing (NLP) are enabling a more intuitive communication style between robots and humans. 

In recent years, we’ve seen AI leaders attempt to create robotic systems that can behave in an emotionally intuitive manner. And now AI tech is enabling other forms of intuitive interaction, including the sense of touch.

In a 2024 paper published in Science Robotics, three researchers from the Institute of Robotics and Mechatronics at the German Aerospace Center explained that the sense of touch is a property that enables humans to “interact delicately with their physical environment.” 

They wanted to explore the potential of replicating this delicate sense of touch in intuitive human-robot interactions. 

“On the basis of high-resolution joint-force-torque sensing in a redundant arrangement,” they wrote, “we were able to let the robot sensitively feel the surrounding environment and accurately localise touch trajectories in space and time that were applied on its surface by a human.” 

They used a range of learning techniques and artificial neural networks to enable the robot to identify and interpret touch as letters, symbols, or numbers. 

“The intrinsic sense of touch we proposed in this work can serve as the basis for an advanced category of physical human-robot interaction that has not been possible yet, enabling a shift from conventional modalities toward adaptability, flexibility, and intuitive handling.” 

3. Cobots are becoming more valuable for a wider range of use cases 

This last one is closely related to the first two – because as robots become better able to learn with and from humans, and as their capacity for intuitive interaction increases, human-robot collaboration becomes more advanced. 

Sensors, vision technologies, and smart grippers that leverage AI to analyse sensory data and make real-time adaptations mean that robots can work more safely alongside human workers.

So collaborative robot applications can support human workers, make their jobs easier and/or safer, and free up their time for strategic and creative tasks. 

Cobots are now at work in a number of industries, including healthcare and surgery; electronics production and quality control; food and beverage production; manufacturing; and more. 


Tell us about the AI developments that are inspiring you right now 

We want to know about the advancements in AI tech that are getting you excited. What possibilities are just around the corner – and why does AI matter to your future? Open this newsletter on LinkedIn and tell us in the comment section. 

Until next week. 
