
Meta is taking a significant step forward in embodied AI by giving robots the ability to touch and feel. Through partnerships with GelSight and Wonik Robotics, the company is working to commercialize a new range of tactile sensors that will help AI understand the physical world in ways that were previously unattainable. At the core of this push is Meta's belief that for robots to be truly useful, they must not only perform physical tasks but also interpret and interact with their surroundings as naturally as a human would.
Meta recently introduced a set of technologies designed to elevate robotic dexterity and touch perception: Meta Sparsh, Meta Digit 360, and Meta Digit Plexus.
The centerpiece of this release is Meta Digit 360, a fingertip-shaped tactile sensor that captures touch data with human-level precision. GelSight, a company known for its advances in tactile intelligence, will manufacture Digit 360, which aims to give AI researchers unprecedented tools for modeling the physical properties of objects and refining human-robot interactions.
Humans rely heavily on touch for everyday tasks, but physical interaction has long been a blind spot for AI systems. Meta aims to change that with the Digit 360 sensor, a device equipped with 18 sensing features that can detect even minuscule deformations, such as the flex of a tennis ball or a pinprick, with extraordinary sensitivity. Using a tactile-specific optical lens, the sensor registers forces as small as 1 millinewton and resolves spatial detail down to 7 microns, a major leap forward in robotic perception.
Digit 360 also includes on-device AI that processes information locally, inspired by the reflex arc seen in humans and animals. The sensor can respond to stimuli quickly, without waiting on a host machine, mimicking the responsiveness of a human finger. This innovation could have far-reaching applications in robotics, enabling AI to interact with its environment in a way that closely resembles human touch.
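Meta hasn't published Digit 360's firmware or data formats, but the reflex-arc idea is easy to sketch: classify each tactile frame locally and react without a round trip to a host computer. In the minimal Python sketch below, the frame shape, thresholds, and `read_frame` stub are all illustrative assumptions, not the sensor's actual interface.

```python
import numpy as np

# Hypothetical tactile frame: a grid of per-taxel normal forces in newtons.
# Real Digit 360 data formats are not public; this is an illustrative stand-in.
FRAME_SHAPE = (64, 64)

SHARP_CONTACT_N = 0.5   # localized spike, e.g. a pinprick (assumed threshold)
SOFT_CONTACT_N = 0.001  # ~1 mN, the sensitivity floor Meta cites

def read_frame() -> np.ndarray:
    """Stand-in for a sensor read; returns simulated per-taxel forces."""
    frame = np.random.uniform(0, SOFT_CONTACT_N, FRAME_SHAPE)
    frame[32, 32] += 0.8  # inject a sharp poke for the demo
    return frame

def reflex(frame: np.ndarray) -> str:
    """Local, low-latency classification: no host round trip needed."""
    peak = frame.max()
    contact_area = (frame > SOFT_CONTACT_N).mean()
    if peak > SHARP_CONTACT_N and contact_area < 0.01:
        return "withdraw"   # sharp and localized: pull back immediately
    if contact_area > 0.2:
        return "conform"    # broad and gentle: a soft object flexing
    return "hold"

print(reflex(read_frame()))  # -> "withdraw" for the injected poke
```

The point of the pattern is latency: a withdraw decision made on the sensor itself can fire almost instantly, the way a spinal reflex bypasses the brain.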
Through its partnership with GelSight, Meta will bring Digit 360 to the global research community in the coming year. The device's potential extends beyond robotics; it could be instrumental in fields like medicine, prosthetics, virtual reality, and telepresence. For example, in virtual environments, Digit 360 might allow users to interact with objects not just visually but through a digitized sense of touch—grounding their interactions in a realistic representation of object properties.
To integrate touch with action, Meta has also introduced Meta Digit Plexus, a hardware-software platform that connects tactile sensors with robotic control systems. Imagine the intricate communication between a human hand and the brain, where touch signals shape how fingers move and react; Digit Plexus aims to bring a similar level of nuanced feedback to robot hands. By integrating sensors like Digit 360, the platform is intended to let robots adapt in real time to the objects they manipulate, deciding how to grasp, hold, or move items.
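Digit Plexus's actual interfaces aren't public, but the feedback it describes follows a familiar closed-loop control pattern: read the tactile signal, compare it with a target grip force, and correct the actuator command. The sketch below uses a simple proportional controller, with hypothetical `TactileSensor` and `Gripper` classes standing in for the real hardware stack.

```python
from dataclasses import dataclass

@dataclass
class TactileSensor:
    """Hypothetical stand-in for a Digit 360-class fingertip sensor."""
    applied: float = 0.0
    def read_force(self) -> float:
        return self.applied  # in a real stack this would come from hardware

@dataclass
class Gripper:
    """Hypothetical actuator; command() sets fingertip force in newtons."""
    sensor: TactileSensor
    def command(self, force: float) -> None:
        self.sensor.applied = max(0.0, force)

def grasp(gripper: Gripper, target_n: float = 2.0, kp: float = 0.5,
          tol: float = 0.01, max_steps: int = 100) -> float:
    """Closed-loop grasp: nudge the commanded force until tactile
    feedback matches the target, the pattern Plexus-style platforms enable."""
    cmd = 0.0
    for _ in range(max_steps):
        error = target_n - gripper.sensor.read_force()
        if abs(error) < tol:
            break
        cmd += kp * error  # proportional correction
        gripper.command(cmd)
    return gripper.sensor.read_force()

g = Gripper(TactileSensor())
print(f"settled at {grasp(g):.2f} N")  # converges near the 2.0 N target
```

In this toy model the measured force converges geometrically toward the 2 N target; a production controller would add integral and derivative terms, slip detection, and safety limits.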
Wonik Robotics, a South Korean robotics company, has joined forces with Meta to use these technologies in the next generation of the Allegro Hand, a robotic hand equipped with tactile sensors and built on Meta's Digit Plexus. The Allegro Hand, also set for release next year, will give researchers an advanced tool for making robots more responsive and adept at delicate tasks, from assembling intricate components to performing potentially lifesaving procedures in healthcare.
Meta's advancements aren't solely about touch; they represent a broader effort to make AI agents capable partners in everyday human activities. To push this ambition forward, Meta has introduced PARTNR (Planning And Reasoning Tasks in humaN-Robot collaboration), a standardized benchmark for evaluating how robots plan and reason in collaborative settings. PARTNR assesses AI models across 100,000 tasks that mimic real-world household activities, offering a path toward robots that aren't just reactive tools but true partners that can operate alongside people safely and efficiently.
PARTNR also leverages Habitat 3.0, a high-speed, realistic simulator that enables researchers to test and train AI models in home-like environments. This allows for large-scale assessments of human-robot collaboration without the safety risks associated with physical experiments. Meta envisions PARTNR as a pivotal resource for transforming AI models from mere agents into collaborative partners, capable of handling complex tasks with a nuanced understanding of human dynamics.
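Neither PARTNR's harness nor Habitat 3.0's API is reproduced here; the sketch below is a generic illustration of the evaluation pattern such benchmarks follow: reset a simulated episode for each task, let the agent act until the episode ends, and aggregate a success rate. `SimEnv` and `Agent` are hypothetical placeholders, not the actual PARTNR or Habitat interfaces.

```python
import random

class SimEnv:
    """Hypothetical stand-in for a Habitat-style simulated household."""
    def reset(self, task_id: int) -> dict:
        random.seed(task_id)  # deterministic episodes per task
        return {"task_id": task_id, "steps": 0}

    def step(self, obs: dict, action: str) -> tuple[dict, bool]:
        obs["steps"] += 1
        done = action == "finish" or obs["steps"] >= 50  # 50-step budget
        return obs, done

class Agent:
    """Hypothetical planner; a real PARTNR baseline would be an LLM."""
    def act(self, obs: dict) -> str:
        return "finish" if random.random() < 0.1 else "explore"

def evaluate(env: SimEnv, agent: Agent, num_tasks: int = 1000) -> float:
    """Benchmark-style loop: success = finishing within the step budget."""
    successes = 0
    for task_id in range(num_tasks):
        obs = env.reset(task_id)
        done = False
        while not done:
            obs, done = env.step(obs, agent.act(obs))
        successes += obs["steps"] < 50
    return successes / num_tasks

print(f"success rate: {evaluate(SimEnv(), Agent()):.1%}")
```

The real benchmark swaps the toy environment for Habitat's physics-backed scenes and scores richer criteria than a step budget, but the reset-act-aggregate loop is the same shape.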
Meta's investment in tactile sensing and dexterous robots brings us closer to a future where AI isn't only about text generation or content recognition; it's about real, human-like interaction with the physical world. Robots that can sense texture, pressure, and movement could be game-changing in sectors like healthcare, manufacturing, and logistics. Consider a future where a robot's sense of touch allows it to assist a surgeon in delicate operations or handle goods in a supply chain with a sensitivity that is currently out of reach.
Beyond industrial applications, these technologies could bring new dimensions to consumer experiences. Imagine picking up an object in a virtual game and actually feeling its texture or scanning a piece of fabric while online shopping to experience its material properties firsthand. This would also pave the way for significant advancements in prosthetics, providing a more natural and responsive experience for users.
As GelSight CEO Youssef Benmokhtar notes, the goal is to make tactile sensing ubiquitous and accessible. Through these partnerships and an open approach to sharing data, software, and designs, Meta is positioning itself at the forefront of an evolving field with immense implications for how AI will integrate into our daily lives. By bridging the gap between the digital and physical realms, Meta's innovations are making the once-futuristic vision of robots that "feel" increasingly tangible.