Interesting Engineering

    No skin needed: Robots learn to feel human touch with internal sensors, AI

By Jijo Malayil

    22 hours ago


    Researchers have developed a method to give robots an innate sense of touch by integrating their internal force-torque sensors with machine learning algorithms.

The method, developed by a team at Germany’s Deutsches Zentrum für Luft- und Raumfahrt (German Aerospace Center, DLR), enables robots to sense and interpret human touch without requiring costly synthetic biomimetic skins or extra sensors covering their surface.

    The team used high-resolution internal sensors and deep learning to give a robotic arm a full-body sense of touch and accurately detect force details.

“The intrinsic sense of touch we proposed in this work can serve as the basis for an advanced category of physical human-robot interaction that has not been possible yet, enabling a shift from conventional modalities towards adaptability, flexibility, and intuitive handling,” said the researchers in a statement.

    The robot’s touch sense, achieved through internal sensors and machine learning, accurately interprets writing/drawing and enables various interactions without extra sensors.

    Human-robot interaction

    Modern robotic systems’ capabilities are advancing rapidly, making them promising collaborators in diverse fields such as manufacturing, space exploration, healthcare, and daily life assistance.

The integration of human problem-solving and reasoning with robotic precision is an active research area, particularly in human-robot interaction (HRI).

    Various HRI modalities, such as vision-based, voice-recognition, and physical-contact approaches, have been explored, yet achieving intuitive physical interaction remains challenging.

    Enabling robots with a sense of touch is crucial for safe and efficient interactions, allowing for precise identification of physical contacts. According to researchers, traditional force-torque sensors are used for control, but explicit tactile sensing is necessary for detailed contact information.

    Although advances have been made with tactile skins and sensors, challenges remain in coverage, wiring, robustness, and real-time capability.

According to researchers, robots must be equipped with robust, sensitive force sensors to interact physically with humans. This can become costly and complex when dealing with large or curved robotic surfaces.

    Tactile breakthrough

    The researchers at DLR’s Institute of Robotics and Mechatronics (RMC) used integrated sensors to equip a robot with built-in tactile capabilities.

They overcame these difficulties by using equipment already built into the Safe Autonomous Robotic Assistant (SARA) system, a robotic arm whose joints and base contain force-torque sensors used for position sensing and motion control.

    This allows the robot to detect and respond to physical interactions without needing external touch sensors. Using the sensors, the robot can detect where and in what order different forces were applied to its surface.
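The underlying principle can be illustrated with a standard robotics relation: an external force F applied to the arm shows up in the joint torque sensors as τ = Jᵀ(q)F, where J(q) is the arm’s Jacobian, so the contact force can be recovered by inverting that map. Below is a minimal sketch for a hypothetical planar two-link arm with a known tip contact; it is an illustration of the idea, not DLR’s implementation, and the link lengths and joint angles are invented for the example.

```python
import math

def jacobian_2link(q1, q2, l1=0.5, l2=0.4):
    # Position Jacobian of a planar two-link arm (end-effector x, y),
    # with hypothetical link lengths l1, l2 in meters.
    s1, c1 = math.sin(q1), math.cos(q1)
    s12, c12 = math.sin(q1 + q2), math.cos(q1 + q2)
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12]]

def tip_force_from_torques(tau, q1, q2):
    # Joint torque residuals satisfy tau = J(q)^T F for a tip contact,
    # so the contact force is F = (J^T)^{-1} tau in this 2x2 case.
    J = jacobian_2link(q1, q2)
    Jt = [[J[0][0], J[1][0]], [J[0][1], J[1][1]]]
    det = Jt[0][0] * Jt[1][1] - Jt[0][1] * Jt[1][0]
    fx = ( Jt[1][1] * tau[0] - Jt[0][1] * tau[1]) / det
    fy = (-Jt[1][0] * tau[0] + Jt[0][0] * tau[1]) / det
    return fx, fy
```

A real system must also localize the contact point along the arm (which changes the Jacobian) and separate contact torques from the arm’s own dynamics, which is where the study’s learning component comes in.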

    The robot recognizes written digits and executes tasks based on touch trajectories. Virtual buttons on its surface can also trigger commands.

They then combined this ability with deep learning algorithms to interpret the applied touch, demonstrating that the robot could recognize numbers or letters traced on its surface, with neural networks predicting each character.
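The recognition step can be pictured as classifying the 2D trajectory a fingertip traces on the surface. The toy sketch below uses a nearest-template matcher over resampled, normalized trajectories; this is a deliberately simple stand-in for the deep neural networks used in the study, and the templates are invented for the example.

```python
def normalize(traj, n=8):
    # Resample a touch trajectory [(x, y), ...] to n points by index and
    # scale it into a unit box, so strokes of different lengths and sizes
    # become comparable point sequences.
    pts = [traj[round(i * (len(traj) - 1) / (n - 1))] for i in range(n)]
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    w = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - min(xs)) / w, (y - min(ys)) / w) for x, y in pts]

def classify(traj, templates):
    # templates: {label: trajectory}; return the label whose normalized
    # template has the smallest summed squared point distance to the input.
    t = normalize(traj)
    def dist(label):
        u = normalize(templates[label])
        return sum((a - c) ** 2 + (b - d) ** 2
                   for (a, b), (c, d) in zip(t, u))
    return min(templates, key=dist)
```

A learned classifier replaces the hand-made templates with features fitted from many example strokes, which is what makes recognition robust to the messy trajectories real touches produce.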

    The team also extended this mechanism to include virtual “buttons” or sliders on the robot’s surfaces that could be used to trigger specific commands or movements.
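A virtual button of this kind reduces to mapping a detected contact location to a command. The sketch below assumes contacts are reported as a link name plus a position along that link; both the coordinate scheme and the commands are hypothetical illustrations, not the study’s interface.

```python
# Each "button" is a region on the arm's surface, keyed by a link name
# and a (lo, hi) interval along that link (hypothetical coordinates).
BUTTONS = {
    ("link3", (0.10, 0.20)): "open_gripper",
    ("link3", (0.30, 0.40)): "close_gripper",
}

def dispatch(link, s):
    # Return the command bound to the touched region, or None if the
    # contact falls outside every button.
    for (btn_link, (lo, hi)), command in BUTTONS.items():
        if link == btn_link and lo <= s <= hi:
            return command
    return None
```

A slider works the same way, except the position within the region is passed on as a continuous parameter instead of a discrete command.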

    Researchers claim the approach endows the system with an intuitive and accurate sense of touch and increases the range of possible physical human-robot interactions.

The details of the team’s research were published in the journal Science Robotics.
