
    Flying robot can navigate collapsed structures easily, aid disaster response

By Jijo Malayil

    22 hours ago


Researchers have devised an innovative system for autonomous aerial robot exploration and multi-robot coordination inside collapsed structures.

    The novel approach created by a team at Carnegie Mellon University’s Robotics Institute (RI) could assist first responders in gathering data and making more intelligent judgments following a disaster.

    The research focused on minimizing redundant exploration efforts, optimizing resources, and improving the efficiency of the discovery process.

“Since this is multirobot exploration, coordination and communication among robots is vital. We designed this system so each robot explores different rooms, maximizing the rooms a set number of drones could explore,” said Seungchan Kim, a PhD student at RI, in a statement.

    Advanced rescue drones

    Each year, around 100 earthquakes worldwide cause damage, including collapsed buildings and downed power lines. For first responders, assessing the scene and prioritizing rescue efforts is both critical and hazardous.

    In this new method, drones prioritize detecting doors quickly, as significant targets like people are more likely to be found in rooms than in corridors. To identify these crucial entryways, the robots utilize an onboard lidar sensor to analyze the geometric properties of their environment.

Hovering about six feet above the ground, the drones transform 3D lidar point cloud data into a 2D map. This map represents the layout of the space as an image composed of cells, or pixels, which the drones then examine for structural features indicating doors and rooms.
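To make the 3D-to-2D step concrete, here is a minimal sketch of one way such a projection could work: lidar points within a height band around the drone are flattened onto a grid of cells, with any cell containing a point marked as occupied. The function name, parameters, and grid conventions are illustrative assumptions, not the CMU team's actual implementation.

```python
# Illustrative sketch only: flatten a 3D lidar point cloud into a 2D
# occupancy grid by keeping points in a vertical slice around the drone
# and marking the cells they fall into. Names/parameters are hypothetical.
import numpy as np

def pointcloud_to_grid(points, resolution=0.1, z_min=0.2, z_max=2.0,
                       grid_size=(200, 200), origin=(-10.0, -10.0)):
    """points: (N, 3) array of lidar returns in the map frame.
    Returns a 2D uint8 grid where 1 = occupied cell, 0 = free/unknown."""
    grid = np.zeros(grid_size, dtype=np.uint8)
    # Keep only points in a height band so floor and ceiling are ignored.
    mask = (points[:, 2] > z_min) & (points[:, 2] < z_max)
    xy = points[mask, :2]
    # Convert metric coordinates to integer cell indices.
    cells = np.floor((xy - np.array(origin)) / resolution).astype(int)
    # Discard points that fall outside the grid bounds.
    valid = ((cells[:, 0] >= 0) & (cells[:, 0] < grid_size[0]) &
             (cells[:, 1] >= 0) & (cells[:, 1] < grid_size[1]))
    grid[cells[valid, 0], cells[valid, 1]] = 1
    return grid
```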

    Near the drone, walls are represented as occupied pixels, while an open door or passageway appears as empty pixels. The researchers modeled doors as saddle points in the data, enabling the drones to recognize passageways and navigate through them swiftly. Upon entering a room, the robot’s position is visualized as a circle on the map.
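The saddle-point idea can be pictured as a free cell flanked by wall cells along one axis and open space along the perpendicular axis, like a gap in a wall. The sketch below encodes that intuition as a simple neighborhood test on the 2D grid; the actual criterion in the paper may differ, and the names and the gap parameter are assumptions for illustration.

```python
# Illustrative only: flag free cells that look like a gap in a wall --
# occupied neighbours along one axis, free neighbours along the other.
# This mimics the "saddle point" idea; the paper's exact test may differ.
def detect_door_cells(grid, gap=3):
    """grid: 2D uint8 occupancy grid (1 = wall). Returns (row, col) candidates."""
    rows, cols = grid.shape
    doors = []
    for r in range(gap, rows - gap):
        for c in range(gap, cols - gap):
            if grid[r, c] == 1:
                continue  # a door cell itself must be free space
            walls_x = grid[r, c - gap] == 1 and grid[r, c + gap] == 1
            free_y = grid[r - gap, c] == 0 and grid[r + gap, c] == 0
            walls_y = grid[r - gap, c] == 1 and grid[r + gap, c] == 1
            free_x = grid[r, c - gap] == 0 and grid[r, c + gap] == 0
            # Saddle-like pattern: walls on either side in one direction,
            # open space on either side in the perpendicular direction.
            if (walls_x and free_y) or (walls_y and free_x):
                doors.append((r, c))
    return doors
```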

According to the team, the approach allows the drones to efficiently locate and enter rooms, improving their ability to find survivors and assist in rescue operations.

    Lidar boosts efficiency

There are two primary reasons the researchers chose a lidar sensor instead of a camera.

First, a lidar sensor requires less processing power than a camera. Second, a standard camera would struggle to see in the dusty or smoky environments commonly found inside collapsed buildings or at the scenes of natural disasters.

    The robots are not controlled by a single base station. Instead, each robot communicates with the other robots and uses its knowledge of the surroundings to make decisions and choose the best paths.

The aerial robots share the list of doors and rooms they have already examined and use this information to avoid revisiting places, as sketched below.
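One simple way to realize this kind of decentralized bookkeeping is for each robot to keep a set of visited room identifiers, merge in the sets broadcast by its peers, and then pick the nearest room nobody has claimed. This is a hedged sketch of the coordination idea described above, not the authors' implementation; the class, method names, and message format are assumptions.

```python
# Hypothetical sketch of decentralized coordination: each robot tracks
# which rooms the fleet has covered and claims the nearest unvisited one.
import math

class ExplorerRobot:
    def __init__(self, robot_id):
        self.robot_id = robot_id
        self.visited = set()  # room/door identifiers covered by the fleet

    def merge_peer_report(self, peer_visited):
        """Fold in the visited set broadcast by another robot."""
        self.visited |= set(peer_visited)

    def choose_next_room(self, position, candidate_rooms):
        """candidate_rooms: dict room_id -> (x, y) door location on the 2D map.
        Returns the closest room no robot has claimed yet, or None."""
        unvisited = {rid: p for rid, p in candidate_rooms.items()
                     if rid not in self.visited}
        if not unvisited:
            return None
        rid = min(unvisited, key=lambda r: math.dist(position, unvisited[r]))
        self.visited.add(rid)  # claim it so peers skip this room
        return rid
```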

    Tests revealed that the team’s pipeline efficiently assigns tasks among robots and supports methodical room exploration.

    According to researchers, performance evaluations with up to three aerial robots demonstrated that their method outperforms the baseline by 33.4 percent in simulations and 26.4 percent in real-world experiments.

Because the approach flattens 3D voxels into a 2D map, it is limited to single-story buildings, and it relies on implicit rather than explicit collision avoidance.

    Future work aims to extend it to multi-floor buildings, explore heterogeneous multi-robot systems, and enhance these systems with advanced vision and learning modules.

    The details of the team’s research were published in the journal IEEE Robotics and Automation Letters.
