Digital Camera World

Ground-breaking new camera replicates movement of the human eye to improve image sharpness

By Leonie Helm

    11 hours ago


    Scientists from the University of Maryland have taken inspiration from the human eye and developed a new camera system.

The design mimics the minuscule involuntary movements of the human eye to maintain clear and stable vision.

The camera is called the Artificial Microsaccade-Enhanced Event Camera (AMI-EV), and the scientific paper is published in Science Robotics.

Lead author of the research paper Botao He, a computer science PhD student at the University of Maryland, said:

    “Event cameras are a relatively new technology better at tracking moving objects than traditional cameras, but today’s event cameras struggle to capture sharp, blur-free images when there’s a lot of motion involved.

    “It’s a big problem because robots and many other technologies - such as self-driving cars - rely on accurate and timely images to react correctly to a changing environment. So, we asked ourselves: how do humans and animals make sure their vision stays focused on a moving object?”


    A diagram depicting the novel camera system (AMI-EV). Image courtesy of the UMIACS Computer Vision Laboratory (Image credit: University of Maryland / College of Computer, Mathematical, and Natural Sciences)

    The answer lies in microsaccades, small, rapid eye movements that happen when a person is trying to focus their gaze on a stationary object. Through these movements, the human eye can stay accurately focused on an object and its visual textures, such as color, depth, and shadowing, for prolonged periods of time.

This research could be key to helping robots, self-driving cars, and camera phones see more clearly.

The research team replicated microsaccades by inserting a rotating prism inside the AMI-EV to redirect the light beams captured by the lens.

The prism's continuous rotation simulates the movements that naturally occur in the eye, enabling the camera to stabilize the textures of a recorded object.
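The idea can be pictured with a toy sketch: if software knows the shift the rotating prism imposes on the image at any instant, it can map each recorded event back to where it would have landed without the prism. All names and parameter values below are illustrative assumptions, not details from the paper.

```python
import math

# Assumed (illustrative) parameters: amplitude and angular speed of the
# circular image-plane shift a rotating wedge prism would introduce.
SHIFT_RADIUS_PX = 5.0            # assumed shift amplitude, in pixels
OMEGA_RAD_S = 2 * math.pi * 100  # assumed prism rotation rate (100 Hz)

def prism_offset(t):
    """Image-plane shift (dx, dy) caused by the rotating prism at time t."""
    return (SHIFT_RADIUS_PX * math.cos(OMEGA_RAD_S * t),
            SHIFT_RADIUS_PX * math.sin(OMEGA_RAD_S * t))

def compensate_event(x, y, t):
    """Map an event back to the coordinates it would have without the prism."""
    dx, dy = prism_offset(t)
    return (x - dx, y - dy)

# A stationary scene point observed at two different prism phases should map
# back to the same compensated location.
t0, t1 = 0.0, 0.0025  # a quarter of a 100 Hz rotation apart
x0, y0 = compensate_event(100 + prism_offset(t0)[0], 50 + prism_offset(t0)[1], t0)
x1, y1 = compensate_event(100 + prism_offset(t1)[0], 50 + prism_offset(t1)[1], t1)
```

The point of the sketch is only that a known, deterministic motion can be subtracted out in software, which is why deliberately shaking the image (as microsaccades do) can sharpen rather than blur the result.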

    He said:

    “We figured that just like how our eyes need those tiny movements to stay focused, a camera could use a similar principle to capture clear and accurate images without motion-caused blurring.”

    A demonstration of how microsaccades counteract visual fading. After a few seconds of fixation (staring) on the red spot in this static image, the background details of this image begin to visually fade. This is because microsaccades have been suppressed during this time and the eye cannot provide effective visual stimulation to prevent peripheral fading. Image courtesy of the UMIACS Computer Vision Laboratory (Image credit: University of Maryland / College of Computer, Mathematical, and Natural Sciences)

While still in the early stages of development, the technology could mark an advance for camera phones, robotics, and self-driving cars, which can struggle to reliably detect humans.

    Researchers also believe that the technology could be hugely beneficial in a wide range of industries that rely on accurate image capture and shape detection.

Research scientist and senior author of the paper, Cornelia Fermüller, said:

    “With their unique features, event sensors and AMI-EV are poised to take center stage in the realm of smart wearables.

    “They have distinct advantages over classical cameras—such as superior performance in extreme lighting conditions, low latency and low power consumption. These features are ideal for virtual reality applications, for example, where a seamless experience and the rapid computations of head and body movements are necessary.”
