
    How Miami University of Ohio Went Virtual (Production)

    By AVNetwork Staff


    Miami University in Ohio recently expanded its facilities with the McVey Data Science Building, which opened in March 2024 and houses the Department of Emerging Technology in Business & Design (ETBD). The school turned to Stage Precision for its Immersive and Reactive Lab and XR Stage.

    The Immersive and Reactive Lab and XR Stage is one of the facilities within the new building, home to a 50x70-foot stage and a 44x16-foot ROE LED wall. The integration of advanced technologies, including Stage Precision software, has helped establish the new lab as a hub for exploring the boundaries of virtual production (VP) and extended reality (XR) experiences.


    “The groups are learning everything from motion design to creating visuals for live music," explained Benjamin Nicholson, assistant teaching professor and Immersive and Reactive Lab and XR Stage Director at Miami University. "They utilize immersive and reactive tools such as Notch, TouchDesigner, and Unreal Engine to make virtual production stages; this is also where they are learning how to use SP from Stage Precision.”

    Nicholson has been an advocate of Stage Precision workflows for several years, having discovered the software during previous work on live events projects. “In the industry, people are talking about Stage Precision and the things that can be achieved through the unified workflow it provides,” continued Nicholson. “In the context of the lab, SP allows us to take the control and management out of several individual native softwares and hardwares and put them all into a single interface that can be used for calibration and control.”
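    Nicholson's point about pulling control and calibration of separate devices into one interface can be pictured as a thin abstraction layer over heterogeneous hardware. The sketch below is not Stage Precision code or its API; it is a hypothetical Python illustration of the general pattern, where a camera tracker and an LED processor are wrapped behind a common calibrate-and-control interface and driven from a single hub. All class and device names are invented for illustration.

```python
# Hypothetical sketch of a unified control layer; not Stage Precision's actual API.
from abc import ABC, abstractmethod


class Device(ABC):
    """Common interface that each native device driver is wrapped behind."""

    def __init__(self, name: str):
        self.name = name

    @abstractmethod
    def calibrate(self) -> None: ...

    @abstractmethod
    def set_parameter(self, key: str, value: float) -> None: ...


class CameraTracker(Device):
    def calibrate(self) -> None:
        print(f"[{self.name}] zeroing tracker origin")

    def set_parameter(self, key: str, value: float) -> None:
        print(f"[{self.name}] {key} -> {value}")


class LedProcessor(Device):
    def calibrate(self) -> None:
        print(f"[{self.name}] measuring wall brightness and color")

    def set_parameter(self, key: str, value: float) -> None:
        print(f"[{self.name}] {key} -> {value}")


class ControlHub:
    """Single interface for calibration and control across all registered devices."""

    def __init__(self):
        self.devices: dict[str, Device] = {}

    def register(self, device: Device) -> None:
        self.devices[device.name] = device

    def calibrate_all(self) -> None:
        for device in self.devices.values():
            device.calibrate()

    def send(self, device_name: str, key: str, value: float) -> None:
        self.devices[device_name].set_parameter(key, value)


hub = ControlHub()
hub.register(CameraTracker("tracker-1"))
hub.register(LedProcessor("led-wall"))
hub.calibrate_all()
hub.send("led-wall", "brightness", 0.8)
```

    The value of the pattern is that operators and students work against one interface while the device-specific details stay inside each wrapper.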


    (Image credit: Stage Precision)

    The environment at the Immersive and Reactive Lab includes a ROE LED wall, Disguise media servers, nDisplay workflows, and other media pipelines for two cameras. The main camera is a RED Komodo with six Zeiss prime lenses and a Canon zoom lens. “The most pivotal thing about SP is the lens calibration features. We built lens profiles in SP which take the data input from RedSpy for optical tracking,” explained Nicholson. The second camera offers another example of how easily different systems integrate into SP.
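    To make the lens-profile idea concrete, the following is a hedged sketch of what such a profile can do conceptually: raw zoom encoder values coming from a camera-tracking system are interpolated against calibrated samples to recover field of view and distortion for the render engine. The `LensProfile` class, field names, and all numeric values are invented for illustration and do not reflect SP's internal data model or RedSpy's packet format.

```python
# Hypothetical lens-profile lookup; sample values are invented for illustration.
from bisect import bisect_left
from dataclasses import dataclass


@dataclass
class LensSample:
    zoom_encoder: int      # raw zoom encoder value from the tracking system
    fov_degrees: float     # calibrated horizontal field of view
    k1: float              # first radial distortion coefficient


class LensProfile:
    """Interpolates calibrated samples to map encoder values to lens data."""

    def __init__(self, samples: list[LensSample]):
        self.samples = sorted(samples, key=lambda s: s.zoom_encoder)

    def lookup(self, zoom_encoder: int) -> tuple[float, float]:
        keys = [s.zoom_encoder for s in self.samples]
        i = bisect_left(keys, zoom_encoder)
        if i == 0:
            s = self.samples[0]
            return s.fov_degrees, s.k1
        if i == len(self.samples):
            s = self.samples[-1]
            return s.fov_degrees, s.k1
        lo, hi = self.samples[i - 1], self.samples[i]
        t = (zoom_encoder - lo.zoom_encoder) / (hi.zoom_encoder - lo.zoom_encoder)
        fov = lo.fov_degrees + t * (hi.fov_degrees - lo.fov_degrees)
        k1 = lo.k1 + t * (hi.k1 - lo.k1)
        return fov, k1


# Made-up calibration table with three sampled zoom positions.
profile = LensProfile([
    LensSample(0, 62.0, -0.11),
    LensSample(2048, 41.5, -0.07),
    LensSample(4095, 24.0, -0.03),
])
fov, k1 = profile.lookup(1500)
print(f"fov={fov:.1f} deg, k1={k1:.3f}")
```

    In a real pipeline the interpolated lens data would be sent alongside the camera pose so the virtual camera in the render engine matches the physical one frame by frame.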


    When it comes to training the next generation of creative production professionals, the facilities at Miami University are exploring new ways of working that leverage the freedom and flexibility made possible by SP. “For the learners that can already understand the significance of a tool like Stage Precision, they’re enthusiastic about using it in different ways,” said Nicholson. “It’s an advanced program, but training in this from the start will give students a high level of knowledge and understanding that they can use in the real world.”

    Currently, students at the lab are experimenting with a TouchDesigner VP workflow, using SP as the hub for feeding lens and tracking data into the workflow. Additionally, the versatility gained through having a single source of truth in SP sets the new facilities at Miami University apart. “With SP we can run several parallel systems at any one time. We can run a Disguise virtual production, set up a TouchDesigner system or anything else, run them at the same time and switch between them,” said Nicholson. “We have multiple users who can change the SP interface from up to 15 different computers. What Stage Precision is doing is providing a single point of tracking distribution to all the different media servers at once.”
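    The "single point of tracking distribution" Nicholson describes can be pictured as a simple fan-out: one process receives tracking packets and forwards each one to every subscribed media server, whether that is a Disguise server, an nDisplay/Unreal render node, or a TouchDesigner workstation. The sketch below is a hypothetical, minimal Python stand-in for that role; the addresses, ports, and packet handling are assumptions for illustration, not SP's actual protocol.

```python
# Hypothetical UDP fan-out of tracking data to several media servers.
# Addresses, ports, and packet format are placeholders, not SP's actual protocol.
import socket

LISTEN_ADDR = ("0.0.0.0", 6301)          # where tracking packets arrive
DESTINATIONS = [
    ("10.0.0.11", 6301),                 # e.g. Disguise media server
    ("10.0.0.12", 6301),                 # e.g. nDisplay / Unreal render node
    ("10.0.0.13", 6301),                 # e.g. TouchDesigner workstation
]

rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(LISTEN_ADDR)
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

print(f"relaying tracking packets from {LISTEN_ADDR} to {len(DESTINATIONS)} servers")
while True:
    packet, _src = rx.recvfrom(2048)     # one tracking sample per datagram
    for dest in DESTINATIONS:
        tx.sendto(packet, dest)          # every server receives identical data
```

    Because every downstream system receives the same stream, parallel Disguise, Unreal, and TouchDesigner setups can run side by side and be switched between without re-patching the tracking source.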


    The lab is one of the first university VP and XR facilities to integrate SP into the heart of its curriculum, and Nicholson believes that learning these skills will benefit students after graduation and throughout their careers. “SP has removed the need for technical calibration of a single source and allows students to learn the process outside of a native software package, making it a helpful foundational knowledge base that teaches skills that are transferable across different software and technologies.”

    A close relationship with the Stage Precision team has also provided ongoing support for the integration of an SP-based workflow at the lab. During onboarding, the Stage Precision team visited the lab to help configure the workflow best suited to the projects and learners at the university.

    “SP allows us to have visibility over things that may or may not be happening in the background network, whether it’s time code, tracking or any sort of calibrations,” concluded Nicholson. “Combined with the ability to build custom interfaces so you have complete control over your space and your show, there’s nothing else on the market that can do that. With SP, our systems feel manageable, stable and controllable.”
