by Staff Writers
Washington DC (SPX) Jun 19, 2020
Google is taking immersive media technology to the next level, demonstrating a practical system for light field video. Wide field-of-view scenes can be recorded and played back with the ability to move around within the video after it has been captured, revealing new perspectives.

Developed by a team of leading research scientists and engineers, the new system can record, reconstruct, compress, and deliver high-quality immersive light field video lightweight enough to be streamed over regular Wi-Fi, advancing the state of the art in the rapidly emerging field of immersive augmented reality (AR) and virtual reality (VR) platforms.

In recent years, the immersive AR/VR field has captured mainstream attention for its promise to give people a truly authentic experience in a simulated environment. Want to really feel like you're standing among the Redwoods at Yosemite rather than sitting in the living room? Or watch an artist create a sculpture as if you're with them in the studio? That could be possible with immersive AR/VR technology. Although the field is still nascent, the team at Google has addressed important challenges, making major research headway in immersive light field video.

The research team, led by Michael Broxton, Google research scientist, and Paul Debevec, Google senior staff engineer, plans to demonstrate the new system at SIGGRAPH 2020. The conference, which will take place virtually this year, brings together a wide variety of professionals who approach computer graphics and interactive techniques from different perspectives, and it continues to serve as the industry's premier venue for showcasing forward-thinking ideas and research.

"This is the latest culmination of our work in light fields," says Broxton, a lead author of the research. "We're making this technology practical, bringing us closer to delivering a truly immersive experience to more consumer devices. Photos and videos play a huge role in our day-to-day experience on mobile devices, and we are hoping that someday immersive light field images and videos will play an equally important role in future AR and VR platforms."

At SIGGRAPH 2018, Google researchers showcased similar work when they introduced photorealistic light field still images in immersive VR. The new system adds another key piece to the immersive media puzzle: video. Light field videos give users a more dynamic virtual environment, with panoramic views of scenes that span more than 180 degrees. They allow users to peek around corners and enjoy a greater sense of depth while in the virtual world, and the system can capture content that was challenging for earlier methods, such as reflective surfaces. This translates into a more realistic environment; for instance, sunlight reflecting on ocean waves or light glinting off the shiny hood of a car shifts naturally with the user's gaze, as it would in real life.

The team records immersive light field videos with a low-cost rig consisting of 46 action sports cameras mounted to a lightweight acrylic dome. Using DeepView, a machine learning algorithm developed last year by members of the same Google research team, they combine the video streams from each camera into a single 3D representation of the scene being recorded. The paper introduces a new "layered mesh" representation consisting of a series of concentric layers with semi-transparent textures; rendering these layers from back to front brings the scene vividly and realistically to life.
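To make the back-to-front rendering idea concrete, the short Python sketch below composites a stack of semi-transparent RGBA layers with the standard "over" operator. It is only a rough illustration of that compositing step, not the paper's actual renderer: the array shapes, the premultiplied-alpha convention, and the function name are assumptions made for the example.

```python
import numpy as np

def composite_back_to_front(layers):
    """Blend a stack of semi-transparent RGBA layers, farthest layer first.

    Each entry in `layers` is a float32 array of shape (H, W, 4) holding
    premultiplied-alpha color. Both the shapes and the premultiplied
    convention are assumptions made for this illustration only.
    """
    height, width, _ = layers[0].shape
    out = np.zeros((height, width, 3), dtype=np.float32)
    for layer in layers:                     # back (farthest) to front (nearest)
        rgb, alpha = layer[..., :3], layer[..., 3:4]
        out = rgb + (1.0 - alpha) * out      # standard "over" compositing
    return out
```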
This layered-mesh approach solves the very difficult problem of synthesizing viewpoints that were never captured by the cameras in the first place, enabling the user to experience a natural range of head movement as they explore light field video content.

Another breakthrough in this work involves data compression. The goal was not only to build a system capable of reconstructing video for a truly immersive AR/VR experience, but also to make that experience accessible on consumer AR and VR headsets and displays, and even in a web browser. The new system compresses light field video while preserving its original visual quality, and it does so using conventional texture atlasing and widely supported video codecs. In essence, the team has bootstrapped a next-generation media format off of today's image and video compression techniques (see the sketch below).

"Users will be able to stream this light field video content over a typical, fast-speed internet connection," Broxton says. "Overcoming this problem opens up this technology to a much wider audience."

Debevec sums up the work: "Completing this project feels like we've overcome a major obstacle in making virtual experiences realistic, immersive, distributable, and comfortable. I can't wait to have the experiences the creative AR and VR community will make with this."
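As a loose illustration of the texture-atlasing idea mentioned above, the sketch below tiles per-layer textures into a single frame per time step, which a conventional video codec could then encode as an ordinary video track. The grid layout, tile count, and function name are made up for the example and are not the parameters used in the actual system.

```python
import numpy as np

def pack_atlas(layer_textures, tiles_per_row=4):
    """Tile equally sized per-layer RGBA textures into one atlas frame.

    `layer_textures` is a list of (H, W, 4) uint8 arrays; packing every
    layer into a single image per time step lets a standard video encoder
    compress the whole layer stack frame by frame.
    """
    h, w, c = layer_textures[0].shape
    rows = -(-len(layer_textures) // tiles_per_row)  # ceiling division
    atlas = np.zeros((rows * h, tiles_per_row * w, c), dtype=np.uint8)
    for i, tex in enumerate(layer_textures):
        r, col = divmod(i, tiles_per_row)
        atlas[r * h:(r + 1) * h, col * w:(col + 1) * w] = tex
    return atlas
```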
Research Report: "Immersive Light Field Video With a Layered Mesh Representation"