Meta revealed two major virtual reality developments at Wednesday’s Connect 2025 conference: Hyperscape Capture transforms physical locations into photorealistic digital environments, while Horizon TV brings a dedicated streaming hub to Quest headsets. These launches demonstrate the company’s ongoing commitment to metaverse development despite its recent pivot toward AI-enabled eyewear.
Scanning Physical Rooms into Photorealistic Virtual Spaces
The company released Hyperscape Capture in early access for Quest 3 and Quest 3S users aged 18 and older, marking the first time consumers can generate their own photorealistic VR environments. The technology uses Gaussian splatting, letting users scan a room in minutes with their headset cameras, with cloud-based processing delivering an explorable VR replica within two to four hours.
Meta’s demonstration revealed a two-phase capture workflow: users first scan their room to establish a scene mesh, then walk through the space, moving closer to objects to capture finer detail. The resulting digital replicas stream from Meta’s servers using technology the company refers to as Project Avalanche.
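For readers who want a concrete picture of that flow, the sketch below outlines how a two-phase capture loop of this kind could be structured on the device side. It is a minimal illustration built on a hypothetical headset interface; the type and method names are assumptions, not Meta’s SDK.

```python
from dataclasses import dataclass, field
from typing import List, Protocol


class Headset(Protocol):
    """Hypothetical device interface, for illustration only (not Meta's SDK)."""
    def build_scene_mesh(self) -> bytes: ...
    def capture_complete(self) -> bool: ...
    def read_camera_frame(self) -> bytes: ...


@dataclass
class RoomScan:
    mesh: bytes                                        # coarse geometry from the first pass
    frames: List[bytes] = field(default_factory=list)  # close-up detail observations


def capture_room(headset: Headset) -> RoomScan:
    # Phase 1: a quick walkthrough establishes the scene mesh that anchors the capture.
    scan = RoomScan(mesh=headset.build_scene_mesh())

    # Phase 2: the user walks the room, moving closer to objects so each camera
    # frame adds finer detail about texture, lighting, and geometry.
    while not headset.capture_complete():
        scan.frames.append(headset.read_camera_frame())

    # The raw capture is handed off for cloud-side reconstruction rather than
    # being processed on the headset itself.
    return scan
```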
“This represents the initial step toward realizing our vision of photorealistic social telepresence,” Meta said in its announcement. The company showcased celebrity locations including Gordon Ramsay’s Los Angeles kitchen, Chance the Rapper’s House of Kicks studio, and the UFC octagon at the Las Vegas Apex facility.

What Users Can Expect from Hyperscape Technology
Currently, users can only view their personal scans, though Meta plans to introduce private link sharing capabilities soon, along with integration into Horizon Worlds. The scanning process leverages the Quest headset’s existing camera array without requiring additional hardware.
Gaussian splatting is a significant advance in 3D scene reconstruction, offering higher visual fidelity than traditional mesh-based approaches: instead of rasterizing textured polygons, it represents a scene as a dense cloud of view-dependent 3D Gaussians, which preserves lighting, textures, and fine spatial detail with notable accuracy.
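In the published 3D Gaussian splatting literature, each primitive carries a position, a covariance (usually stored as scale plus rotation), a color, and an opacity, and the splats are blended in depth order at render time. The sketch below shows that per-splat layout in minimal form; Meta has not documented Hyperscape’s internal representation, so the field names here are assumptions drawn from the open literature.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class GaussianSplat:
    """One primitive in a Gaussian-splat scene (field names are assumptions)."""
    position: np.ndarray   # (3,) center of the Gaussian in scene coordinates
    scale: np.ndarray      # (3,) per-axis extent; with rotation, defines the covariance
    rotation: np.ndarray   # (4,) unit quaternion (w, x, y, z) orienting the Gaussian
    color: np.ndarray      # (3,) RGB; real systems often store spherical harmonics instead
    opacity: float         # blending weight used when splats are composited in depth order


def covariance(splat: GaussianSplat) -> np.ndarray:
    """Build the 3x3 covariance matrix R S S^T R^T from the stored scale and rotation."""
    w, x, y, z = splat.rotation
    R = np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])
    S = np.diag(splat.scale)
    return R @ S @ S.T @ R.T
```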
Cloud processing handles the computational demands of converting raw scan data into streamable VR environments. The two-to-four-hour processing window allows Meta’s servers to optimize the captured data for smooth playback across Quest devices.
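Conceptually, the handoff behaves like an asynchronous job: the headset uploads the raw capture, the cloud reconstructs and optimizes it, and the finished scene is later streamed back rather than stored locally. The sketch below illustrates that shape using hypothetical helper callables; Meta has not published the actual upload or streaming API.

```python
import time
from typing import Callable


def await_cloud_replica(
    scan: object,
    upload: Callable[[object], str],
    poll_status: Callable[[str], str],
    stream_url: Callable[[str], str],
) -> str:
    """Illustrative asynchronous handoff; the helpers are hypothetical, not Meta's API."""
    job_id = upload(scan)                  # push the raw capture data to the cloud

    # Meta quotes a two-to-four-hour turnaround while servers reconstruct and
    # optimize the scene, so the client simply polls until the job is ready.
    while poll_status(job_id) != "ready":
        time.sleep(600)                    # check back every ten minutes

    # The finished replica streams from Meta's servers instead of being stored
    # on the headset.
    return stream_url(job_id)
```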
VR Streaming Enters a New Phase
Alongside Hyperscape Capture, Meta introduced Horizon TV as a dedicated streaming platform for Quest users. This positions the company to compete more directly in the entertainment space, transforming Quest headsets into versatile viewing devices beyond gaming and productivity applications.
The timing of these announcements signals Meta’s strategy to expand VR use cases while maintaining infrastructure investments in metaverse technologies. Despite public attention shifting toward AI-powered smart glasses, the company continues advancing its virtual reality ecosystem with consumer-focused features.
Celebrity partnerships for Hyperscape demonstrations serve both promotional and practical purposes, showcasing the technology’s capability to capture diverse environments while generating public interest. These high-profile spaces will likely become accessible to Quest users as the sharing features roll out.
Meta’s approach with Hyperscape Capture democratizes 3D environment creation, which previously required specialized equipment and expertise. This accessibility could accelerate user-generated content within virtual reality platforms, potentially creating new social experiences and applications.
The phased rollout strategy allows Meta to gather feedback and refine the technology before broader release. Early access participants will help identify optimization opportunities and use cases that could shape future development.