PR Newswire

2026-01-24 05:00

Inside the Computing Power Behind Spatial Filmmaking: Hugh Hou Goes Hands-On at GIGABYTE Suite During CES 2026

LOS ANGELES, Jan. 24, 2026 /PRNewswire/ -- At CES 2026, VR filmmaker and educator Hugh Hou led a live spatial computing demonstration inside the GIGABYTE suite, showing how immersive video is created in real production environments, not in theory or controlled lab conditions.

The session gave attendees a close look at a complete spatial filmmaking pipeline, from capture through post-production and final playback. Instead of relying on pre-rendered content, the workflow was executed live on the show floor, reflecting the same processes used in commercial XR projects and placing clear demands on system stability, performance consistency, and thermal reliability. The experience culminated with attendees viewing a two-minute spatial film trailer across Meta Quest, Apple Vision Pro, and the newly launched Galaxy XR headsets, alongside a 3D tablet display offering an additional 180-degree viewing option.

Where AI Fits Into Real Creative Workflows

AI was presented not as a feature highlight, but as a practical tool embedded into everyday editing tasks. During the demo, AI-assisted enhancement, tracking, and preview processes helped speed up iteration without interrupting creative flow.

Footage captured on cinema-grade immersive cameras moved through industry-standard software including Adobe Premiere Pro and DaVinci Resolve. AI-based upscaling, noise reduction, and detail refinement were applied to meet the visual requirements of immersive VR, where any artifact or softness becomes immediately noticeable across a 360-degree viewing environment.

Why Platform Design Matters for Spatial Computing

Supporting the entire workflow was a custom-built GIGABYTE AI PC designed specifically for sustained spatial video workloads. The system combined an AMD Ryzen™ 7 9800X3D processor with a Radeon™ AI PRO R9700 AI TOP GPU, providing the memory bandwidth and continuous AI performance required for real-time 8K spatial video playback and rendering. Equally critical, the X870E AORUS MASTER X3D ICE motherboard delivered stable power and signal integrity, allowing the workflow to run predictably throughout the live demonstration.

By enabling a demanding spatial filmmaking workflow to operate live and repeatedly at CES, GIGABYTE demonstrated how platform-level system design turns complex immersive production into something creators can rely on, not just experiment with.

Source: GIGABYTE
