PR Newswire

2026-01-24 05:00

Inside the Computing Power Behind Spatial Filmmaking: Hugh Hou Goes Hands-On at GIGABYTE Suite During CES 2026

LOS ANGELES, Jan. 24, 2026 /PRNewswire/ -- At CES 2026, VR filmmaker and educator Hugh Hou led a live spatial computing demonstration inside the GIGABYTE suite, showing how immersive video is created in real production environments, not in theory or controlled lab conditions.


The session gave attendees a close look at a complete spatial filmmaking pipeline, from capture through post-production and final playback. Instead of relying on pre-rendered content, the workflow was executed live on the show floor, reflecting the same processes used in commercial XR projects and placing clear demands on system stability, performance consistency, and thermal reliability. The experience culminated with attendees viewing a two-minute spatial film trailer across Meta Quest, Apple Vision Pro, and the newly launched Galaxy XR headsets, alongside a 3D tablet display offering an additional 180-degree viewing option.

Where AI Fits Into Real Creative Workflows

AI was presented not as a feature highlight, but as a practical tool embedded into everyday editing tasks. During the demo, AI-assisted enhancement, tracking, and preview processes helped speed up iteration without interrupting creative flow.

Footage captured on cinema-grade immersive cameras moved through industry-standard software including Adobe Premiere Pro and DaVinci Resolve. AI-based upscaling, noise reduction, and detail refinement were applied to meet the visual requirements of immersive VR, where any artifact or softness becomes immediately noticeable across a 360-degree viewing environment.

Why Platform Design Matters for Spatial Computing

Supporting the entire workflow was a custom-built GIGABYTE AI PC designed specifically for sustained spatial video workloads. The system combined an AMD Ryzen™ 7 9800X3D processor with a Radeon™ AI PRO R9700 AI TOP GPU, providing the memory bandwidth and continuous AI performance required for real-time 8K spatial video playback and rendering. Equally critical, the X870E AORUS MASTER X3D ICE motherboard delivered stable power and signal integrity, allowing the workflow to run predictably throughout the live demonstration.

By enabling a demanding spatial filmmaking workflow to operate live and repeatedly at CES, GIGABYTE demonstrated how platform-level system design turns complex immersive production into something creators can rely on, not just experiment with.

Source: GIGABYTE
