Augmented Reality is going to change the way humans interact. Various companies have started building the foundational infrastructure and tools for the AR ecosystem and for AR experiences on mobile devices. But to deliver similar, or even more computationally intensive, immersive AR experiences on thin clients like AR glasses, one needs to take a step back, understand those devices' strict power and thermal limits, and design an architecture that offloads compute to more powerful servers in a privacy- and context-aware, latency-sensitive, and scalable fashion.

Shipping camera frames to a server for computation raises many challenges, from operating real-time transport at scale to leveraging GPUs at scale for ML and render operations, in both calling scenarios (like Augmented Calling) and non-calling ones. Camera frames (RGB and possibly depth) are the prime driving force behind Augmented Reality, and processing this video data at scale is a necessity for scaling AR experiences in the future. This talk focuses on some of the work Meta has done in this domain and on how the industry as a whole needs to come together to solve these challenges in order to build the future of high-fidelity, low-latency, immersive AR experiences.
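To make the offload pattern concrete, here is a minimal, illustrative sketch of a latency-aware client loop that encodes camera frames and ships them to a remote compute server. Everything in it is an assumption for illustration: the server address, the raw TCP + JPEG transport, and the latency budget are all hypothetical stand-ins. A production pipeline of the kind the talk describes would instead use a real-time transport (e.g. WebRTC/RTP), hardware video encoding, and congestion control.

```python
# Hypothetical sketch: offload camera frames to a remote render/ML
# server and track round-trip latency. Not Meta's implementation;
# server address, framing, and budget are illustrative assumptions.
import socket
import struct
import time

import cv2          # pip install opencv-python
import numpy as np

SERVER_ADDR = ("offload.example.com", 9000)   # hypothetical server
LATENCY_BUDGET_MS = 50                        # assumed end-to-end budget


def recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the socket."""
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("server closed connection")
        data += chunk
    return data


def encode_frame(frame: np.ndarray) -> bytes:
    """JPEG-encode a frame; a stand-in for a hardware video encoder."""
    ok, buf = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 80])
    if not ok:
        raise RuntimeError("frame encoding failed")
    return buf.tobytes()


def offload_loop() -> None:
    sock = socket.create_connection(SERVER_ADDR)
    # Synthetic 720p RGB frames so the sketch runs without camera hardware.
    frame = np.zeros((720, 1280, 3), dtype=np.uint8)
    for _ in range(100):
        payload = encode_frame(frame)
        sent_at = time.monotonic()
        # Length-prefixed framing: 4-byte big-endian size, then payload.
        sock.sendall(struct.pack("!I", len(payload)) + payload)
        # Read the (assumed) length-prefixed processed result back.
        result_len = struct.unpack("!I", recv_exact(sock, 4))[0]
        _result = recv_exact(sock, result_len)
        rtt_ms = (time.monotonic() - sent_at) * 1000
        if rtt_ms > LATENCY_BUDGET_MS:
            # Over budget: a real client might lower resolution or frame
            # rate, or fall back to on-device processing.
            print(f"frame over budget: {rtt_ms:.1f} ms")


if __name__ == "__main__":
    offload_loop()
```

The round-trip measurement is the key design point: because the frame carries its capture time, the client can detect when the server path exceeds the latency budget and adapt, which is what makes offload viable for latency-sensitive AR rather than best-effort video upload.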