The metaverse and its Extended Reality (XR) component need not only large amounts of data but also very compact head-mounted displays (HMDs) to deliver on their promise of deeply immersive experiences. To keep the HMD as minimal as possible, the data cannot all be generated at the edge and will need to travel over a variety of networks (DSL, cable, satellite, 5G, Wi-Fi) before it reaches the end user. In this short presentation we will show how the effort to improve the end-user experience with multi-focal-plane displays led the research team at Adeia to propose a new method for encoding content at the source, minimizing the footprint of the HMD for the comfort of the user.