
July 31, 2023

The Role of Vision-based Interfaces in Redefining Digital Entertainment User Experiences

Vision-based interface (VBI) technologies are poised to become one of the next major breakthroughs in elevating the consumer experience across a wide range of connected devices.

According to a recent Parks Associates white paper, VBI technologies provide hands-free control in place of – or as a supplement to – other forms of interaction and control input, such as tactile (touch) or vocal (voice) input. VBI relies primarily on computer vision.

VBI is an exciting area of innovation that could revolutionize how people think about human-machine interfaces to accomplish many of today’s tasks, reducing friction to enhance a wide range of experiences. Because of this, many sectors are exploring the implications of integrating VBI into their market offerings.

To learn more about VBI, we sat down with Serhad Doken, Chief Technology Officer at Adeia. Here is what he had to say:

Q: Tell us about vision-based interface technologies and how they redefine the digital entertainment user experience.

Serhad Doken: Vision-based interface is emerging as one of the next important developments for the digital entertainment sector. It will likely be the next-generation user interface, enabling device manufacturers and network service providers – including pay-TV operators and OTT (over-the-top)/streaming providers – to optimize operations and reduce friction.

VBI will offer an alternative to historically available user interfaces, including keyboards, mice, touchscreens and, more recently, voice interfaces. Vision-based interfaces will not necessarily replace these existing technologies across all use cases. They will, however, augment and enhance existing interfaces and provide exciting new opportunities for consumers in a variety of entertainment applications.

Q: What is the state of VBI technology maturity?

Doken: VBI technology itself has reached a fairly high level of maturity. It is already widely used to biometrically verify identity on personal devices, in smart-home access systems and even at major transportation hubs, such as airport entry points. Having said that, we expect to see VBI introduced into much broader areas of the market over the rest of the decade.

To get an idea of the potential, look no further than security surveillance cameras. Today, VBI adds a layer to surveillance equipment by running computer vision algorithms on security images or footage, and then making automated decisions and actions based on the output of those algorithms.

Additional actions often include text message alerts and even, in some instances, emergency calls to authorities. VBI is also being used with increasing frequency by organizations to streamline workflows, enable warehouse automation and optimize manufacturing processes. Visual sensors, for instance, are now commonly used for quality control, presence sensing, positioning and orientation, and other applications that enhance safety and performance.
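
To make the decision layer Doken describes a bit more concrete, the sketch below scores each camera frame with a computer vision model and escalates when a detection crosses a confidence threshold. It is a minimal illustration only: the detect_objects model wrapper and send_text_alert hook are hypothetical placeholders, not references to any specific product or API.

```python
from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class Detection:
    label: str         # e.g. "person", "vehicle"
    confidence: float  # model score in [0, 1]


def detect_objects(frame) -> List[Detection]:
    """Hypothetical wrapper around an object-detection model.

    In a real deployment this would run on the camera or a nearby
    edge device; here it is stubbed out for illustration.
    """
    return []


def send_text_alert(message: str) -> None:
    """Hypothetical alerting hook (SMS gateway, push notification, etc.)."""
    print(f"ALERT: {message}")


def monitor(frames: Iterable, alert_threshold: float = 0.8) -> None:
    """Run detection on each frame and escalate high-confidence events."""
    for i, frame in enumerate(frames):
        for det in detect_objects(frame):
            if det.label == "person" and det.confidence >= alert_threshold:
                send_text_alert(
                    f"Person detected in frame {i} "
                    f"(confidence {det.confidence:.2f})"
                )
```

The interesting design choice is the threshold: it is what turns a raw computer vision output into the "automated decisions and actions" Doken refers to.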

The next major step for VBI will occur in the consumer world, where it will introduce new ways to manage and navigate the next generation of communication devices, entertainment services and even gaming experiences.

Q: How do you see emerging VBI technologies changing the behavior of consumers across their digital engagement with the world?

Doken: VBI could drastically change the behavior of consumers as they interact with devices to access a wide range of experiences. Today, mouse devices and finger pads on touchscreens are used to hover over and click icons or links. And for TVs, most people still use remote controls to access programs and navigate through their content options.

In the near future, users will be able to accomplish the same result with just their eyes. This will be possible through eye tracking, where a user can select a menu option, drag an icon or even double click with the blink of an eye.
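
As a rough picture of how an eye-tracking interface might turn raw gaze data into the familiar UI actions Doken mentions, the sketch below maps a stream of hypothetical fixation and blink events to select and double-click commands. The event types, field names and timing thresholds are illustrative assumptions, not a description of any particular eye-tracking SDK.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, Optional, Tuple


@dataclass
class GazeEvent:
    kind: str          # "fixation" or "blink" (illustrative event types)
    target: str        # UI element under the gaze point, e.g. "play_button"
    duration_ms: int   # length of the fixation or blink
    timestamp_ms: int  # when the event occurred


def interpret(events: Iterable[GazeEvent],
              dwell_ms: int = 600,
              double_blink_window_ms: int = 400) -> Iterator[str]:
    """Map gaze events to UI commands: dwell to select, two quick blinks to double-click."""
    last_blink: Optional[Tuple[str, int]] = None  # (target, timestamp) of previous blink
    for ev in events:
        if ev.kind == "fixation" and ev.duration_ms >= dwell_ms:
            # Holding the gaze on an element long enough counts as a selection.
            yield f"select {ev.target}"
        elif ev.kind == "blink":
            if (last_blink is not None
                    and last_blink[0] == ev.target
                    and ev.timestamp_ms - last_blink[1] <= double_blink_window_ms):
                # Second blink on the same target within the window: double-click.
                yield f"double_click {ev.target}"
                last_blink = None
            else:
                last_blink = (ev.target, ev.timestamp_ms)
```

Dwell time and blink windows are exactly the kind of tuning parameters that determine whether such an interface feels effortless or fatiguing.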

These technologies have been around for a while – most people may remember the introduction of Google Glass a decade ago. One of the things missing then was a mature surrounding ecosystem to support the experiences. That is why we have not yet seen mass adoption. New technologies, however, are rapidly maturing and are now going through a ‘hype cycle’ before large-scale adoption.

Q: That is interesting. What, specifically, are you seeing across the ecosystem – televisions, mobile devices, tablets – to integrate vision-based interfaces into consumer offerings? Is there an interoperability platform that you see forming around these solutions?

Doken: Yes, I am seeing a lot of important activity take place to support VBI applications. For example, there is now a much higher level of interoperability between and across different manufacturers of intelligent devices, including mobile phones, smart watches and wireless earbuds. All three categories function independently, but as technology and consumer usage evolve, we have seen these offerings increasingly function together and create much better user experiences.

We expect a similar dynamic to play out with VBI. One area we expect to explode is the smart glasses sector. Many people routinely use several high-resolution screens for work, watching entertainment or playing games. Smart glasses can give users access to an effectively unlimited number of virtual monitors, spread across a 360-degree field of view. More importantly, smart glasses are portable, so you can take them, and the desired experiences, anywhere.

Q: What role do you see the cloud playing in enabling the portability of access to content, applications or experiences throughout the consumer’s digital day?

Doken: The cloud – and high-speed networking, which has recently been enhanced with the deployment of 5G technology – will play a critical role in ensuring portability.

Think about a consumer wearing smart glasses to a business convention with the intention of identifying the name and job title of fellow attendees. A positive user experience requires split-second data augmentation to match who people are in the context of where they are.

Major cloud providers are already working on scenarios just like this. And they are making progress. Today, centralized cloud data centers are often hundreds of miles away from end users. This has prompted the creation of mini cloud data centers, or ‘cloudlets’, that can be deployed in edge environments closer to venues of operation.

Theoretically, these cloudlets – working in concert with fast communication, such as 5G wireless or local wireless networks connected to 10G cables – enable the quick transmission of data from the central cloud to a data center and then to a cloudlet near the user. Memory will also be critical in optimizing the services consumers and businesses will demand through the rest of the decade and beyond. That is why Adeia is also making ground-breaking research and development investments in hybrid bonding and node technology to enhance the performance of the chips that will play an important role in vision-based interface capabilities.
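
One way to picture the cloudlet idea is a client that measures round-trip latency to each candidate edge site and routes its vision workload to the fastest one, falling back to the central cloud if nothing nearby meets the latency budget. The sketch below is an assumption-laden illustration using only a TCP connection time as a crude latency probe; the hostnames and budget figure are invented, not real endpoints.

```python
import socket
import time
from typing import Dict, Optional


def measure_rtt_ms(host: str, port: int = 443, timeout: float = 1.0) -> Optional[float]:
    """Rough round-trip estimate: time a TCP connection attempt to the endpoint."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.monotonic() - start) * 1000.0
    except OSError:
        return None  # endpoint unreachable


def pick_endpoint(cloudlets: Dict[str, str],
                  central_cloud: str,
                  latency_budget_ms: float = 50.0) -> str:
    """Choose the lowest-latency cloudlet within budget; otherwise use the central cloud."""
    best_host, best_rtt = None, None
    for name, host in cloudlets.items():
        rtt = measure_rtt_ms(host)
        if rtt is not None and (best_rtt is None or rtt < best_rtt):
            best_host, best_rtt = host, rtt
    if best_rtt is not None and best_rtt <= latency_budget_ms:
        return best_host
    return central_cloud


# Example with hypothetical hostnames:
# endpoint = pick_endpoint(
#     {"venue-cloudlet": "cloudlet.venue.example.net"},
#     central_cloud="vision.cloud.example.net",
# )
```

The point of the sketch is the split-second budget: if no edge site can answer within it, the experience degrades, which is why cloudlet placement and fast last-mile links matter together.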

Hybrid bonding and node technologies will increasingly be present across the entire ecosystem – from endpoint devices and edge-computing cloudlets to the infrastructures that support cloud service providers.

More progress remains to be made. Many stars need to align for these scenarios to become a reality. But a base of technology already exists, and the major players are getting ready to support these types of visual-based interface applications.

Q: Are you seeing coordination around these issues take place?

Doken: Absolutely. We are seeing coordination between key players in the market who are showcasing very interesting prototypes and proofs-of-concept around the world. Those key players include pay-TV operators, OTT/streaming services, SoC (System-on-Chip) manufacturers, consumer device manufacturers and customer premises equipment (CPE) manufacturers.

It is an exhilarating time, but a full system experience has yet to be created. As that work proceeds, it will also be important to maintain a continuum of experience. In other words, as we move toward the future and embrace new technologies, it is imperative to ensure some level of compatibility with existing technologies and services.

Q: What role will Adeia play in helping to bring vision-based interface experiences to market?

Doken: That’s my favorite question so far. At Adeia, we are technologists. We focus on the inflection points of the markets and specialize in solving tough technical problems to enable future products and services.

VBI is one of the next milestone innovations that will usher in an entirely new generation of interfaces, consumer experiences, applications and services. It will create a paradigm shift for the entire ecosystem.

It won’t be easy to realize this vision. There are a lot of challenges that still need to be addressed. But the intersection of what could be and how to get it done is exactly where Adeia thrives.

Serhad Doken

Chief Technology Officer

Serhad Doken is responsible for the technology roadmap, research strategy and advanced R&D projects. Mr. Doken previously was Executive Director of Innovation & Product Realization at Verizon, where he drove new 5G and Mobile Edge Computing powered services for Consumer and Enterprise Businesses. Prior to Verizon, Mr. Doken was VP, Innovation Partners at InterDigital, focused on technology strategy and external R&D projects and partnerships. Prior to InterDigital, Mr. Doken worked on emerging mobile technology incubation at Qualcomm. Prior to this, Mr. Doken held positions at Cisco Systems, Nortel Networks and PSI AG. Mr. Doken is an inventor on 30 issued worldwide patents and over 90 worldwide patent applications. Mr. Doken has a Computer Engineering degree from Bosphorus University and has completed the M&A Executive Education Program at The Wharton School and the New Ventures Executive Education Program at Harvard Business School.