Technology: AI meets AV
Features | 12/03/2018

Artificial intelligence is generating lots of buzz in other verticals. Tim Kridel explores what AV can learn from them and how vendors such as Avaya and Harman are applying AI.

A decade ago, Steve Jobs introduced the iPhone by explaining why it didn’t include a stylus. "We’re going to use the best pointing device in the world: our fingers," he said. "We’re born with 10 of them."

We’re also born with a voice, which is rapidly emerging as another user interface (UI), including for pro AV systems. That’s largely because of advances in artificial intelligence (AI), which keeps getting better at understanding people. That capability often is referred to as "natural-language understanding" or "natural-language processing."

In April, Harman partnered with IBM to develop what it calls "voice-enabled cognitive rooms" for verticals such as healthcare and hospitality. The solutions begin shipping this year and include Harman soundbars embedded with IBM Watson’s AI technology, which lets people simply talk to the equipment to get information or to have it carry out a task. For example, instead of using a hotel room’s thermostat, guests could say, "Turn up the heat" or "Turn on the air and set it to 20."

"It’s all about making it simple and easy for guests," says David McKinney, vice president of Harman’s Hospitality Customer Solutions unit.

What’s the business case?

For hotels and other businesses, a big part of AI-powered AV’s appeal is that it helps them save money. For example, convention centres, libraries and other large venues use digital signage to help people find their way around on their own, so they don’t need as many staff, or any, to help with wayfinding.

AI has the potential to extend that efficiency to many other areas. Suppose a hotel room has a smart speaker such as an Amazon Echo or Google Home, and it’s connected to multiple departments, including maintenance and housekeeping. Now when guests say, "I need more towels," or "There’s no hot water," the system can automatically alert staff, with no need for additional front-desk staff to field and relay those calls.

That scenario also is an example of one way AV firms can add value: by identifying tasks that can be automated. For instance, an AV consultant could analyse the front desk’s inbound calls to determine the 25 most common guest inquiries and then develop an AI solution capable of fielding and routing them without staff involvement. If that analysis also shows how many personnel hours would be saved, it helps justify the project’s budget.

That type of analysis also highlights how AI ties in with another buzzword: big data. Suppose the AI is embedded in wayfinding digital signage, and people keep asking about the same half-dozen places. That could point to a need to update the signage content to anticipate those questions so people no longer feel a need to ask them.

"There’s a lot of data that comes out of how these systems are used, [such as] what sorts of commands are coming in," McKinney says.

Another business driver involves brand reputation. For example, the AI system could be programmed to recognise words that indicate a person’s emotion. If it’s negative, the system could alert a staff member to resolve the problem.
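To make the guest-request and brand-reputation scenarios concrete, here is a minimal sketch of the kind of routing logic involved. It is not Harman’s or IBM Watson’s implementation; the department names, keyword lists and escalation rule are assumptions for illustration, and a production system would rely on natural-language understanding rather than simple keyword matching.

```python
# Hypothetical sketch: route a transcribed guest request to a hotel department
# and flag negative wording for human follow-up. Departments, keywords and the
# escalation rule are invented for illustration, not any vendor's actual API.

# Keyword-to-department routing rules (assumed).
INTENT_RULES = {
    "housekeeping": ["towels", "sheets", "cleaning", "pillow"],
    "maintenance": ["hot water", "air conditioning", "heating", "leak"],
    "front_desk": ["late checkout", "bill", "invoice", "wake-up call"],
}

# Words that suggest the guest is unhappy and a person should step in.
NEGATIVE_WORDS = ["angry", "unacceptable", "terrible", "refund", "complaint"]


def route_request(utterance: str, room: str) -> dict:
    """Map a guest request to a department and flag negative tone."""
    text = utterance.lower()

    department = "front_desk"  # default destination if nothing matches
    for dept, keywords in INTENT_RULES.items():
        if any(keyword in text for keyword in keywords):
            department = dept
            break

    needs_human = any(word in text for word in NEGATIVE_WORDS)
    return {"room": room, "department": department, "escalate": needs_human}


if __name__ == "__main__":
    print(route_request("I need more towels", room="214"))
    # -> {'room': '214', 'department': 'housekeeping', 'escalate': False}
    print(route_request("There's no hot water and this is unacceptable", room="301"))
    # -> {'room': '301', 'department': 'maintenance', 'escalate': True}
```

The same request log also feeds the big-data angle McKinney mentions: counting which intents come up most often shows where content, signage or staffing should change.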
At your service

In enterprise applications, AI also could help integrators, vendors and their clients provide better user experiences while lowering support costs.

For example, in October, Avaya announced the A.I.Connect initiative to develop AI solutions for applications such as contact centres and unified communications. An enterprise AV/IT help desk is a contact centre, so it’s worth looking at how AI use cases from other contact centre applications could be adapted.

One A.I.Connect partner is Nuance, which specialises in natural-language understanding for applications such as interactive voice response (IVR) phone systems and virtual assistants. Nuance has discussed how an AI platform could ingest all of a product’s manuals, FAQs and other collateral and use them to power a virtual assistant for that product. In the AV world, one example is an enterprise help desk where the AI-powered virtual assistant fields questions about how to connect a laptop to a projector. Another possibility is an AV integrator or vendor that uses a virtual assistant to support its products.

In consumer-facing contact centres, virtual assistants often can handle up to 80% of inquiries. So if AV vendors, integrators or their customers can achieve comparable automation, it would free up AV/IT staff.

"The variety of use cases for applying AI to improving customer experiences is somewhat staggering," says Eric Rossman, Avaya vice president, alliances and partnerships. "Companies that are focused on implementing digital channels are looking heavily towards expert systems-based chatbots and virtual assistants, which rely upon semantic analysis, natural language speech recognition and rule-based pattern matching capabilities."

Of course, virtual assistants won’t be able to handle every inquiry, especially technically complex ones. In those cases, AI still could play a role by taking over some of the work that help desk staff typically do during a call. One possibility is listening on the call for certain keywords, such as product names.

"Being able to proactively place guidance and related resources in the hands of the agent without them having to manually search knowledge bases and other internal sources for those materials only makes the customer interactions go smoother and flow more naturally," Rossman says. "This ‘agent augmentation’ capability can easily leverage AI-enabled applications that data mine the wealth of knowledge bases, help desk tickets, even internal video training and recorded webinars that a company may have, learning to identify common themes and recurring answers that can form ready-made results for both automated and human-assisted interactions."

AV also could adapt voice biometrics, which some contact centres and virtual assistants use to authenticate users so they don’t have to remember a PIN or password. One possibility is a conference room where the AI identifies each presenter by voice and automatically downloads their content from the cloud to the projector or display. That would alleviate the common frustration of trying to figure out an unfamiliar AV system.
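As a rough illustration of the help-desk assistant described above, the sketch below retrieves the knowledge-base snippet that best matches a user’s question. Plain TF-IDF similarity stands in for the natural-language platforms Nuance and Avaya offer; the document snippets and function names are invented for illustration.

```python
# Hypothetical sketch: answer AV help-desk questions by retrieving the closest
# matching entry from ingested manuals/FAQs. The snippets below are invented.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Snippets that might come from manuals, FAQs or help-desk knowledge bases.
DOCS = [
    "To connect a laptop to the projector, use the HDMI cable at the lectern "
    "and select the matching input on the touchpanel.",
    "If the videoconferencing camera shows no image, check that the USB cable "
    "is seated and restart the codec.",
    "Wireless presentation: join the room Wi-Fi and enter the screen code "
    "shown on the display.",
]

vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(DOCS)


def answer(question: str) -> str:
    """Return the knowledge-base snippet most similar to the question."""
    query_vector = vectorizer.transform([question])
    scores = cosine_similarity(query_vector, doc_vectors)[0]
    best = scores.argmax()
    return DOCS[best]


if __name__ == "__main__":
    print(answer("How do I get my laptop onto the projector?"))
```

In practice a confidence threshold would hand low-scoring questions to a human agent, along with the best-matching material, which is the "agent augmentation" hybrid Rossman describes.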
New skills required?

Some of these scenarios might seem a bit outside of pro AV’s traditional wheelhouse. But so are energy efficiency, digital signage content creation and the Internet of Things, which are just three examples of areas that some AV firms have expanded into. Selling and supporting new technologies usually means AV professionals need to add new skills, such as iOS expertise when the iPad emerged as a touchpanel alternative and content source. How many new skills depends on how deep they want to get into a new market and what their vendors offer.

AI is no exception. For instance, integrators selling Harman-IBM Watson solutions wouldn’t have to hire, say, speech scientists to design and support voice-powered systems. Instead, they can focus on installing mics and loudspeakers.

"We’ve built software applications to enable people to install and make it easy to set up a mass deployment," McKinney says. "[For] integrators doing those sort of control systems already, a lot of their skill sets can be ported into that."

AI also could give integrators and end users new ways to maximise the effectiveness and ROI of traditional AV systems. In retail, for example, AI could analyse camera feeds to determine how certain demographics react to certain content on digital signage.

"We are seeing a growing number of retailers either adding or looking to add audience measurement technologies to serve two functions," says Jason Cremins, founder and CEO of Signagelive, which is working with AdMobilize on AI analytics. "The first is to collect viewer data that can be analysed against the proof of play (media logs) and proof of display (device status data) that we collect and report within our platform. Adding proof of view completes the dataset, allowing them to [apply] POS sales data and other internal and external metrics (e.g., weather) to provide a deep insight into the impact of their digital signage network and content strategy.

"The second use case is using the data gathered to dynamically shape and schedule the media playing on the digital signage displays. In this scenario, the scheduled content is adjusted at the point of playback to optimise the content shown based on the insights gathered."

For retailers and other businesses that use digital signage, one longstanding challenge is quantifying the reach and effectiveness of both the displays’ locations and the content on them. AI enables them to get deeper, actionable insights that wouldn’t be practical or possible if humans did that analysis. Quicker analysis also means businesses can react faster.

"One thing we tell all of our partners is to initially correlate the data to the brief or RFP that will drive the investment in digital signage in the first place," says Mike Neel, AdMobilize global head of marketing/sales. "We often find that the data that can be provided greatly improves the KPIs associated with the investment in digital signage.

"Compelling content is by and large dictated by the old adage ‘right place, right time.’ Real-time data helps facilitate that."
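As a rough sketch of the proof-of-play and proof-of-view correlation Cremins describes, the example below joins playback logs from a signage CMS with audience counts from camera analytics and ranks content by average viewers per play. The column names, timestamps and figures are invented; real Signagelive or AdMobilize data would have its own schema.

```python
# Hypothetical sketch: correlate proof of play (what played, where, when) with
# proof of view (how many people were watching) to rank signage content.
# All field names and values are assumptions for illustration.

import pandas as pd

# Proof of play: which content played on which screen, per hour (assumed schema).
plays = pd.DataFrame({
    "screen_id": ["lobby-1", "lobby-1", "cafe-2", "cafe-2"],
    "hour": ["2018-03-12 09:00", "2018-03-12 10:00", "2018-03-12 09:00", "2018-03-12 10:00"],
    "content": ["spring_promo", "loyalty_app", "spring_promo", "menu_board"],
})

# Proof of view: audience counts per screen and hour from camera analytics.
views = pd.DataFrame({
    "screen_id": ["lobby-1", "lobby-1", "cafe-2", "cafe-2"],
    "hour": ["2018-03-12 09:00", "2018-03-12 10:00", "2018-03-12 09:00", "2018-03-12 10:00"],
    "viewers": [42, 17, 30, 55],
})

# Join the two datasets on screen and hour, then rank content by average audience.
merged = plays.merge(views, on=["screen_id", "hour"])
report = (merged.groupby("content")["viewers"]
                .mean()
                .sort_values(ascending=False))

print(report)
```

The same join is the starting point for the second use case Cremins mentions: feeding those rankings back into the scheduler so the playlist favours the content, and the screen locations, that are actually being watched.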