The Coming Automation of Call Centers?

https://crooked.com/podcast/andrew-yang-on-the-universal-basic-income-and-why-he-hates-the-penny/

I was listening to an interview with businessman Andrew Yang, one of the many Democratic presidential candidates in the US, during which he listed the jobs he felt were most vulnerable to automation in the near future (it was part of the case he was making for a universal basic income).

One of those occupations will no doubt be very familiar to you:

Google just recently demo-ed AI that can do the work of an average call center worker and there are two-and-a-half million call center workers in the United States. They make 14 bucks an hour, average education — high school. So, when you start digging in, you realize that we’re going to automate away the most common jobs in our society.

In fact, the man who has been greeted as a local savior for re-opening the Sydney Call Centre, Anthony Marlowe himself, expressed his enthusiasm for automation to the Corridor Business Journal (CBJ) well before he ever turned up here:

The benefits of advancing technology have been a game changer in the industry, Mr. Marlowe said, and customers who once might have wanted 100 call center operators to interact with their customers may now want half that many operators, [emphasis mine] and more digital services.

“We skipped the offshore call center trend,” he said. “We’re not skipping the trend in digital transcending automation and modernization. Our software offering puts us at an advantage of that curve.”


Eerily human

News that ACOA is providing Marlowe a $500,000 loan to help him expand the Sydney call center, and that he plans to use that loan for “asset/equipment acquisition” and “technology” (among other things), immediately got me wondering whether that equipment and technology might be of the automating variety.

Google Duplex presentation.

Because Google does seem to be coming for the call center industry.

The tech giant unveiled Duplex, its automated voice assistant, during a developer conference in May 2018. As Sarah Kuranda wrote in The Information:

Google CEO Sundar Pichai demonstrated how a cutting-edge, computer-generated voice assistant called Duplex could call up a restaurant or hair salon and make an appointment without the person on the other end ever realizing they were talking with a robot. The technology was as controversial as it was impressive, drawing sharp criticism from people concerned about its ethical implications.

What Mr. Pichai didn’t mention is that the technology could be more than just a nifty trick to help users save a bit of time on reservations. Some big companies are in the very early stages of testing Google’s technology for use in other applications, such as call centers, where it might be able to replace some of the work currently done by humans, according to a person familiar with the plans.

Kuranda reported that Google was further along in “natural language processing” than its competitors — like Amazon, which plans to sell the technology behind its voice assistant, Alexa. The Google assistant can understand the same question asked in different ways and can respond to follow-up questions, which could give the company an edge in the cloud-based customer call center market, expected to be worth $20.9 billion by 2022 (up from about $6.8 billion in 2017).

Google says the technology is geared toward “handling repetitive tasks,” wrote Kuranda:

That would make it a natural fit for things like call centers, where humans spend a big chunk of their time answering the same questions over and over again, says Rand Hindi, CEO at Snips, a company that also develops voice assistants. “This would be a very, very natural extension [for voice technology],” he said.

But those ethical implications won’t be easily dismissed. Google, as noted above, was criticized after that initial demo because the people interacting with the voice assistant didn’t know they were talking to a computer — or that they were being recorded. (Kuranda says Google has programmed “ums” and “ahhhs” into the voice assistant’s responses to make it seem more human). Google has since changed things up — during its most recent demo, the voice assistant declared itself to be a computer and told callers they were being recorded.

The various reviews I’ve read say the technology remains very dependent on human beings — the New York Times ran a test this May, having Duplex make restaurant reservations for it, and one successful call — featuring an “eerily human” Irish voice — turned out to have been made by a call center worker, not the automated assistant. Google told the paper that about 25% of calls placed through Duplex started with a human and that about 15% of those that began with the automated system had a human “intervene at some point.” Taken together, that means a human was involved in roughly a third of all Duplex calls (25% + 75% × 15% ≈ 36%).


Helping Lily

At this point, the smart money seems to be on artificial intelligence (AI) assisting call center workers rather than replacing them all. But Marlowe himself has suggested the technology means a center now employing 100 people could get by with 50.

I’ve already encountered this phenomenon, albeit with an obviously automated assistant: you call a helpline, the software takes you as far as it can, and if you ask it something it can’t answer, it connects you to a human (if it doesn’t just abruptly end the call).
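
Purely as illustration, here is roughly what that bot-first escalation logic looks like. This is a minimal sketch; the canned topics and the transfer_to_human handoff are hypothetical stand-ins, not any vendor’s API.

```python
# A toy bot-first helpline: answer what we can, hand off what we can't.
# Topics and the handoff function are hypothetical stand-ins.

def transfer_to_human(question: str) -> str:
    """The polite fallback (as opposed to abruptly ending the call)."""
    return f"Transferring you to an agent about: {question}"

CANNED_ANSWERS = {
    "hours": "We're open 9 to 5, Monday through Friday.",
    "balance": "Your balance is available once you verify your identity.",
}

def handle_query(question: str) -> str:
    for topic, answer in CANNED_ANSWERS.items():
        if topic in question.lower():
            return answer
    # The software has gone as far as it can; connect a human.
    return transfer_to_human(question)

print(handle_query("What are your hours?"))
print(handle_query("Why was I double-billed last month?"))
```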

Or you’ll be on hold, waiting to speak to a human, and you’ll be encouraged (repeatedly) to go online and solve your own problem, possibly with the assistance of a chatbot. (This is kind of the telephone equivalent of facing such long lines for a human cashier at Walmart that you cave and go to one of the much more numerous self-checkouts.)

How Vonage Business sees the relationship between AI and call center agents.

According to Vonage Business, AI’s role will be to assist call center agents quietly in the background, such that callers don’t even know there’s AI involved. Vonage gives an example of how this will work, telling the story of Lily, who gets into an accident on a cold night and calls roadside assistance. Like most things involving AI, it’s kind of creepy:

Even before the call is answered, the contact center’s AI judges it to be urgent. In a few milliseconds, it made that determination by acting on the context of the call:

  • The caller-ID was associated with her account.
  • She has been a customer for 10 years but hasn’t called the rescue line once.
  • Other customers with a similar profile to Lily call only when they really need help.
  • The weather in Lily’s home city is cold enough that she could risk hypothermia if she is stuck without heat for too long.
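
For the curious, here is a minimal sketch of what that kind of context-based triage could look like in code. Every field name, threshold, and weight below is my own invention; a production system would presumably use a trained model rather than hand-written rules.

```python
from dataclasses import dataclass

@dataclass
class CallContext:
    """Context assembled before the call is answered (hypothetical fields)."""
    caller_known: bool                # caller-ID matched to an account
    years_as_customer: int
    prior_rescue_calls: int
    similar_profile_call_rate: float  # how often similar customers call without real need
    temperature_c: float              # weather in the caller's location

def urgency_score(ctx: CallContext) -> float:
    """Crude rule-based score in [0, 1]; weights are invented for illustration."""
    score = 0.0
    if ctx.caller_known:
        score += 0.2
    # A long-time customer who has never used the rescue line
    # is unlikely to be calling casually.
    if ctx.years_as_customer >= 5 and ctx.prior_rescue_calls == 0:
        score += 0.3
    # Customers with similar profiles call only when they really need help.
    if ctx.similar_profile_call_rate < 0.1:
        score += 0.2
    # Cold enough to risk hypothermia if stuck without heat for too long.
    if ctx.temperature_c <= 0:
        score += 0.3
    return min(score, 1.0)

# Lily's call, as Vonage describes it:
lily = CallContext(True, 10, 0, 0.05, -8.0)
print(urgency_score(lily))  # 1.0 -- routed as urgent
```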

The call center agent (who doesn’t get a name in this story) does the talking while the virtual assistant listens in and provides help unprompted — it hears Lily say she needs a tow truck and the agent’s screen suddenly shows a map of the area with the nearest tow trucks marked. It hears Lily say she knows which street she’s on but is uncertain precisely where, and:

…displays a message on the agent’s screen:

Send customer photo of local landmark?

Without explicitly responding to the software’s question, the agent tells Lily that she is sending her a text message with a photo of a local landmark and instructs Lily to let her know if she sees it. With that prompt, the AI sends Lily the photo and waits to hear what she tells the agent. She sees the building; the AI updates her location accordingly.

The agent says that she’ll send a tow truck. The virtual assistant hears this and sends details of the job directly to the tow truck driver, then displays an estimated wait time on the agent’s screen.
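
That listen-and-suggest loop is easier to picture in code. Below is a toy version in which keyword matching stands in for real speech recognition and intent detection, and the agent’s “screen” is just a list of strings; none of the names come from Vonage.

```python
# A toy agent-assist loop: the software watches the (transcribed) call and
# pushes suggestions or takes side actions while the human does the talking.

def detect_intent(utterance: str) -> str:
    text = utterance.lower()
    if "tow truck" in text:
        return "needs_tow"
    if "not sure where" in text:
        return "location_unclear"
    if "i see the building" in text:
        return "landmark_confirmed"
    return "unknown"

def assist(utterance: str, agent_screen: list[str]) -> None:
    intent = detect_intent(utterance)
    if intent == "needs_tow":
        agent_screen.append("MAP: nearest tow trucks plotted")
    elif intent == "location_unclear":
        agent_screen.append("PROMPT: Send customer photo of local landmark?")
    elif intent == "landmark_confirmed":
        agent_screen.append("LOCATION UPDATED: job sent to driver, ETA shown")

screen: list[str] = []
for line in [
    "I think I need a tow truck.",
    "I know the street, but I'm not sure where exactly.",
    "Yes, I see the building!",
]:
    assist(line, screen)
print("\n".join(screen))
```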

There are other examples — Lily wants to change her payment method — that are handled entirely by the software, and it’s clear where this is all heading. The AI is busy listening to the agent’s calls, teaching itself to replace her:

Already software is available that reviews chat transcripts and call recordings to analyze sentiment and pinpoint the moment that, for example, a customer lost their temper. By analyzing thousands of calls and transcripts, a machine learning tool could learn what vocabulary and vocal qualities show that someone is becoming dissatisfied and also what types of response disarm the caller.
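
A crude sketch of that transcript-mining idea, with a hand-written word list standing in for the trained sentiment models real products would use:

```python
# Toy sentiment scan over a call transcript: flag the first customer turn
# whose score drops below zero, i.e. the moment tempers flared.
NEGATIVE = {"ridiculous", "unacceptable", "angry", "cancel", "worst", "useless"}
POSITIVE = {"thanks", "great", "perfect", "appreciate", "helpful"}

def turn_score(text: str) -> int:
    """Positive minus negative word hits; stands in for a real sentiment model."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

transcript = [
    ("agent", "How can I help you today?"),
    ("customer", "My bill doubled for no reason."),
    ("customer", "This is ridiculous, I've been on hold an hour. Unacceptable."),
    ("agent", "I'm sorry, let me fix that right away."),
    ("customer", "Okay, thanks, I appreciate it."),
]

for i, (speaker, text) in enumerate(transcript):
    if speaker == "customer" and turn_score(text) < 0:
        print(f"Temper lost at turn {i}: {text!r}")
        break
```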

And while humans will always (I suspect) be better than software at detecting anger in other humans, they cost $12 an hour. They get sick. They get stressed. They ask for raises. They have lives.

My guess is as soon as AI reaches the point where it can sound sufficiently sympathetic to Lily, as she stands in the cold by her battered vehicle, the agent will be gone. Vonage agrees:

So, perhaps in 20 years, we’ll have natural, flowing voice conversations with AI agents in contact centers.

And while I keep reading that companies are very serious about customer service and know how easily they can lose your business, I’ve also encountered some seriously horrendous customer service from companies (like my cell phone company) that take a lot of my money but seem more concerned with keeping costs down than with keeping me happy. So I’m not convinced AI will have to be all that good at mimicking humans to take over completely — look at what’s happening at Walmart, where self-checkouts aren’t all that good and yet are clearly taking over.

But I’m also wondering how far AI will penetrate — could software ever, for instance, take over regional economic development? Could an automated assistant evaluate funding applications? Would it approve a $500,000 loan to a call center operator in 2019 in the name of job creation? Or would it analyze reams of applicable data in seconds and come back with:

“Computer says no?”