AM: Hello, and thank you for calling the medical clinic. To continue in Spanish, press 1.
CALLER: (sighs) Okay.
AM: To make an appointment press 2. For test results press 3. For endoscopy press 4.
CALLER: Are you serious?
AM: For radiology press 5. For pharmacy press 6…
HOST: For people who need health care, automated phone systems can be more than frustrating.
Wes Hawkins: And you can get disconnected and then you have to dial again and start back where you were or wait on hold.
Caitlin Donovan: Our case managers, on average, take 22 phone calls to resolve a case, and part of the reason for this is that healthcare is so complicated.
Carl Marelli: A lot of the work that was done was to understand why people called into the contact center today. Not only the data science, to understand the words and how we could really understand what the question is, but also where do we look to find the answer?
HOST: This is Speed to Modern Tech, an original podcast from KPMG. I’m Tori Weldon. Each episode, we'll bring you a problem many businesses are facing and the story of how technology was used to tackle it.
Today, making automated phone systems more human -- and how technology is being used to prevent people from getting stuck on hold.
HOST: Wes Hawkins has been dealing with the health care system for as long as he can remember.
Wes Hawkins: The first memory I have of knowing that something was “quote, unquote” wrong with me or different, I was in my dad's truck. He had a purple and black Ford Ranger and we're driving down the main street in town. And he's explaining to me that twice a day, I will have to take pills, as well as about a tablespoon of inhaled nebulizing treatment, which I'll breathe in through a machine of compressed air.
And that is my very first memory knowing that, okay, my life is going to be a little different.
HOST: At age 3, Wes was diagnosed with cystic fibrosis, a genetic disease affecting the digestive system and the lungs.
Wes Hawkins: CF or cystic fibrosis feels like you're breathing through a straw or you take numerous repetitive, deep breaths without exhaling. So you're taking a deep breath, and another deep breath on top of it, and another deep breath on top of it - and trying to talk and communicate and walk. That causes a lot of stress on your body, increases your blood pressure, increases your heart rate, makes it very hard to live a somewhat normal life.
HOST: Yet somehow, Wes did manage to have a normal childhood. Growing up in rural Oklahoma, his parents let him do all the things other kids were doing.
Wes Hawkins: So they never let any part of my childhood be affected just because I have CF. Oh, you can't go play sports because you have CF. Oh, you can't go play in the mud. Um, all those things that a normal kid needs in life, they weren't going to change that just because I had a different diagnosis than my best friend or the neighbor.
CF really didn't start targeting me until I was 15 when I had my first, spontaneous pneumothorax, which is your lung collapsing. And that was my very first hospital stay.
HOST: That first hospital stay lasted two weeks. And over the next few years, Wes needed more medical care.
Yet he managed to follow his childhood path of doing what everyone else was doing. He moved out, went to college, and got a job he loved – managing a large restaurant.
Then in 2019, his health took a serious turn. On Mother's Day weekend he started coughing up blood and was admitted to the hospital. Within a few weeks, he was waiting for a double lung transplant. Even once a donor had been found, the surgery was extremely risky. Wes’s recovery took a long time.
In the end, though, the sacrifice was worth it. Today, Wes feels better than ever. But he still spends a lot of time managing his care -- including hours on the phone with insurers and other health care systems.
Wes Hawkins: When I started managing the harder stuff - the insurance, the co-pays, these systems, a lot of these people don't realize how broken it really is.
I don't know the exact script, but it's like: “please listen as our options have changed” - that's the worst because if I do listen to all the options, what if that's not listed, then I still obviously press one. And sometimes they say press zero to talk to an operator. Sometimes zero doesn't work.
And that is when the whole transfer system gets very stressful and you can get disconnected and then you have to dial again and start back where you were or wait on hold.
HOST: Customer call centers are nothing new -- and are used in many industries to handle basic transactions. But in health care, there is a lot more at stake. The insurance system is complex -- and getting the right information about what services are covered directly impacts people's care. Patients with complex health needs often spend many hours on the phone each week.
Wes admits that for him, long hours on hold have just become part of his everyday life. So he's come up with his own strategy for coping.
Wes Hawkins: I'm on the road a lot. I drive back and forth from Oklahoma City to my hometown, Ponca to Stillwater. So that's when I try to scrape out time to be on the phone, so I can listen to that hold music or whatever I need to do. So I don't have to carry around my phone all day with me and be in a bad mood while I'm trying to watch TV, enjoy my day off, or read.
HOST: Caitlin Donovan believes people like Wes shouldn't have to come up with workarounds to get the health information they need. As a Senior Director with the National Patient Advocate Foundation, she spends her days trying to make navigating health systems simpler for patients across the US.
Caitlin Donovan: We work with people who maybe can't get their insurance to approve a treatment that their doctor prescribed, or they've received a really large bill in the mail and they have cancer. And they don't know who else to turn to. Usually by the time they come to us, they've tried a couple of different avenues of trying to figure it out themselves and they’ve run up against a wall.
HOST: At this stage, case managers from the foundation are usually assigned to help patients. But even with this guidance, it is still a slow process.
Caitlin Donovan: One of the things you have to know about how this works is how time-consuming it is. Our case managers on average take 22 phone calls to resolve a case.
HOST: Caitlin doesn't have to rely on case managers or patients to understand what is wrong with the health care bureaucracy. She has her own story to tell, about trying to correct mistakes in her health bills after the birth of her third child.
Caitlin Donovan: And so in order to get my money back, it took me 25 phone calls outgoing from just me to resolve it. But I couldn't just walk down to my local hospital where I was being billed from to resolve the issue because their billing office had been contracted out to a place in Boston.
I'm in New Jersey. So every single time I called I would have to wait for the menu. I would have to press different menu options. There are usually three different levels to get through. And then I'd have to speak to a different person. There was no direct line option. I would have to keep records every single time with reference numbers, going back now several months, trying to track down this money.
HOST: Caitlin says for her, this battle with bureaucracy was sort of an experiment -- she didn't need answers to survive, and she wasn't suffering from an illness. But most of the patients she works with don't have a choice.
Caitlin Donovan: I would say at this point, almost every single time I talk to a patient it involves some component of being frustrated at the bureaucracy. Inevitably, it comes up with: “I made 20 different phone calls, I was on hold for an hour, they dropped the call and never called me back”. And what's sad is that we've come to expect it and it's become part of the system in a way that's kind of made cynics of all of us.
HOST: In the health care sector, customer contact centers cut both ways. If they work well, they're a simple way to get patients the information they need. But if they don't -- they're a source of frustration.
Carl Marelli: And if you don't fit into that cookie cutter, you have a one-off question, you have to go through this sort of artificial tree. Right? You got to think, okay, why am I calling? I guess it's sort of a benefits question. So I'm gonna hit two for benefits. Okay. Now I'm gonna hit three and it's very kind of mechanical.
HOST: That's Carl Marelli, a Director with KPMG Lighthouse. His group specializes in Artificial Intelligence, data, and analytics. As an engineer, he learned about AI working on the IBM Watson project, and as he explains it:
Carl Marelli: Cognitive technology was sort of in my blood.
HOST: Yet even with this background in artificial intelligence, Carl and his team had their work cut out for them. The contact center system they had to overhaul was dated, and used pretty basic interactive voice response technology. A conversational AI system would be a vast improvement.
Carl Marelli: A lot of the work that was done before we kicked off, it was to understand the primary call drivers, the primary intents of why people called into the contact center today. And there was a whole bunch of them that spanned anywhere from benefits type questions, to questions about a claim, to what we called “find a provider”.
HOST: And it wasn't just patients calling the contact center. Health care staff were also using the phone system. It all added up to tens of thousands of questions the AI would need to answer.
That's where Arthur Franke comes in. He's a Conversational Data Scientist at KPMG. And after reviewing all the different users and the questions they were asking the contact center, it quickly became clear there was a wide divide.
Arthur Franke: As a lay person, I'm going to ask about things in very plain terms. I'm not a trained medical practitioner. I'm going to say: I'm going to see my doctor. Whereas someone who works in an office is going to be using more jargon, diagnostic codes, the proper names for specific procedures, which is both good and bad.
It's good because it's more specific, but it's more complex for the machine to understand because now we have to make the machine understand that very specific medical domain jargon.
HOST: Arthur says they spent a lot of time analyzing the inputs -- the questions people were asking. Knowing what people were looking for was key to figuring out what the answer should be.
But finding the right answer -- the output -- was also pretty complex.
Arthur Franke: There are different types of care that someone can receive. And then the health plan has a way of categorizing those things.
If I have an ultrasound or an x-ray, all of those become medical imaging. But for some different types of ultrasounds or different types of x-rays, it may fall into different, special categories. So we had to understand how do you turn a request for information about a general type of procedure into something that's specific enough to get to the answer in the health systems plan information?
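The roll-up Arthur describes -- a specific procedure mapping to a general benefit category, with special sub-categories carved out -- could be sketched like this. All of the category names here are illustrative, not the health plan's actual taxonomy:

```python
# Hypothetical sketch: roll a specific procedure up to the plan's benefit
# category, checking first whether a special sub-category applies.
from typing import Optional

BENEFIT_CATEGORY = {
    "x-ray": "medical imaging",
    "ultrasound": "medical imaging",
    "mri": "medical imaging",
}

# (procedure, subtype) pairs that fall into special categories.
SPECIAL_CASES = {
    ("ultrasound", "obstetric"): "maternity imaging",
}

def plan_category(procedure: str, subtype: Optional[str] = None) -> str:
    """Return the plan category for a procedure, honoring special cases."""
    if subtype and (procedure, subtype) in SPECIAL_CASES:
        return SPECIAL_CASES[(procedure, subtype)]
    return BENEFIT_CATEGORY.get(procedure, "unknown")

print(plan_category("ultrasound"))               # -> "medical imaging"
print(plan_category("ultrasound", "obstetric"))  # -> "maternity imaging"
```

The real mapping would of course be far larger and driven by the plan's own data, but the two-level lookup captures the shape of the problem.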
HOST: Arthur and his team used machine learning to train the AI to understand what a patient was looking for -- and where to find the answer. But medical terminology is practically its own language. And the way that patients asked questions didn’t always map to the correct answer.
Carl Marelli: Another part of the problem was what if they called and asked about a body part or a condition, right?
And doing that mapping from, hey, I need somebody to look at this rash to, okay, that's a dermatologist. I'm going to find you a list of dermatologists. Right. And doing that mapping is something that an AI doesn't automatically know, we have to teach it, right.
HOST: Teaching the AI system involved generating hundreds of potential patient questions. Yet even with all this data, they still managed to come across questions that would stump the system.
Arthur Franke: Anytime you put a conversational system or a voice system in front of folks and ask them to ask questions, you undoubtedly come up with new ways of folks asking for things.
Um, one interesting case was where we had been thinking of the doctor that someone sees from year to year as a primary care physician, but someone who had grown up in a country where that role is referred to as a general practitioner came in and asked: “I need to make an appointment with my GP” or “what's the copay for visiting my GP”, and that stumped our system the first time around.
So we then had to go teach the system that a GP and a PCP are the same thing.
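The GP-to-PCP fix is essentially a synonym normalization pass that runs before the utterance reaches intent classification. A minimal sketch, with an invented synonym table:

```python
# Minimal sketch: replace lay or regional terms with a canonical term
# before intent classification. The synonym table is illustrative.
import re

SYNONYMS = {
    "general practitioner": "primary care physician",
    "gp": "primary care physician",
    "skin doctor": "dermatologist",
}

def normalize(utterance: str) -> str:
    """Lowercase the utterance and substitute known synonyms."""
    text = utterance.lower()
    # Replace longer phrases first so "general practitioner" is matched
    # whole rather than partially rewritten via "gp".
    for term in sorted(SYNONYMS, key=len, reverse=True):
        text = re.sub(rf"\b{re.escape(term)}\b", SYNONYMS[term], text)
    return text

print(normalize("What's the copay for visiting my GP?"))
# -> "what's the copay for visiting my primary care physician?"
```

The word-boundary match (`\b`) keeps short abbreviations like "gp" from being rewritten inside longer words.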
HOST: Arthur and his team spent hundreds of hours ensuring the AI system could make these links between different words and medical terms - and that it all integrated with the existing tech stack.
Yet there remained another challenge. How to respond to questions that did not have a single answer?
Arthur Franke: So if someone's asking for information on an ultrasound and they're given a choice between a kidney ultrasound, an abdominal ultrasound or a pelvic ultrasound, they might ask what's the difference between those three?
And having content ready to answer those kinds of quick explanatory questions required working with many different subject matter experts, in many different areas of medicine to come up with: this is the right thing to say.
HOST: Figuring out the right thing to say was key to building an AI contact center that people would use.
When people ask a question verbally, voice systems need to give a simple, direct answer.
As Carl explains it, this is very different from a regular web search, where we’re used to getting a long list of options.
Carl Marelli: Even if you get a whole bunch of nonsensical results, as long as your result is within the top few, that's perfectly fine because you could look at it at a glance, it's visual.
You can select the right thing, kind of like a Google search. But with voice, it doesn't really work that way. Right? If you get a bunch of nonsensical responses with a voice experience, it just, it just breaks down. Right? It's very frustrating. It's very slow and we can't work like that.
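One way to honor that difference -- a sketch of the idea, not the team's actual logic -- is to speak a single answer only when the top candidate clearly beats the runner-up, and otherwise ask a clarifying question. The scores and margin here are illustrative:

```python
# Sketch: a screen can show a ranked list, but a voice system must commit
# to one answer. Speak the top result only when it clearly wins; if the
# race is close, ask the caller to disambiguate instead.

def respond(candidates: list, margin: float = 0.2) -> str:
    """candidates: (answer, confidence) pairs, in any order."""
    ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
    best, runner_up = ranked[0], ranked[1]
    if best[1] - runner_up[1] >= margin:
        return best[0]
    # Too close to call: a search page could show both, but voice must ask.
    return f"Did you mean {best[0]} or {runner_up[0]}?"

print(respond([("kidney ultrasound", 0.81), ("pelvic ultrasound", 0.44)]))
# -> "kidney ultrasound"
print(respond([("kidney ultrasound", 0.51), ("pelvic ultrasound", 0.48)]))
# -> "Did you mean kidney ultrasound or pelvic ultrasound?"
```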
HOST: This question of how to communicate verbally was a whole different problem to solve.
There was a lot of debate about how to phrase answers and how much detail a caller could process over the phone. And then there was a need to match the cadence of a real conversation -- to make the interaction as close to human as possible.
Arthur Franke: Anytime you're developing a conversational experience, you want it to feel natural. There was a technical challenge around getting systems to respond quick enough that the voice system could keep up the pace of conversation. For historical reasons, the client that we were working with had a few systems that had the information that we needed.
But those systems were configured to provide it within 10 seconds. That's fine if you're loading a web page or if you're loading some form that's going to be sent by mail. But if you and I are having a conversation and I suddenly pause for 10 seconds, you're going to think I don't know the answer and you're going to hang up the phone.
So we had to make sure that those responses could be delivered in optimally less than two seconds.
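That two-second budget can be enforced with a timeout and a spoken fallback. A minimal sketch, assuming a hypothetical backend lookup -- the client's real systems and phrasing are not shown here:

```python
# Sketch of the two-second budget: run the backend query in a worker
# thread and fall back to a holding phrase if the answer isn't ready in
# time. The lookup function and phrases are assumptions for illustration.
from concurrent.futures import ThreadPoolExecutor, TimeoutError
import time

def slow_backend_lookup() -> str:
    time.sleep(3)  # stands in for a legacy system tuned for ~10s page loads
    return "Your copay for a specialist visit is 40 dollars."

def answer_within(budget_seconds: float = 2.0) -> str:
    pool = ThreadPoolExecutor(max_workers=1)
    future = pool.submit(slow_backend_lookup)
    try:
        return future.result(timeout=budget_seconds)
    except TimeoutError:
        # Keep the conversational pace instead of going silent.
        return "One moment while I look that up for you."
    finally:
        pool.shutdown(wait=False)

print(answer_within())  # -> "One moment while I look that up for you."
```

In a production voice system the slow lookup would keep running so its result can be used once it arrives; the point of the sketch is simply that the caller hears something within the conversational window.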
HOST: Although the goal was to provide callers with verbal answers, the team realized that approach might not work with every call.
If a caller was looking for a list of doctors and their addresses, the system would ask if they would like the list emailed to them, or sent by text.
Calls without a clear answer, or where someone's health was at risk, went directly to a human.
Finally, after many long days designing the AI system and integrating it into an existing tech stack, it was ready to take real-life calls. Arthur and his team watched with anticipation.
Arthur Franke: There is a moment where we started to see the first sets of calls coming through. And seeing them get to completion where we had completion defined as someone was getting a list of doctors or someone was getting a summary of their benefits and they were saying thank you at the end of it, that was success.
HOST: For patients like Wes Hawkins, it’s the kind of progress he's been waiting for.
Wes Hawkins: These systems or a system such as, you know, the phone tree - they were set up a while ago.
And it's one of those things, if it's not broke, don't fix it. But a lot of these people don't realize how broken it really is. If your phone cuts out more times than not, and you're not getting an answer and you have to redial, well, how many times are you really going to redial in that day?
HOST: In a world where people live more and more of their life online, there is a demand for connection -- 24 hours a day.
People want immediate answers. And with conversational AI, they get those answers without ever being put on hold.
AI can also free up more time for healthcare providers, since less time on the phone means more time for face-to-face patient care.
Arthur says it all adds up to a broader future for conversational AI technology.
Arthur Franke: I'm very optimistic about this technology. I look at folks like my niece who's in first grade – she's never known a world where she could not talk to her computer or her phone and get a response from an automated system. Um, I think it's going to become table stakes for any company, any organization who has customers, whether internal or external, who are asking for information and need to be able to get that information wherever they are.
HOST: You've been listening to Speed to Modern Tech, an original podcast from KPMG. I'm Tori Weldon.
Todd Lohr: And I’m Todd Lohr, the head of technology enablement at KPMG.
If you want to know more about the technologies and the people you heard about in this story, click on the link in the show notes.
HOST: And don’t forget to subscribe and leave a review in your favorite podcasting app. We’ll be back with more stories in two weeks.