Across the first three months of 2020, the Health Care IT Advisor research team has conducted 30 research interviews about artificial intelligence (AI) in health care. So far, our conversations have captured insight and perspective from health system executives, clinicians, data analysts, vendors, academics, educators, and consultants.
We have identified four key themes from these conversations. We will continue to explore these and other important issues and applications of AI, including its recent deployments against Covid-19, throughout 2020.
Our conversations unearthed three main reasons why AI initiatives often fail to realize value:
1. Initiatives are not linked to a clinical or business goal;
2. They do not have executive-level buy-in; and
3. Health systems do not have the right building blocks or capabilities in place to begin with.
We spoke with Tushar Mehrotra, senior VP of analytics at Optum, who noted that without a clear tie to organizational goals, AI pilots are nothing more than an academic exercise. "All of these efforts are irrelevant unless you can tie them to a business model and define some return on investment," said Mehrotra. (Daily Briefing is published by Advisory Board, a division of Optum, which is a wholly owned subsidiary of UnitedHealth Group.)
Paul Bleicher, principal at Evident Health Strategies and the former CEO of OptumLabs, had a similar perspective: "The single most difficult problem in every aspect of scientific or analytic pursuit is asking the right questions. It's very easy for technologists to go out into the field and just do things because they have the data and computing power, but that on its own is not very helpful."
In addition, AI initiatives need the support and investment from key decision makers to make a long-term impact. When we spoke to Ray Deiotte, chief data officer at Centura Health, he noted that "the biggest impediment to executing an enterprise-wide AI strategy, in any industry, is not having enough 'top-down' understanding of impact and influence to adopt and invest in data and analytics as an enterprise capability."
Finally, our interviewees stressed that you need to begin AI projects with the right tools in place. That encompasses a number of factors, including having enough high-quality data, adequate staff skill sets, and sufficiently powerful IT infrastructure. Health care organizations often train their AI on large claims datasets, which are useful but still one-dimensional; bringing in new sources of data (e.g., genomic, behavioral, social determinants of health) opens up a whole new level of algorithmic sophistication.
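As a rough illustration of that last point, the sketch below joins claims-derived features with social-determinants features before training a simple risk model. All dataset names, column names, and values are hypothetical, synthetic stand-ins for illustration only; this is not drawn from any organization we interviewed.

```python
# Minimal sketch (hypothetical fields, synthetic data): combining claims features
# with social-determinants features so a risk model sees more than one dimension.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for a claims extract and an SDOH dataset keyed by member ID.
claims = pd.DataFrame({
    "member_id": range(1000),
    "n_inpatient_visits": [i % 4 for i in range(1000)],
    "chronic_condition_count": [i % 6 for i in range(1000)],
})
sdoh = pd.DataFrame({
    "member_id": range(1000),
    "food_insecurity_score": [((i * 7) % 10) / 10 for i in range(1000)],
    "transportation_barrier": [i % 2 for i in range(1000)],
})
labels = pd.Series([i % 5 == 0 for i in range(1000)], name="readmitted")

# Claims features alone are one-dimensional; merging in SDOH adds new signal.
features = claims.merge(sdoh, on="member_id").drop(columns="member_id")

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```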
Having enough talent is another obvious but persistent barrier for many health care providers trying to leverage AI. Demand for data scientists is clear, but there are other important complementary and emerging roles, such as chief data officers. Larger health care organizations will likely invest in AI-savvy talent to truly derive value from large-scale AI efforts, assuming they have the right leadership buy-in and data-driven culture in place.
Data, analytics, and advanced AI capabilities can offer a range of actionable insights, but these technological assets need to be embedded into workflows to truly convince physicians and other end users of a model's value. The inability to incorporate new AI solutions into existing workflows is a major reason organizations aren't able to scale. This sentiment was echoed by Matt Seefeld, executive VP at MedEvolve, who told us that "the most important piece of workflow automation is making sure your staff are focused on the tasks that will deliver the greatest impact."
Our conversations unearthed two main considerations when determining your AI approach:
1. Be realistic about the scope of in-house AI projects.
2. Don't overlook established partnerships when considering AI.
The reality is that not every health care provider has an internal team of data scientists ready to collect data, train a model, deploy it, and maintain stringent ongoing performance monitoring. So choose your battles. When we spoke with Jim Bonnette, executive VP at Optum, he noted that health care providers should be more self-aware here: "When we speak to health care organizations we have to ask them why they think they have a core skill in AI. You're not a tech company and you never will be, so don't pretend to be."
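For teams that do take on that full lifecycle, the sketch below gives a sense of what "stringent ongoing performance monitoring" can look like in practice. The baseline AUC, alert margin, and example values are assumptions made for illustration, not thresholds recommended by our interviewees.

```python
# Minimal sketch (assumed thresholds, made-up data) of ongoing model monitoring:
# score each new window of outcomes and flag the model when performance drifts.
from sklearn.metrics import roc_auc_score

BASELINE_AUC = 0.80   # assumed AUC measured at validation time
ALERT_MARGIN = 0.05   # assumed tolerated drop before the model is reviewed

def check_model_drift(y_true, y_scores):
    """Return (current_auc, needs_review) for one monitoring window."""
    current_auc = roc_auc_score(y_true, y_scores)
    needs_review = current_auc < BASELINE_AUC - ALERT_MARGIN
    return current_auc, needs_review

# Example monitoring window with made-up labels and model scores.
auc, flag = check_model_drift(
    [1, 0, 1, 1, 0, 0, 1, 0],
    [0.9, 0.2, 0.7, 0.4, 0.3, 0.6, 0.8, 0.1],
)
print(f"AUC this window: {auc:.2f}; needs review: {flag}")
```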
For those organizations that decide to start working with AI, it's best to scope down initial efforts around what is feasible and strategically valuable. We spoke to Donald Combs, VP and founding dean of the School of Health Professions at the Eastern Virginia Medical School, who noted, "A crucial step to understanding AI is knowing the difference between general and specific AI. Almost everyone wants to jump into general AI, but those applications won't occur for some time. Specific (or narrow) AI applications should be the focus today given the many use cases already occurring. The second step is evaluating how to use specific AI for your hospital's top three to five service lines to reduce costs, improve outcomes, or drive some other business goal."
For most organizations, the answer is to outsource at least some of their AI functions or partner with industry vendors. Many health care providers naturally start by looking at their existing EHRs, given the amount of time, money, and energy they've already invested in these systems. Bonnette noted that health systems "have invested so much on EHRs and now they're capital starved and don't have money to spend on IT. They don't have the margins and they are going to be looking for ways to reduce their costs. Given all of that, outsourced AI tools can be attractive." For many organizations, using an existing system instead of bringing in yet another AI vendor could mean lower overall costs and easier integration.
We also spoke to Seth Hain, SVP of research and development at Epic, to learn more about the company's AI capabilities. Hain noted that Epic offers customers access to a machine-learning library that provides ready-to-deploy algorithms for a number of applications across acute care (e.g., predicting sepsis), population health management, and other operational improvements (e.g., patient flow). Hain highlighted that these algorithms are in use across hundreds of Epic's customers and are already making an impact. For example, Ochsner Health System used Epic's machine-learning platform to improve patient deterioration predictions, leading to a 44% reduction in adverse events outside of the ICU during a 90-day pilot.
Of course, many other health care providers will look beyond their EHR for solutions, depending on their goals and the functionality they seek. The challenge with this approach is being inundated with offers from young companies selling immature or untested technology. We spoke with a chief nursing officer at a hospital in Silicon Valley who described how burdensome it is to evaluate new solutions: "I get emails every day with sales pitches from local AI startups."
In many of our interviews, executives stressed that AI is not only about the technology. There are additional non-IT factors to consider:
1. Adopt a culture of learning and agility; and
2. Embed this learning culture into your long-term AI strategy.
With so many potential AI applications and their potential impact on clinician workflow, becoming a learning health system is key. That is one of the main takeaways from our discussion with MedStar Montgomery Medical Center President Thomas Senker and Frederick Finelli, VP of medical affairs. Both Senker and Finelli feel that MedStar's real advantage is its culture as a learning organization with a lean operating system. MedStar is willing to try new things and fail; the key is bouncing back quickly from failure. "Our whole philosophy is built on the idea of continuous improvement," said Finelli.
MedStar understood that it couldn't rush to establish this culture, so it approached AI initiatives with patience and set longer timeframes for expected ROI. AI solutions are not viewed as magical tools that work right away; they require a lot of time spent on workflow and process management. An organization's long-term strategic persistence also applies to leadership: the CEO, for example, is often a crucial stakeholder who champions new initiatives and sustains them as an enterprise-wide priority.
There are two common fears that arise in many discussions about AI:
1. That AI will result in widespread loss of human jobs; and
2. That increased reliance on AI will ruin the physician-patient relationship.
There is one discussion that inevitably comes up when talking about AI in health care: the threat to human jobs. Automation has reduced the need for humans to perform repetitive, low-skill tasks in various industries, and health care is no different. However, AI can also help repurpose where we spend our time, opening up opportunities for higher-value work (not to mention the growth in new job roles and services that comes with building, testing, deploying, and maintaining AI solutions). "It's not necessarily that AI is going to replace your job, but it will help to ensure you have the right people in the right places," said Seefeld.
When we spoke with John Showalter, chief product officer at Jvion, he had a similar take, noting, "AI is augmenting clinicians, not replacing them. Clinically, we want to know that we improved a patient's quality of life, which requires clinicians to always be kept in the loop."
Another concern we've been hearing is that AI, through its use of robotic process automation or virtual assistants, can start to "de-humanize" health care. For some, the core of medicine has always been about the human connection between a patient and a clinician. However, much of the disruption we see today in the patient-clinician experience is coming from time-intensive documentation with the EHR. New AI tools and the rise of ambient computing are opening up new ways to relieve clinicians of some of their existing burdens, allowing them to take more time to engage with their patients and re-establish that human touch to care. David Hurwitz, associate CMIO at Allscripts, agreed with this idea, noting that "AI can get humans talking to humans again" if it is implemented the right way.