As precision medicine efforts gain speed, more and more health systems are working hard to get their arms around vast troves of fast-moving data. And they're relying on technology - which is evolving at an equally rapid pace - to help them manage it all.

Indeed, IT shops, clinicians and data scientists are working to make sense of these myriad information types, and they're putting systems in place - many of them driven by artificial intelligence and machine learning algorithms - to help maintain the flow of useful data to drive decision-making.

Michael Draugelis, chief data scientist at Penn Medicine, has some experience developing AI-enabled predictive analytics initiatives, focused on everything from heart failure readmissions to inpatient safety risks. During a HIMSS Learning Center webinar this past week, he offered some tips for other health systems looking to make the most of their data, getting it to the right clinicians with optimal effectiveness and efficiency.

He calls it "real-time knowledge management to enable consistent model performance."

At its core it refers to "how you really make use of these technologies in operations," said Draugelis. "How do you make sure that the performance you've demonstrated in the laboratory is realized in the wild? And what are the booby traps in the real world that are going to stop you?"

Getting consistent, repeatable performance as data moves within the clinical setting is not an easy proposition, he said. But there are ways of managing people, process and technology to help improve the odds.
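One concrete way to watch for the gap between laboratory and live performance is to recompute a model's metrics on recent production data and flag any drift below the validated baseline. A minimal sketch of that idea follows; the metric, threshold and class names are illustrative assumptions, not Penn Medicine's actual monitoring setup:

```python
from dataclasses import dataclass

@dataclass
class PerformanceMonitor:
    """Compare live model performance against the offline (lab) baseline."""
    baseline_auc: float        # AUC measured during offline validation
    tolerance: float = 0.05    # allowed drop before raising a flag (assumed value)

    def check(self, live_auc: float) -> bool:
        """Return True if live performance is still within tolerance of the baseline."""
        return live_auc >= self.baseline_auc - self.tolerance

monitor = PerformanceMonitor(baseline_auc=0.82)
print(monitor.check(0.80))  # True: small dip, still within tolerance
print(monitor.check(0.70))  # False: large drop, investigate data or workflow changes
```

In practice the check would run on a schedule against a rolling window of scored cases, so a silent change in documentation habits or data feeds surfaces before clinicians lose trust in the alerts.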

"How do you know you're tuning your system in an optimized way?" said Draugelis. "How do you know you hired enough folks for intervention, how do you know the algorithm is good enough? This is a new area we're all kind of trailblazing together."

As important as the leading-edge AI analytics tools are to making this happen, nothing works without people, of course, and that's been a crucial factor at Penn Medicine.

"One new role here is a human factors scientist, which has turned out to be essential to tuning the technology to the interventions," he said. Before her arrival, "we would really get bogged down with the clinical team, trying to find the path to the right intervention, so it's really great to have someone who's targeted to that."

Data science departments have to cater to many stakeholders - CFOs, CMOs, COOs, CIOs, CQOs - who each have different needs and expectations:

"They often include hundreds of people on the continuum of care," he said. And the fact that the projects are so involved means that Penn Medicine tends to focus on just three or so every year, each with a very specific focus.

The health system has a three- to five-year roadmap, with three main missions along the way: drive automated data collection, enable prescriptive prediction, and create interactive decision support, aligned with care pathways.

Starting in 2016, Penn has focused dually on continuity of care and acute care, with an eye toward projects of increasing complexity within each sphere. The technology is there to help capitalize on an ever-expanding array of data types, but that doesn't make it easy.

"We're deploying these new technologies in a domain that isn't ready for it, wasn't built for it," said Draugelis. "We have the digital records in the EHR, but they weren't built for the high-precision, real-time algorithms running all the time."

For instance, "most of the data are collected manually, with non-random bias and error in them, and that is a challenge for these algorithms," he said. "It's still good strong data, but it could be a lot more effective if we automate it."
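A simple first defense against manually entered fields is to profile missingness and out-of-range values before they ever reach a model. This is a generic sketch of that kind of data-quality check, not Penn's actual validation logic; the field, values and valid range are invented:

```python
def profile_field(values, valid_range):
    """Summarize missingness and out-of-range entries for one manually entered field."""
    lo, hi = valid_range
    missing = sum(1 for v in values if v is None)
    out_of_range = sum(1 for v in values if v is not None and not (lo <= v <= hi))
    return {"n": len(values), "missing": missing, "out_of_range": out_of_range}

# Hypothetical heart-rate entries, including a likely typo (820) and missing values
heart_rates = [72, 88, None, 820, 64, None, 90]
print(profile_field(heart_rates, valid_range=(30, 250)))
# {'n': 7, 'missing': 2, 'out_of_range': 1}
```

Reports like this won't remove the non-random bias Draugelis describes, but they make it visible, which is a prerequisite for either correcting it or automating the collection away.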

Crucial to those harnessing efforts is the Penn Signals platform, a predictive pipeline that processes real-time data from an array of clinical systems.
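The webinar didn't detail Penn Signals' internals, but the general shape of such a pipeline - ingest events from clinical systems, transform them into features, score them with a model, and route high-risk cases onward - can be sketched briefly. All names and logic here are hypothetical stand-ins, not the actual platform:

```python
from typing import Callable

def run_pipeline(events, featurize: Callable, model: Callable, threshold: float):
    """Score each incoming clinical event and yield those above the alert threshold."""
    for event in events:
        features = featurize(event)
        risk = model(features)
        if risk >= threshold:
            yield event["patient_id"], risk

# Hypothetical stand-ins for real feature extraction and a trained model
featurize = lambda e: [e["heart_rate"] / 200, e["age"] / 100]
model = lambda f: min(1.0, sum(f))  # toy risk score, not a real classifier

events = [
    {"patient_id": "A", "heart_rate": 180, "age": 70},
    {"patient_id": "B", "heart_rate": 60, "age": 30},
]
alerts = list(run_pipeline(events, featurize, model, threshold=0.9))
print(alerts)  # [('A', 1.0)]
```

In a production system each stage would be a separate, monitored service fed by real-time interfaces to the EHR and other clinical systems, but the ingest-featurize-score-route pattern stays the same.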

"We glom onto other open source projects whenever we can," said Draugelis. "If you're building predictive applications with machine learning, you're going to be better off using open-source technology."

That open-minded approach (pardon the pun) is essential, since work like this is so often a process of trial and error.

"We made lots of assumptions when we built the first iteration of our pipeline," he said. "One by one they were all violated."

But driving innovation with AI involves a learning curve, and during the Learning Center webinar, Draugelis offered some real-world insights derived from Penn Medicine projects that are well worth tuning in to hear. "We're eager to share our experience," he said.