The health apps developed at Johns Hopkins must meet high standards for accuracy and clinical benefit.
Would you spend $4.99 for an app to monitor your blood pressure?
If you said yes, you’re not alone. In 2014 and 2015, consumers downloaded one such app at least 148,000 times, placing it among the 50 most popular iPhone apps for 156 days.
The app gave blood pressure readings when users held the top edge of their phone against the left side of their chest while placing their right index finger over the phone’s camera. It received a high average rating of four stars in online reviews, with users saying they enjoyed using the app and thought it was accurate.
But were the results correct? Johns Hopkins cardiologist Seth Martin decided to find out.
In collaboration with Timothy Plante (now at the University of Vermont) and colleagues, he tested the blood pressure app on 85 volunteers and compared the results with standard blood pressure readings taken using a guideline-based protocol. Their findings, published in JAMA Internal Medicine in 2016, showed that more than 75% of people with blood pressure in the hypertensive range received falsely reassuring information. Those reassuring readings, they say, may explain why users liked the app so much.
Potentially dangerous inaccuracies like that may be common in health-related apps — but it’s impossible to know for certain because most are launched with little or no outside review. A 2012 analysis paints a bleak picture, though: Of 280 diabetes apps studied, just five demonstrated a meaningful clinical benefit.
Within this fast-growing and lightly regulated marketplace, Johns Hopkins Medicine is setting its own high standards for accuracy and clinical value. To meet those criteria, the institution recently launched two review boards — one for apps developed by Johns Hopkins clinicians and another for applications or systems developed elsewhere and used at Johns Hopkins.
The Digital Health Scorecard
Patients often ask gastroenterologist Simon Mathews what apps they can use to help them navigate their digestive issues. Mathews can’t confidently recommend particular products, he says, because he doesn’t know if they are accurate or useful. Reviews from users, clinicians or technology experts tend to be subjective, he says.
Mathews, head of clinical innovation at the Armstrong Institute for Patient Safety and Quality, decided to address the problem by leading efforts to develop a Digital Health Scorecard: a standardized approach for evaluating health technologies, including apps, against a uniform set of guidelines (see sidebar and image below).
“Johns Hopkins is a place that deeply respects the importance of science and evidence to guide what we do,” Martin says.
Strict Criteria for Apps Developed or Used at Johns Hopkins
Patients, physicians and other consumers can now choose from more than 300,000 health-related apps, and about 200 new ones launch each day. Yet hardly any of these apps are tested for accuracy or clinical value. And many companies market their apps directly to patients, often promising tangible health benefits for just a few dollars.
The most popular downloads promise help managing diabetes, hypertension, depression or heart disease. Among the functions they offer, they can provide education about an illness; connect people who have similar conditions or concerns; help users track symptoms; or give reminders to take medications, exercise, or follow other disease management protocols.
Food and Drug Administration approval is required only for health apps that claim to make a diagnosis, such as a heart rate monitor that promises to detect arrhythmias. Even then, a tool some might consider diagnostic can avoid regulatory review by carrying the caveat that it is for “recreational use only,” notes Martin.
“There are so many apps that it’s hard to regulate,” he says.
In 2018, the Johns Hopkins Technology Innovation Center (TIC), with the Johns Hopkins Medicine offices of the general counsel, marketing and communications, information technology and Johns Hopkins Technology Ventures, created a review board to ensure that apps branded with the Johns Hopkins name meet its standards. Among those criteria, the apps must have clinical benefit, preserve patient privacy and improve on existing options, says Patrick Ostendarp, product development lead in the TIC.
It helps that the apps are typically developed and shepherded to the marketplace by Johns Hopkins clinicians who deeply understand the illness or problem that the technology addresses, he notes. Apps used for clinical trials require approval from the Institutional Review Board to ensure they meet all ethical requirements, such as not misleading or harming participants.
A separate review committee looks at outside apps or technologies that Johns Hopkins employees want to use in their practices. Launched in 2016, the committee examines technology requests made by clinicians and other Johns Hopkins Medicine staffers, to make sure the applications or systems align with the institution’s security requirements and strategic goals, says committee chair Andrew Frake.
A Johns Hopkins App Success Story
One app that went through the Johns Hopkins review process and is now providing real clinical benefit is the LDL Cholesterol Calculator created by Martin and fellow cardiologist Steven Jones.
The cardiologists saw that the traditional method of calculating LDL cholesterol underestimated it for patients at high risk for heart attack and stroke. In 2013, they introduced a more complex and accurate method, based on a study of more than 1 million people. The new calculation became part of the Epic medical record system at Johns Hopkins, making it easily accessible in clinical care throughout the health system.
But Martin and Jones wanted clinicians outside of Johns Hopkins to benefit, so they created an app that would do the calculation. Christopher Doyle, senior IT manager at the Technology Innovation Center, led the product’s review, working with Johns Hopkins software developers to improve the user interface and create a mechanism for incorporating updates.
“Having it go through the Hopkins vetting process gives it that credibility that’s important,” says Martin. “The process ensures that our cholesterol calculator is an app that actually has evidence behind it, that actually helps patients, and that the data adheres to privacy measures.”
To date, more than 10,000 clinicians have downloaded the app for use with their patients. Researchers have tested the calculation method in datasets throughout the world, confirming its superior accuracy and usefulness for clinical care, Martin says.
“I would like to see Hopkins really leading the way in showing how we bring together the right team members with different expertise to develop apps that really work, and study them in a rigorous way,” says Martin. “I think we can lead the world on this.”
PUBLISHED IN Dome November/December 2019