The popular drug discount app deceptively shared details on users’ illnesses and medicines with ad firms, regulators said in a legal complaint.
Millions of Americans have used GoodRx, a drug discount app, to search for lower prices on prescriptions like antidepressants, H.I.V. medications and treatments for sexually transmitted diseases at their local drugstores. But U.S. regulators say the app’s coupons and convenience came at a high cost for users: wrongful disclosure of their intimate health information.
On Wednesday, the Federal Trade Commission accused the app’s developer, GoodRx Holdings, of sharing sensitive personal data on millions of users’ prescription medications and illnesses with companies like Facebook and Google without authorization.
The company’s information-sharing practices, the agency said, violated a federal rule requiring health apps and fitness trackers that collect personal health details to notify consumers of data breaches.
While GoodRx agreed to settle the case, it said it disagreed with the agency’s allegations and admitted no wrongdoing.
The crackdown on GoodRx comes at a moment of heightened concern over the leaking of sensitive health information, particularly in states that have banned or severely limited abortions. And it underscores the F.T.C.’s intensifying efforts to push digital health services to beef up their user privacy and security protections.
The F.T.C.’s case against GoodRx could upend widespread user-profiling and ad-targeting practices in the multibillion-dollar digital health industry, and it puts companies on notice that regulators intend to curb the nearly unfettered trade in consumers’ health details.
Over the last two decades, start-ups and giant tech companies have introduced a range of fitness devices, smartwatches and fertility apps. But unlike a person’s blood test results and other patient information collected by doctors and hospitals — which are protected by the federal Health Insurance Portability and Accountability Act, known as HIPAA — the personal health details that tens of millions of consumers enter into apps or search for online, like the names of drugs or diseases, are covered by few legal protections.
From 2017 to 2020, GoodRx uploaded the contact information of users who had bought certain medications, like birth control or erectile dysfunction pills, to Facebook so that the drug discount app could identify its users’ social media profiles, the F.T.C. said in a legal complaint.
GoodRx then used the personal information to target users with ads for medications on Facebook and Instagram, the complaint said, “all of which was visible to Facebook.” GoodRx also targeted users who had looked up information on sexually transmitted diseases on HeyDoctor, the company’s telemedicine service, with ads for HeyDoctor’s S.T.D. testing services, the complaint said.
Those data disclosures, regulators said, flouted public promises the company had made to “never provide advertisers any information that reveals a personal health condition.”
If a judge approves the proposed federal settlement order, GoodRx will be permanently barred from sharing users’ health information for advertising purposes. To settle the case, the company also agreed to pay a $1.5 million civil penalty for violating the health breach notification rule.
The F.T.C. is employing new legal approaches and remedies in the GoodRx case as part of its effort to bolster safeguards for the personal information collected by health apps, trackers and sites.
This is the first time that the agency has brought an enforcement action using its Health Breach Notification Rule. That rule requires health apps and connected devices that collect or use personal health information, like an individual’s heart rate or menstruation history, to notify users of breaches like cyberattacks or the unauthorized sharing of their health data. This is also the first time that a proposed F.T.C. consent order is seeking to prohibit a company from sharing users’ health data for advertising purposes.
“Digital health companies and mobile apps should not cash in on consumers’ extremely sensitive and personally identifiable health information,” Samuel Levine, director of the F.T.C.’s bureau of consumer protection, said in a statement. “The F.T.C. is serving notice that it will use all of its legal authority to protect American consumers’ sensitive data from misuse and illegal exploitation.”
GoodRx, based in Santa Monica, Calif., said in a statement that user privacy was one of its most important priorities. The company added that the settlement with the agency focused on issues that GoodRx resolved three years ago, before the F.T.C. inquiry began.
“While we had used vendor technologies to advertise in a way that we believe was compliant with all applicable regulations and that remains common practice among many health, consumer and government websites, we are proud that we took action to be an industry leader on privacy practices,” the GoodRx statement said.
Since 2017, more than 55 million people have used or visited GoodRx’s mobile apps or website, the F.T.C. said. From 2017 to 2020, the company “revealed extremely intimate and sensitive details” to third-party ad tech and marketing firms like Facebook, Google, Criteo and Twilio, the complaint said, repeatedly violating its public promises not to do so. The data that was disclosed, the agency said, could link users to chronic physical and mental health issues including substance abuse.
GoodRx also did not put limits on how Facebook, Google and other companies could use its customers’ health information, federal regulators said, giving the third parties the ability to use the data for internal business purposes like research and product development.
Regulators said GoodRx also “failed to maintain sufficient” protections for users’ personal information like adequate formal, written privacy and data-sharing policies.
The F.T.C.’s case centers on GoodRx’s use of tracking tools from Facebook, Google and other companies. Millions of sites and apps use such tools — known as “pixels” and “software development kits” — to track and share data on their users’ activities with third parties for business purposes like ad targeting and user analytics.
Such tracking tools can collect information like users’ first and last names, email addresses, mobile phone numbers, unique device ID codes, locations, genders and Internet Protocol, or I.P., addresses. They can also record details on user activities like opening an app, clicking on a link or looking at information on a specific illness or medication.
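The mechanics described above can be sketched in a few lines of code. The function below is a hypothetical illustration of the kind of request a tracking pixel fires when a user views a page — the endpoint and field names are invented for this example, not any vendor’s actual API. One detail is drawn from common industry practice: identifiers such as email addresses are typically hashed before transmission, but because the ad platform hashes its own user records the same way, the hash still lets it match the visitor to a known profile.

```python
import hashlib
from urllib.parse import urlencode

def build_pixel_url(endpoint: str, email: str, event: str, content: str) -> str:
    """Assemble the kind of request a tracking pixel might send to an
    ad platform. All parameter names here are illustrative."""
    params = {
        # The email is SHA-256-hashed before it leaves the page, but the
        # ad platform can still match the hash against its own users.
        "uid": hashlib.sha256(email.strip().lower().encode()).hexdigest(),
        "event": event,      # e.g. a page view or a search
        "content": content,  # e.g. a drug or condition the user looked up
    }
    return f"{endpoint}?{urlencode(params)}"

url = build_pixel_url(
    "https://tracker.example.com/pixel.gif",
    "user@example.com",
    "search",
    "erectile dysfunction medication",
)
print(url)
```

The privacy problem the F.T.C. identified is visible in the sketch: even though the raw email never appears in the request, the hashed identifier travels alongside the health-related search term, tying the two together on the ad platform’s servers.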
While such data sharing is widespread in consumer sectors like retail and travel, the F.T.C. complaint said GoodRx’s use of tracking tools to share personal health data with advertising platforms had led to deceptive and unauthorized data disclosures — an argument that challenges business as usual in the digital health industry.
GoodRx said it removed the Facebook tracking pixel nearly three years ago.
Over the last few years, the F.T.C. has intensified its focus on health privacy.
In 2021, the F.T.C. accused the developer of Flo, a health tracking app used by more than 100 million women, of misleading users about its data-handling practices by sharing intimate health details about their periods and pregnancies with Google and Facebook. Flo agreed to a settlement with the agency that prohibited the company from misleading users on privacy and required it to obtain consent from them before sharing their health details.
But the agency’s case against GoodRx takes a tougher stance. The proposed order against GoodRx would permanently bar the company from sharing its users’ health data with third parties for advertising purposes. If approved by a federal judge, the order would also require GoodRx to direct Facebook, Google and other third parties with which it shared details about users’ medications and illnesses to delete that information.
In other words, the F.T.C., under its activist chair, Lina M. Khan, is angling to prohibit some longstanding tech industry data practices.
“Going forward, I believe the commission should approach data privacy and security protections by considering substantive limits,” Ms. Khan said in a statement in 2021, rather than just procedural protections that tend to sidestep “more fundamental questions about whether certain types of data collection and processing should be permitted in the first place.”
Natasha Singer is a business reporter covering health technology, education technology and consumer privacy.